Future Executors

Looking through Iced, I've found that Sandbox always uses the null executor, while the default Application executor is ThreadPool. Why is this?

I don't understand the point of the null executor in the first place. Doesn't dropping all futures mean the main event loop won't work? But it clearly does work, so I must be misunderstanding something.

Also why do the default executors of Sandbox and Application differ?

My guess is that, for simple sync-only apps, this saves compilation time and reduces binary size.

Sandbox cannot run any async primitives.

I know that, but my question is: wouldn't not running async primitives mean Sandbox can't execute the main event loop within iced_winit (I believe it's called run_instance)? Why does it still work? Shouldn't the main event loop future be dropped?

Not really!

An async function is just a function that returns an opaque type that implements the Future trait. We are calling run_instance here:

```rust
let mut instance = Box::pin({
    let run_instance = run_instance::<A, E, C>(
        // ...
    );

    #[cfg(feature = "trace")]
    let run_instance =
        run_instance.instrument(info_span!("Application", "LOOP"));

    run_instance
});
```


Then, instead of executing the returned Future in an executor, we just poll it manually here:

```rust
if let Some(event) = event {
    event_sender.start_send(event).expect("Send event");

    let poll = instance.as_mut().poll(&mut context);

    match poll {
        task::Poll::Pending => {
            if let Ok(Some(flow)) = control_receiver.try_next() {
                *control_flow = flow;
            }
        }
        task::Poll::Ready(_) => {
            *control_flow = ControlFlow::Exit;
        }
    }
}
```
I explained this approach in more detail in this PR: Rebuild widget tree only after an application update (iced-rs/iced#597, by hecrj).

Why not use block_on() to poll the future? It would greatly simplify the code, and all the executors have it.

Please, @Redhawk18, read what I said carefully. We do not want to block; we poll manually to achieve concurrent execution.