Parallel work with durable fan-in
Spawn N tasks in parallel, collect their results — every spawn and every result is a durable promise.
A workflow spawns four notification channels (email, SMS, Slack, push) in parallel and joins their results. Total wall time approaches the slowest channel, not the sum. If one channel fails and retries, the others stay checkpointed — they don't re-send.
TypeScript: @resonatehq/sdk v0.10.1 (current). Python: resonate-sdk v0.6.x against the legacy Resonate Server. Rust: v0.1.0, in active development.
Notification fan-out across email, SMS, Slack, and push using ctx.beginRun.
Notification fan-out across email, SMS, Slack, and push using ctx.rfi handles.
Three-way parallel processing using ctx.run().spawn() + DurableFuture.
The problem
Sequential calls — even three or four — turn into the latency tax you don't notice until you do. Fan them out manually with Promise.all (or its language equivalent) and you've collapsed the latency, but you've also given up checkpointing: a crash mid-batch re-runs everything when the workflow recovers, and a retry of one branch blasts the same email twice to the customer.
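As a point of comparison, here is a minimal plain-`Promise.all` sketch (channel names and delays are made up, not from the example repo). It collapses wall time to roughly the slowest branch, but nothing is checkpointed: a crash or retry mid-batch re-runs every branch.

```typescript
// Plain Promise.all fan-out: collapses latency but has no durability.
// If the process crashes mid-batch, recovery re-runs ALL channels.
const delay = (ms: number) => new Promise<void>((res) => setTimeout(res, ms));

async function send(channel: string, ms: number): Promise<string> {
  await delay(ms);
  return `${channel}:sent`;
}

async function notifyAllNaive(): Promise<{ results: string[]; totalMs: number }> {
  const start = Date.now();
  // Wall time approaches max(50, 80, 30, 100) ≈ 100ms, not the 260ms sum.
  const results = await Promise.all([
    send("email", 50),
    send("sms", 80),
    send("slack", 30),
    send("push", 100),
  ]);
  return { results, totalMs: Date.now() - start };
}
```

The latency win is real; what is missing is any record of which branches already completed.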
Resonate's solution
Each branch becomes its own durable promise. Spawn them with beginRun (TypeScript) or .spawn() (Rust) — the workflow doesn't block. Then await each handle to fan in. If a single branch fails and retries, the others are already checkpointed; they don't re-execute. Crash recovery only re-runs whatever was in flight when the worker died.
Code walkthrough
The fan-out is N spawn calls; the fan-in is N awaits. That each one is a durable promise is what makes this safe — a hand-rolled Promise.all version is exactly where subtle re-execution bugs creep in.
```typescript
import type { Context } from "@resonatehq/sdk";
import { sendEmail, sendSms, sendSlack, sendPush, type OrderEvent } from "./channels";

export function* notifyAll(ctx: Context, event: OrderEvent, simulateCrash: boolean) {
  const start = Date.now();

  // Fan-out: spawn all 4 channels at once. Each beginRun returns a future.
  const emailFuture = yield* ctx.beginRun(sendEmail, event);
  const smsFuture = yield* ctx.beginRun(sendSms, event);
  const slackFuture = yield* ctx.beginRun(sendSlack, event);
  const pushFuture = yield* ctx.beginRun(sendPush, event, simulateCrash);

  // Fan-in: each future is durable. If push retries, email/sms/slack stay
  // checkpointed and do NOT re-send.
  const results = [
    yield* emailFuture,
    yield* smsFuture,
    yield* slackFuture,
    yield* pushFuture,
  ];

  return {
    orderId: event.orderId,
    channelsNotified: results.filter((r) => r.success).length,
    totalMs: Date.now() - start,
    results,
  };
}
```

The Python equivalent of TypeScript's ctx.beginRun is ctx.rfi — it returns a handle immediately, and the workflow awaits each handle to fan in.

```python
from resonate import Context

from channels import send_email, send_push, send_slack, send_sms


def notify_all(ctx: Context, event: dict, simulate_crash: bool):
    # Fan-out: ctx.rfi(...) returns a handle without blocking.
    email_p = yield ctx.rfi(send_email, event).options(id=f"{ctx.id}.email")
    sms_p = yield ctx.rfi(send_sms, event).options(id=f"{ctx.id}.sms")
    slack_p = yield ctx.rfi(send_slack, event).options(id=f"{ctx.id}.slack")
    push_p = yield ctx.rfi(send_push, event, simulate_crash).options(id=f"{ctx.id}.push")

    # Fan-in: each handle is durable. If push retries, email/sms/slack stay
    # checkpointed and do NOT re-send.
    email = yield email_p
    sms = yield sms_p
    slack = yield slack_p
    push = yield push_p

    results = [email, sms, slack, push]
    return {
        "order_id": event["order_id"],
        "channels_notified": sum(1 for r in results if r.get("success")),
        "results": results,
    }
```
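One detail worth noting in the Python version: each handle gets an explicit id derived from the parent's (`f"{ctx.id}.email"`). A stable, deterministic child id is what lets a recovered workflow re-attach to the same durable promise instead of creating a new one. A sketch of that convention (the helper name is ours, not the SDK's):

```typescript
// Stable child ids, mirroring the Python example's f"{ctx.id}.email":
// the id is a pure function of the parent id and the channel name, so
// replaying the workflow derives the same ids and finds the same promises.
function childId(parentId: string, channel: string): string {
  return `${parentId}.${channel}`;
}

const channels = ["email", "sms", "slack", "push"];
const ids = channels.map((c) => childId("notify.order-42", c));
```

Anything non-deterministic in the id (a timestamp, a random suffix) would break this matching on recovery.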
```rust
use resonate::prelude::*;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone)]
struct WorkItem { id: u32, data: String }

#[derive(Serialize, Deserialize, Debug)]
struct WorkResult { id: u32, output: String }

#[resonate::function]
async fn fan_out_fan_in(ctx: &Context, items: Vec<WorkItem>) -> Result<Vec<WorkResult>> {
    // Fan-out: spawn three items in parallel.
    let h1 = ctx.run(process_item, items[0].clone()).spawn().await?;
    let h2 = ctx.run(process_item, items[1].clone()).spawn().await?;
    let h3 = ctx.run(process_item, items[2].clone()).spawn().await?;

    // Fan-in: each handle is individually durable.
    let r1 = h1.await?;
    let r2 = h2.await?;
    let r3 = h3.await?;
    Ok(vec![r1, r2, r3])
}

#[resonate::function]
async fn process_item(item: WorkItem) -> Result<WorkResult> {
    Ok(WorkResult {
        id: item.id,
        output: format!("Processed: {} (item #{})", item.data, item.id),
    })
}
```

The TypeScript example includes a --crash flag that simulates the push channel failing on its first attempt. When you run it with --crash, the workflow retries the push channel — but watching the logs, email/SMS/Slack don't re-send. They're each their own durable promise, already resolved.
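To see why the other channels don't re-send, here is a toy in-memory model of per-branch checkpointing — not the SDK, just the semantics it provides: each branch's result is memoized under a stable id, so retrying the batch reuses every branch that already resolved.

```typescript
// Toy model of durable per-branch checkpointing (assumed semantics,
// not the real SDK): resolved results are memoized under stable ids.
const checkpoints = new Map<string, string>();
let emailAttempts = 0;
let pushAttempts = 0;

async function runBranch(id: string, fn: () => Promise<string>): Promise<string> {
  const cached = checkpoints.get(id);
  if (cached !== undefined) return cached; // already checkpointed: no re-send
  const result = await fn();
  checkpoints.set(id, result);
  return result;
}

const flakyPush = async () => {
  pushAttempts++;
  if (pushAttempts === 1) throw new Error("push failed");
  return "push:sent";
};

async function notifyWithRetry(): Promise<string[]> {
  const branches: Array<[string, () => Promise<string>]> = [
    ["email", async () => { emailAttempts++; return "email:sent"; }],
    ["sms", async () => "sms:sent"],
    ["push", flakyPush],
  ];
  while (true) {
    try {
      return await Promise.all(branches.map(([id, fn]) => runBranch(id, fn)));
    } catch {
      // retry the whole batch; checkpointed branches are served from the map
    }
  }
}
```

In the real system the map is a set of durable promises in the Resonate Server, so the reuse survives process crashes, not just in-process retries.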
Run it locally
TypeScript:

```shell
git clone https://github.com/resonatehq-examples/example-fan-out-fan-in-ts
cd example-fan-out-fan-in-ts
bun install
```

Run the happy path (all 4 channels in parallel):

```shell
bun run src/index.ts
```

Then run with the crash flag — push fails, retries, but the other three stay completed:

```shell
bun run src/index.ts -- --crash
```

The log output shows the wall time vs the sum of the individual channel times — fan-out collapses the total to roughly max(channels).

Python:

```shell
git clone https://github.com/resonatehq-examples/example-fan-out-fan-in-py
cd example-fan-out-fan-in-py
uv sync
```

Start a local Resonate Server:

```shell
brew install resonatehq/tap/resonate
resonate serve
```

Run the happy path:

```shell
uv run python main.py
```

Or with the crash flag — push fails, retries, but the other three stay completed:

```shell
uv run python main.py --crash
```

Rust:

```shell
git clone https://github.com/resonatehq-examples/example-fan-out-fan-in-rs
cd example-fan-out-fan-in-rs
cargo build
```

Start a local Resonate Server:

```shell
brew install resonatehq/tap/resonate
resonate dev
```

Run the worker, then invoke the workflow:

```shell
cargo run
```

```shell
resonate invoke fan-out.1 \
  --func fan_out_fan_in \
  --arg '[{"id":1,"data":"a"},{"id":2,"data":"b"},{"id":3,"data":"c"}]'
```

Try the recovery story
While the fan-out is mid-flight, kill the worker. Resonate replays only the branches that hadn't checkpointed; everything that completed before the kill is reused from durable promises. The wall time goes up by however long the worker was down, but no branch double-executes.
Related
- Recursive factorial — the same primitive, applied to a recursive call graph.
- Deep research agent — fan-out applied to LLM tool calls.