Recursive factorial across worker instances
A workflow that calls itself with `ctx.rpc` — recursion is automatically distributed and durable.
A workflow that computes n! by calling itself with n-1. Each recursive call dispatches via ctx.rpc to whichever worker in the group claims it — recursion ends up distributed across machines without any explicit coordination, and every step is checkpointed so a worker crash mid-recursion resumes cleanly.
TypeScript: @resonatehq/sdk v0.10.1 (current). Python: resonate-sdk v0.6.x against the legacy Resonate Server. Rust: v0.1.0, in active development.
Workflow recurses via ctx.rpc, dispatched across the factorial-workers group.
The problem#
Naive recursion runs in a single process. If the recursion is deep, you blow the stack. If the recursion is heavy (each call does real work), you can't parallelize across machines without writing your own dispatch layer. Add crashes to the picture and you also need to remember which branches finished and which need to retry.
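For contrast, the naive single-process version is just a direct recursive call, with no dispatch and no checkpointing (a plain TypeScript sketch, not part of the example repo):

```typescript
// Naive in-process recursion: every frame lives on this machine's stack,
// and a crash anywhere loses all progress.
function factorial(n: number): number {
  if (n <= 1) return 1;
  return n * factorial(n - 1); // direct call: no dispatch, no durability
}

console.log(factorial(10)); // 3628800
```

Deep inputs exhaust the stack, and there is nowhere for a second machine to pick up work, which is exactly the gap `ctx.rpc` fills below.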
Resonate's solution#
Use ctx.rpc instead of a direct function call. The recursive call goes to the Resonate Server, which dispatches it to whichever worker in the target group is available — so a deep recursion fans out across N workers automatically. Each call's result is a durable promise, so a worker crash mid-recursion just hands the unfinished call to a survivor.
Code walkthrough#
The pattern is one self-recursive function plus a client that kicks it off.
The recursive workflow#
TypeScript:

import { Resonate, type Context } from "@resonatehq/sdk";
const resonate = new Resonate({
url: "http://localhost:8001",
group: "factorial-workers",
});
function* factorial(ctx: Context, n: number): Generator<any, number, any> {
if (n <= 1) return 1;
// ctx.rpc dispatches to any worker in the group — this includes the worker
// running this very call, but might also be a different machine.
const result = yield* ctx.rpc(
"factorial",
n - 1,
ctx.options({ target: "poll://any@factorial-workers" }),
);
return n * result;
}
resonate.register("factorial", factorial);

Python:

from resonate import Resonate
from threading import Event
resonate = Resonate.remote(group="factorial-worker")
@resonate.register
def factorial(ctx, n):
if n <= 1:
return 1
# ctx.rpc dispatches to any worker in the group.
result = yield ctx.rpc("factorial", n - 1).options(
target="poll://any@factorial-worker", id=f"factorial-{n-1}",
)
return n * result
resonate.start()
print("Factorial worker is running...")
Event().wait()

Rust:

use resonate::prelude::*;
#[resonate::function]
async fn factorial(ctx: &Context, n: u64) -> Result<u64> {
if n <= 1 {
return Ok(1);
}
// ctx.rpc dispatches to any worker in the group.
let result: u64 = ctx
.rpc("factorial", n - 1)
.target("poll://any@factorial-workers")
.await?;
Ok(n * result)
}
#[tokio::main]
async fn main() {
let resonate = Resonate::new(ResonateConfig {
url: Some("http://localhost:8001".into()),
group: Some("factorial-workers".into()),
..Default::default()
});
resonate.register(factorial).unwrap();
tokio::signal::ctrl_c().await.unwrap();
}

Kicking it off#
The client-side call looks identical to any other RPC — Resonate doesn't care that the workflow happens to recurse internally.
TypeScript:

import { Resonate } from "@resonatehq/sdk";
const resonate = new Resonate({
url: "http://localhost:8001",
group: "factorial-client",
});
const n = Number(process.argv[2] ?? 5);
const result = await resonate.rpc(
`factorial-${n}`,
"factorial",
n,
resonate.options({ target: "poll://any@factorial-workers" }),
);
console.log(`Factorial of ${n} is ${result}`);
resonate.stop();

Python:

from resonate import Resonate
from argparse import ArgumentParser
resonate = Resonate.remote(group="factorial-client")
parser = ArgumentParser()
parser.add_argument("n", type=int)
args = parser.parse_args()
result = resonate.options(target="poll://any@factorial-worker").rpc(
f"factorial-{args.n}", "factorial", n=args.n,
)
print(f"Result: {result}")

Rust:

use resonate::prelude::*;
#[tokio::main]
async fn main() {
let resonate = Resonate::new(ResonateConfig {
url: Some("http://localhost:8001".into()),
..Default::default()
});
let n: u64 = std::env::args().nth(1).and_then(|s| s.parse().ok()).unwrap_or(5);
let result: u64 = resonate
.rpc(&format!("factorial-{n}"), "factorial", n)
.target("poll://any@factorial-workers")
.await
.unwrap();
println!("Factorial of {n} is {result}");
}

Run it locally#
Start the server, run a few workers, then dispatch from the client.
TypeScript:

git clone https://github.com/resonatehq-examples/example-recursive-factorial-ts
cd example-recursive-factorial-ts
npm install

brew install resonatehq/tap/resonate
resonate dev

npx tsx factorialWorker.ts

npx tsx factorialClient.ts 10

Watch the three worker terminals — factorial(10), factorial(9), factorial(8) … fan out across the workers as recursion descends.
Python:

git clone https://github.com/resonatehq-examples/example-recursive-factorial-py
cd example-recursive-factorial-py
uv sync

brew install resonatehq/tap/resonate
resonate serve

uv run python factorial_worker.py

uv run python factorial_client.py 10

Watch the three worker terminals — factorial(10), factorial(9), factorial(8) … fan out across the workers as recursion descends.
Rust:

git clone https://github.com/resonatehq-examples/example-recursive-factorial-rs
cd example-recursive-factorial-rs
cargo build

brew install resonatehq/tap/resonate
resonate dev

cargo run --bin worker

cargo run --bin client -- 10

Try the recovery story#
While the recursion is in flight (start with n=20 to give it some depth), kill one of the workers. Resonate reassigns its in-flight factorial(k) to a survivor. Already-completed sub-results stay completed; only the unfinished branch resumes. The final answer is unchanged.
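Recovery works because each call's result is a durable promise keyed by a stable id (the Python worker above passes `id=f"factorial-{n-1}"` explicitly). The following is a hypothetical in-memory sketch of that idea — an illustration, not the Resonate implementation — showing why finished branches are never recomputed:

```typescript
// Hypothetical in-memory stand-in for the server's durable promise store.
const completed = new Map<string, number>();

function durableFactorial(n: number): number {
  const id = `factorial-${n}`;
  const cached = completed.get(id);
  if (cached !== undefined) return cached; // finished branch: reuse, never recompute
  const result = n <= 1 ? 1 : n * durableFactorial(n - 1);
  completed.set(id, result); // checkpoint the result under its stable id
  return result;
}

durableFactorial(5);              // fills factorial-1 .. factorial-5
// A later run for n = 8 only computes the three missing frames;
// factorial-5 and below are served from the store.
console.log(durableFactorial(8)); // 40320
```

In the real system the store lives in the Resonate Server rather than in process memory, so it survives worker crashes; here it merely survives repeated calls within one process.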
Related#
- Hello world — the same primitives without the recursion.
- Load balancing — focused on the dispatch layer that makes the fan-out work.