We at ottex.ai use bunny.net to deploy an OpenRouter-like speech-to-text API globally (5 continents, 26 locations, $3 idle cost).
Highly recommend their Edge Containers product: super simple, with nice primitives for deploying globally for low-latency workloads.
We connect all containers to one Redis pub/sub server to push important events like user billing overages, top-ups, etc. Super simple, very fast, and one config to manage across all locations.
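A minimal sketch of what that fan-out can look like on the container side. This is an illustration, not ottex.ai's actual code: the `BillingEvent` fields, the `billing-events` channel name, and the use of go-redis for the actual `Publish` call are all assumptions.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// BillingEvent is a hypothetical payload pushed over the shared
// Redis pub/sub channel (field names are illustrative, not the real schema).
type BillingEvent struct {
	Type   string  `json:"type"`    // e.g. "overage", "top_up"
	UserID string  `json:"user_id"`
	Amount float64 `json:"amount"`
}

// encodeEvent serializes an event before publishing it.
func encodeEvent(e BillingEvent) ([]byte, error) {
	return json.Marshal(e)
}

// decodeEvent is what each edge container would run on every
// message received from its subscription.
func decodeEvent(payload []byte) (BillingEvent, error) {
	var e BillingEvent
	err := json.Unmarshal(payload, &e)
	return e, err
}

func main() {
	payload, _ := encodeEvent(BillingEvent{Type: "overage", UserID: "u_123", Amount: 1.50})
	// In production this would go through something like
	// rdb.Publish(ctx, "billing-events", payload) with a Redis client
	// (go-redis assumed), and every location's containers subscribe
	// to the same channel, so one server reaches all regions.
	evt, _ := decodeEvent(payload)
	fmt.Println(evt.Type, evt.UserID)
}
```

The appeal of this setup is that pub/sub is fire-and-forget: the central server doesn't need to know which locations exist, and adding a region needs no config change beyond pointing the new containers at the same Redis endpoint.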
There are no cold starts at all — it's running non-stop.
Bunny bills by actual resource utilization (not provisioned capacity), and since our backend is Go, an idle container consumes around 0.01 CPU and 15 MB RAM and costs pennies.