Concurrency & Queues
Configure what you want to happen when there is more than one run at a time.
Controlling concurrency is useful when you have a task that can’t be run concurrently, or when you want to limit the number of concurrent runs to avoid overloading a resource.
One at a time
This task will only ever have a single run executing at a time. All other runs will be queued until the current run is complete.
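A minimal sketch of such a task using the Trigger.dev v3 SDK; the task `id` and payload shape here are illustrative:

```typescript
import { task } from "@trigger.dev/sdk/v3";

// With concurrencyLimit: 1, only one run of this task executes at a time;
// any additional runs wait in the queue until the current run completes.
export const oneAtATime = task({
  id: "one-at-a-time",
  queue: {
    concurrencyLimit: 1,
  },
  run: async (payload: { message: string }) => {
    // ...do the work that must not run concurrently
  },
});
```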
Parallelism
You can execute lots of tasks at once by combining high concurrency with batch triggering (or just triggering in a loop).
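As a sketch, assuming the v3 SDK: a task with a higher `concurrencyLimit`, triggered in a batch (the `userIds` variable and payload shape are hypothetical):

```typescript
import { task } from "@trigger.dev/sdk/v3";

// Up to 10 runs of this task can execute in parallel.
export const parallelTask = task({
  id: "parallel-task",
  queue: {
    concurrencyLimit: 10,
  },
  run: async (payload: { userId: string }) => {
    // ...
  },
});

// From your backend: queue one run per user in a single batch call.
export async function processUsers(userIds: string[]) {
  return parallelTask.batchTrigger(
    userIds.map((userId) => ({ payload: { userId } }))
  );
}
```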
Be careful with high concurrency. If you’re doing API requests you might hit rate limits. If you’re hitting your database you might overload it.
Your organization has a maximum concurrency limit which depends on your plan. If you’re a paying customer you can request a higher limit by contacting us.
Defining a queue
As well as putting queue settings directly on a task, you can define a queue and reuse it across multiple tasks. This allows you to share the same concurrency limit:
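A sketch of a shared queue, assuming the v3 SDK's `queue()` helper; the queue name and task ids are illustrative:

```typescript
import { queue, task } from "@trigger.dev/sdk/v3";

// Both tasks share this queue, so at most one run
// across the two tasks executes at any moment.
const myQueue = queue({
  name: "my-queue",
  concurrencyLimit: 1,
});

export const task1 = task({
  id: "task-1",
  queue: myQueue,
  run: async (payload: { message: string }) => {
    // ...
  },
});

export const task2 = task({
  id: "task-2",
  queue: myQueue,
  run: async (payload: { message: string }) => {
    // ...
  },
});
```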
Setting the concurrency when you trigger a run
When you trigger a task you can override the concurrency limit. This is useful if you sometimes have high-priority runs.
The task:
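For example, a task that normally processes one run at a time (the `generatePullRequest` name and payload are illustrative):

```typescript
import { task } from "@trigger.dev/sdk/v3";

// By default, runs of this task are processed one at a time.
export const generatePullRequest = task({
  id: "generate-pull-request",
  queue: {
    concurrencyLimit: 1,
  },
  run: async (payload: { repo: string }) => {
    // ...
  },
});
```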
Triggering from your backend and overriding the concurrency:
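A sketch under the assumption that the v3 SDK accepts a `queue` override in trigger options (the task, queue name, and payload here are hypothetical):

```typescript
import { generatePullRequest } from "./trigger/generate-pull-request";

export async function triggerPriorityRun(repo: string) {
  return generatePullRequest.trigger(
    { repo },
    {
      // Send this run to a separate queue with a higher
      // concurrency limit than the task's default.
      queue: {
        name: "priority",
        concurrencyLimit: 10,
      },
    }
  );
}
```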
Concurrency keys and per-tenant queuing
If you’re building an application where you want to run tasks for your users, you might want a separate queue for each of your users. (It doesn’t have to be users, it can be any entity you want to separately limit the concurrency for.)
You can do this by using `concurrencyKey`. It creates a separate queue for each value of the key.
Your backend code:
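A sketch of per-tenant triggering; the `sendUserEmail` task, its import path, and the payload shape are hypothetical:

```typescript
import { sendUserEmail } from "./trigger/send-user-email";

export async function triggerEmailForUser(userId: string) {
  return sendUserEmail.trigger(
    { userId },
    // Each distinct concurrencyKey value gets its own queue with the
    // task's concurrency limit, so one user's runs never block another's.
    { concurrencyKey: userId }
  );
}
```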