When you create user-defined APIs that allow sending data to your Flows (PUSH API), it is not always possible to control how often and how much data your clients will send. To prevent system degradation or complete overload when a large number of calls with huge payloads arrives in a short period of time, you can configure throttling and queuing for asynchronous PUT requests. With throttling and queuing, you specify how many calls should be processed per minute, whether excess calls should be queued, and in which order queued messages should be processed.
To enable throttling, navigate to the Listeners section under Settings, check the Enabled checkbox, and specify the desired Throughput. Throughput is measured in requests per minute. Any request that exceeds the throughput is dropped, and the 429 Too Many Requests HTTP status code is returned to the sender.
Only requests to Listeners with the method set to PUT are counted against the throughput.
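Server-side, a per-minute throughput limit like this is often implemented as a sliding-window counter. The sketch below illustrates the idea only; the class and method names are hypothetical, not the product's actual implementation:

```python
from collections import deque

class MinuteThrottle:
    """Sliding-window limiter: at most `throughput` requests per 60 seconds."""

    def __init__(self, throughput, clock):
        self.throughput = throughput
        self.clock = clock            # injected clock makes the sketch testable
        self.arrivals = deque()       # arrival times inside the current window

    def handle(self):
        now = self.clock()
        # evict arrivals older than the 60-second window
        while self.arrivals and now - self.arrivals[0] >= 60:
            self.arrivals.popleft()
        if len(self.arrivals) < self.throughput:
            self.arrivals.append(now)
            return 200                # accepted
        return 429                    # Too Many Requests

# usage: throughput of 2 requests/minute, with a fake clock
now = [0.0]
throttle = MinuteThrottle(2, lambda: now[0])
codes = [throttle.handle() for _ in range(3)]   # burst of 3 at t=0
# codes is [200, 200, 429]: the third request exceeds the throughput
```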
Instead of immediately dropping throttled requests, you can put the excess requests into a queue and process them later. To enable queuing, navigate to the Listeners section under Settings and check the Queue throttled requests checkbox.
Queuing can only be enabled if Throttling is enabled.
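A minimal sketch of the queue-instead-of-drop behavior (the function name and request values are illustrative):

```python
from collections import deque

def accept_burst(requests, throughput, queue):
    """Process up to `throughput` requests immediately; queue the excess
    instead of answering it with 429 Too Many Requests."""
    processed = list(requests[:throughput])
    queue.extend(requests[throughput:])   # held for later processing
    return processed

q = deque()
done = accept_burst(["r1", "r2", "r3", "r4"], throughput=2, queue=q)
# q now holds the throttled excess; a background worker drains it later:
done += [q.popleft() for _ in range(len(q))]
```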
Additional queue settings
Number of queue processing threads: the maximum number of threads that can process requests from the queue in parallel. If queuing is already enabled, changes to this setting take effect only after a service restart.
Default priority: the default queue priority for a Listener's requests, used when no priority is explicitly set on the Listener itself.
Throttle when above priority: requests with a priority value above this threshold are throttled even when queuing is enabled.
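The "Number of queue processing threads" setting caps how much parallelism is used to drain the queue. A rough sketch of such a fixed worker pool, with hypothetical names standing in for the real request processing:

```python
import queue
import threading

def start_workers(tasks, num_threads, results):
    """Drain `tasks` with a fixed pool of worker threads
    (mirrors the 'Number of queue processing threads' setting)."""
    def worker():
        while True:
            item = tasks.get()
            if item is None:          # sentinel: stop this worker
                return
            results.append(item)      # stand-in for real request processing
            tasks.task_done()
    pool = [threading.Thread(target=worker) for _ in range(num_threads)]
    for th in pool:
        th.start()
    return pool

tasks = queue.Queue()
for i in range(5):
    tasks.put(i)
processed = []
pool = start_workers(tasks, num_threads=3, results=processed)
tasks.join()                          # block until every queued item is handled
for _ in pool:
    tasks.put(None)                   # shut the pool down
for th in pool:
    th.join()
```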
Listener endpoint queue priority setting
It is possible to specify the queue priority of requests for each individual Listener Connection by setting its Queue priority.
Priority values range from 1 to 10, where 1 is the highest priority (queued requests with this priority are processed first) and 10 is the lowest. Queued requests with the same priority are processed in the order they were added.
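This ordering (priority 1 first, FIFO within equal priorities) corresponds to a heap keyed by (priority, arrival sequence). A sketch of the idea, not the product's implementation:

```python
import heapq
from itertools import count

arrival = count()      # tie-breaker keeps FIFO order within equal priorities
pq = []

def enqueue(priority, request):
    # 1 is the highest priority, 10 the lowest
    heapq.heappush(pq, (priority, next(arrival), request))

def dequeue():
    _, _, request = heapq.heappop(pq)
    return request

enqueue(5, "first-p5")
enqueue(1, "urgent")
enqueue(5, "second-p5")
enqueue(10, "bulk")
order = [dequeue() for _ in range(4)]
# order is ["urgent", "first-p5", "second-p5", "bulk"]
```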
Throttling and queuing statistics
Current throttling and queuing statistics can be viewed under Statistics by clicking View asynchronous POST requests throttling/queuing statistics (the chart icon next to the number of Listener calls today). You will be presented with two charts: the first shows requests per minute for the past three hours, and the second shows the number of requests currently stored in the queue, broken down by priority.
Throttling and queue processing flowchart
The flowchart below shows details of throttling and queue processing steps and decisions made by the system:
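The decisions in the flowchart can be approximated in code as follows; the setting names mirror those described above, and the function itself is illustrative:

```python
def decide(over_throughput, queuing_enabled, priority, throttle_above_priority):
    """Return the system's action for an incoming asynchronous PUT request."""
    if not over_throughput:
        return "process"              # within throughput: handle immediately
    if queuing_enabled and priority <= throttle_above_priority:
        return "enqueue"              # held in the queue for later processing
    return "drop"                     # answered with 429 Too Many Requests

# within throughput: processed regardless of priority
action_a = decide(False, True, 10, 5)
# over throughput, priority within the threshold: queued
action_b = decide(True, True, 3, 5)
# over throughput, priority above the threshold: dropped despite queuing
action_c = decide(True, True, 7, 5)
# over throughput, queuing disabled: dropped
action_d = decide(True, False, 1, 5)
```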