When you create user-defined APIs that allow sending data to your flows (PUSH API), it is not always possible to control how often and how much data your clients will send. To prevent system degradation, or a complete overload caused by a large number of calls with huge payloads in a short period of time, you can configure throttling and queuing for asynchronous POST and PUT requests. Throttling and queuing let you specify how many calls should be processed per minute, whether excess calls should be queued, and in which order queued messages should be processed.
Throttling
To enable throttling, navigate to the Listeners section under Settings, check the "Enabled" checkbox, and specify the desired "Throughput". Throughput is measured in requests per minute. Any request that exceeds the throughput will be dropped and a 429 "Too Many Requests" HTTP status code will be returned to the sender.
Only requests to listeners with the method set to POST or PUT are counted against the throughput.
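If your client cannot slow down ahead of time, one option is to retry throttled calls with a backoff. Below is a minimal Python sketch of that idea; the listener URL and timing values are placeholders, and the Retry-After header is only honoured if the platform happens to send it (this article does not say whether it does).

    import time
    import requests

    LISTENER_URL = "https://example.com/api/my-listener"  # hypothetical listener endpoint

    def post_with_backoff(payload, max_attempts=5):
        """POST to the listener, backing off whenever the call is throttled (HTTP 429)."""
        delay = 5  # seconds between retries; starting value is arbitrary
        for attempt in range(max_attempts):
            response = requests.post(LISTENER_URL, json=payload, timeout=30)
            if response.status_code != 429:
                return response  # accepted, or failed for a non-throttling reason
            # Throttled: wait and try again. Use Retry-After if the server sent one.
            wait = int(response.headers.get("Retry-After", delay))
            time.sleep(wait)
            delay *= 2  # exponential backoff between attempts
        raise RuntimeError("Listener is still throttling after {} attempts".format(max_attempts))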
Queuing
Instead of immediately dropping throttled requests, you can put the excess requests into a queue and process them later. To enable queuing, navigate to the Listeners section under Settings and check the "Queue throttled requests" checkbox.
Queuing can only be enabled if Throttling is enabled.
Additional queue settings
- Number of queue processing threads - The maximum number of threads that can process requests from the queue in parallel. If queuing is already enabled, changes to this setting won't take effect until the service is restarted.
- Default priority - The default queue priority for listener requests, used when a priority is not explicitly set on the listener itself.
- Throttle when above priority - Requests with a priority value above this threshold are throttled even when queuing is enabled (see the sketch after this list).
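To make the interaction between these settings concrete, here is a small Python sketch of how a request that exceeds the throughput could be dispatched. The names and structure are illustrative assumptions, not the platform's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class ThrottleSettings:
        # Field names mirror the settings described above; they are illustrative only.
        throughput_per_minute: int     # "Throughput"
        queue_enabled: bool            # "Queue throttled requests"
        default_priority: int          # "Default priority"
        throttle_above_priority: int   # "Throttle when above priority"

    def handle_excess_request(settings, request_priority=None):
        """Decide what happens to a POST/PUT request that exceeds the throughput."""
        priority = request_priority or settings.default_priority
        if not settings.queue_enabled:
            return "drop (HTTP 429)"
        if priority > settings.throttle_above_priority:
            # A higher number means a lower priority, so these requests are still dropped.
            return "drop (HTTP 429)"
        return "enqueue"

    settings = ThrottleSettings(throughput_per_minute=60, queue_enabled=True,
                                default_priority=5, throttle_above_priority=8)
    print(handle_excess_request(settings, request_priority=9))  # drop (HTTP 429)
    print(handle_excess_request(settings))                      # enqueue (default priority 5)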
Listener endpoint queue priority setting
You can specify the queue priority of endpoint requests for each individual listener connection by selecting "Queue priority" under "POST/PUT parameters".
Priority values range from 1 to 10, where 1 is the highest priority (queued requests with this priority are processed first) and 10 is the lowest. Queued requests with the same priority are processed in the order they were added.
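These ordering rules (lower number first, arrival order within the same priority) behave like a standard priority queue with an arrival counter as a tie-breaker. A minimal Python illustration of the ordering, not the platform's implementation:

    import heapq
    import itertools

    _sequence = itertools.count()  # tie-breaker so equal priorities stay in arrival order
    queue = []

    def enqueue(priority, request):
        # Lower number = higher priority, so a plain min-heap yields the right order.
        heapq.heappush(queue, (priority, next(_sequence), request))

    def dequeue():
        priority, _, request = heapq.heappop(queue)
        return request

    enqueue(5, "first p5")
    enqueue(1, "urgent")
    enqueue(5, "second p5")
    enqueue(10, "low")
    print([dequeue() for _ in range(4)])  # ['urgent', 'first p5', 'second p5', 'low']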
Throttling and Queuing Statistics
Current throttling and queuing statistics can be viewed under Statistics by clicking "View asynchronous POST requests throttling/queuing statistics" (the chart icon next to the number of Listener calls today). You will be presented with two charts: the first shows requests per minute for the past three hours, and the second shows the number of requests currently stored in the queue, broken down by priority.
Throttling and Queue Processing Flowchart
The flowchart below shows the throttling and queue-processing steps and the decisions made by the system.