In a large site, high concurrency often cannot be avoided, and handling it usually involves some form of rate limiting.
Rate Limiting
Every API and service system has an upper limit on how much traffic each interface can handle. Once an interface reaches that limit, measures must be taken to keep the system available and to degrade service gracefully, so that excess pressure does not bring the whole system down. Typical measures for a service interface include restricting access, rejecting requests, diverting traffic, and queueing requests.
The most common rate-limiting algorithms are the leaky bucket algorithm and the token bucket algorithm.
Leaky Bucket Algorithm
As the figure shows, a leaky bucket can hold a large amount of water (requests) and lets the water out at a constant rate, so each interface serves requests at a fixed rate. When the inflow is too large, for example a sudden sharp spike in requests, the bucket fills up quickly and subsequent requests are rejected. The leaky bucket algorithm can therefore limit both the request rate and the data rate.
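The description above can be sketched in code. This is a minimal, illustrative implementation (class and parameter names are my own, not from the original text): the bucket drains at `leak_rate` requests per second, and a request is rejected once the bucket is full.

```python
import time

class LeakyBucket:
    """Minimal leaky bucket rate limiter (illustrative sketch)."""

    def __init__(self, capacity, leak_rate):
        self.capacity = capacity      # maximum amount of "water" (requests) the bucket holds
        self.leak_rate = leak_rate    # requests drained per second, at a constant rate
        self.water = 0.0              # current fill level
        self.last = time.monotonic()  # timestamp of the last check

    def allow(self):
        now = time.monotonic()
        # Drain the water that leaked out since the last call.
        self.water = max(0.0, self.water - (now - self.last) * self.leak_rate)
        self.last = now
        if self.water + 1 <= self.capacity:
            self.water += 1           # request fits: add it to the bucket
            return True
        return False                  # bucket full: reject the request

# A burst larger than the capacity overflows and is rejected:
bucket = LeakyBucket(capacity=2, leak_rate=1)
results = [bucket.allow() for _ in range(3)]
```

With a capacity of 2 and a burst of 3 back-to-back requests, the first two are admitted and the third is rejected, since the bucket has had almost no time to drain.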
Token Bucket Algorithm
The token bucket algorithm places tokens into a bucket at a constant rate, for example 1000 tokens. Each incoming request takes one token from the bucket; when the tokens run out, further requests are rejected or must wait for service.
The bucket has a fixed size, and tokens are produced continuously at a constant rate. If tokens are not consumed, or are consumed more slowly than they are produced, the tokens accumulate until the bucket is full; any tokens generated after that overflow and are discarded. The number of tokens stored in the bucket therefore never exceeds the bucket size.
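A minimal sketch of the token bucket described above (again with illustrative names of my own choosing): tokens refill at `fill_rate` per second, capped at `capacity`, and each request consumes one token.

```python
import time

class TokenBucket:
    """Minimal token bucket rate limiter (illustrative sketch)."""

    def __init__(self, capacity, fill_rate):
        self.capacity = capacity      # bucket size: tokens stored never exceed this
        self.fill_rate = fill_rate    # tokens produced per second, at a constant rate
        self.tokens = float(capacity) # start with a full bucket
        self.last = time.monotonic()  # timestamp of the last refill

    def allow(self):
        now = time.monotonic()
        # Add the tokens produced since the last call; overflow is discarded.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.fill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1          # request takes one token
            return True
        return False                  # no tokens left: reject or wait

# A full bucket absorbs a burst up to its capacity, then rejects:
tb = TokenBucket(capacity=3, fill_rate=1)
results = [tb.allow() for _ in range(4)]
```

Because the bucket starts full, a burst of up to `capacity` requests is served immediately; the next request fails until the fill rate produces another token. This is the property that lets the token bucket tolerate short bursts.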
In comparison, the leaky bucket algorithm enforces a fixed transmission rate, while the token bucket algorithm limits the average rate through the token fill rate but still permits short bursts, which helps handle sudden traffic spikes.