Two strategies in Java for dealing with high concurrency

Objective: to improve system availability under high concurrency

Spillway queue with ExecutorService

// thread pool with 20 threads
private ExecutorService executorService = Executors.newFixedThreadPool(20);

Wrap the code that puts concurrent pressure on the downstream service in a task submitted to the thread pool, as follows:

// submit() followed by a synchronous get() turns the pool into a simple spillway queue
// with a congestion window of 20: the server processes at most 20 requests at the same
// time, and the remaining requests wait in the queue
Future<Object> future = executorService.submit(new Callable<Object>() {
    @Override
    public Object call() throws Exception {
        // call the downstream interface that is under concurrent pressure here
        return null;
    }
});
try {
    future.get();
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (ExecutionException e) {
    e.printStackTrace();
}
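
Putting the two snippets together, the following is a minimal self-contained sketch of the pattern; the class name SpillwayQueueDemo, the handleRequest() entry point and the callDownstream() helper are hypothetical placeholders for the real service code:

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SpillwayQueueDemo {

    // at most 20 tasks run at the same time; extra submissions wait in the pool's queue
    private final ExecutorService executorService = Executors.newFixedThreadPool(20);

    // hypothetical placeholder for the downstream interface under pressure
    private String callDownstream() {
        return "ok";
    }

    public String handleRequest() {
        Future<String> future = executorService.submit(new Callable<String>() {
            @Override
            public String call() throws Exception {
                return callDownstream();
            }
        });
        try {
            // block until the task has obtained one of the 20 worker threads and finished
            return future.get();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RuntimeException(e);
        } catch (ExecutionException e) {
            throw new RuntimeException(e.getCause());
        }
    }
}

Because every caller blocks on future.get(), the fixed pool size is what caps the number of in-flight downstream calls at 20.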

 

Rate limiting inside a single machine with Guava RateLimiter (under load balancing)

Add the Guava dependency:

<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
</dependency>

 

private RateLimiter rateLimiter = RateLimiter.create(300); // limit to 300 TPS

Add the following code at the entry of the method:

// rate limiting at the entry point
if (!rateLimiter.tryAcquire()) {
    // throw an exception here
}
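
As a minimal self-contained sketch of the entry-point check above, assuming a hypothetical RateLimitDemo class and handleRequest() method:

import com.google.common.util.concurrent.RateLimiter;

public class RateLimitDemo {

    // 300 permits per second on this node; with N nodes behind the load balancer,
    // the cluster-wide limit is roughly N * 300 TPS
    private final RateLimiter rateLimiter = RateLimiter.create(300);

    public String handleRequest() {
        // non-blocking check at the method entry; reject immediately when over the limit
        if (!rateLimiter.tryAcquire()) {
            throw new RuntimeException("too many requests, please retry later");
        }
        // normal business logic goes here
        return "ok";
    }
}

tryAcquire() fails fast and rejects the request; the blocking acquire() method would instead wait for a permit, smoothing traffic rather than dropping it.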

 

Finally:

This post only briefly describes the simplest way to use these tools to reduce the stress that high concurrency puts on a system; it does not analyze their internal source-code implementations. Thanks for your understanding.


Source: www.cnblogs.com/wuba/p/11516985.html