How to execute non blocking HTTP calls in Java?

Aayush Dwivedi :

I have a third party API, which I call using an HTTP GET request. Each request takes a few seconds to get a response.

Currently I am using a CompletableFuture, which I am executing on a FixedThreadPool of size 64. Each thread is blocked until it receives a response to the GET request, i.e. the threads sit idle after sending the GET request until the response arrives. So the maximum number of simultaneous requests I can send out is limited by my thread pool size, i.e. 64 here.
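For reference, the setup described is roughly the following (simulating the slow third-party GET with a sleep, since the real API is not shown):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class BlockingFetch {
    static final ExecutorService POOL = Executors.newFixedThreadPool(64);

    // Stand-in for the slow third-party GET; the real call blocks the
    // worker thread for a few seconds in exactly the same way.
    static String slowGet(int id) {
        try {
            Thread.sleep(100); // stand-in for network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "response-" + id;
    }

    public static List<String> fetchAll(int n) {
        List<CompletableFuture<String>> futures = IntStream.range(0, n)
                .mapToObj(i -> CompletableFuture.supplyAsync(() -> slowGet(i), POOL))
                .collect(Collectors.toList());
        // Each worker thread is blocked inside slowGet until its response
        // arrives, so at most 64 requests can be in flight at once.
        return futures.stream().map(CompletableFuture::join).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(fetchAll(5));
        POOL.shutdown();
    }
}
```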

What can I use instead of CompletableFuture so that my threads don't sit idle waiting for the response?

Stephen C :

As @user207421 says:

  • A truly asynchronous (i.e. event driven) HTTP client application is complicated.
  • A multi-threaded (but fundamentally synchronous) HTTP client application is simpler, and scales to as many threads as you have memory for.
  • The actual bottleneck is likely to be EITHER your physical network bandwidth, OR your available CPU. If you have hit either of those limits, then:

    • increasing the number of threads in your thread pool is not going to help, and
    • switching to an asynchronous model is not going to help.
  • A third possibility is that the bottleneck is server-side resource limits or rate limiting. Increasing the client side thread count might help, have no effect, or make the problem worse. It will depend on how the server is implemented, etc.

If your bottleneck is actually the number of threads, then one simple thing you could try is reducing the worker thread stack size so that you can run more of them. The default stack size is typically 1MB, and that is likely to be significantly more than it needs to be.
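A smaller stack can be requested globally with the `-Xss` JVM option, or per thread via the four-argument `Thread` constructor. A minimal sketch of the latter (the 256 KB figure is an illustrative choice, and the JVM is free to round it up or ignore it):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

public class SmallStackPool {
    // A ThreadFactory that requests a 256 KB stack per thread instead of
    // the platform default (often 1 MB). The stackSize argument is only a
    // hint; some JVMs round it up or ignore it entirely.
    static ThreadFactory smallStackFactory(long stackSizeBytes) {
        AtomicInteger count = new AtomicInteger();
        return r -> new Thread(null, r, "worker-" + count.incrementAndGet(), stackSizeBytes);
    }

    public static ExecutorService newPool(int threads) {
        return Executors.newFixedThreadPool(threads, smallStackFactory(256 * 1024));
    }

    public static void main(String[] args) {
        ExecutorService pool = newPool(64);
        pool.submit(() -> System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```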

There are a few Java asynchronous HTTP client libraries around. But I have never used one and cannot recommend one. And like @user207421, I am not convinced that the effort of changing will actually pay off.
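For completeness: since Java 11 the JDK itself ships an asynchronous client, `java.net.http.HttpClient`, whose `sendAsync` returns a `CompletableFuture` without parking a worker thread for the duration of the request. A minimal sketch (the URL is a placeholder, and failures are mapped to `-1` so the demo terminates even without network access):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.CompletableFuture;

public class AsyncHttpSketch {
    static final HttpClient CLIENT = HttpClient.newHttpClient();

    static HttpRequest buildRequest(String url) {
        return HttpRequest.newBuilder().uri(URI.create(url)).GET().build();
    }

    // sendAsync returns immediately: no worker thread sits blocked while
    // waiting for the response, so the number of in-flight requests is not
    // capped by the size of a thread pool.
    static CompletableFuture<Integer> fetchStatus(String url) {
        return CLIENT.sendAsync(buildRequest(url), HttpResponse.BodyHandlers.ofString())
                     .thenApply(HttpResponse::statusCode);
    }

    public static void main(String[] args) {
        // Placeholder endpoint; any failure resolves the future to -1
        // instead of hanging.
        fetchStatus("https://example.com/")
                .exceptionally(e -> -1)
                .thenAccept(code -> System.out.println("completed, status=" + code))
                .join();
    }
}
```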


What can I [do] so that my threads don't sit idle waiting for the response?

This is actually not the problem. An idle thread is only using memory (and some secondary effects which probably don't matter here).

If there is something else for your client to do while it is waiting, the thread scheduler will switch to a different thread.

So the maximum number of simultaneous requests I can send out is limited by my thread pool size, i.e. 64 here.

That is true. However, sending more simultaneous requests probably won't help. If the client-side threads are sitting idle, that probably means that the bottleneck is either the network, or something on the server side. If this is the case, adding more threads won't increase throughput. Instead individual requests will take (on average) longer, and throughput will stay the same ... or possibly drop if the server starts dropping requests from its request queue.

Finally, if you are worried about the overhead of a large pool of worker threads sitting idle (waiting for the next task to do), use an executor service that can shrink and grow its thread pool to meet changing workloads.
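One way to get such an elastic pool with the standard library is `ThreadPoolExecutor` with `allowCoreThreadTimeOut` enabled, so idle workers are torn down after the keep-alive period (the 64-thread ceiling and 30-second timeout below are illustrative):

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ElasticPool {
    // Grows up to 64 threads under load, and lets every idle thread die
    // after 30 seconds, so an idle client holds no worker threads at all.
    public static ThreadPoolExecutor newElasticPool() {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                64, 64,                       // core == max: a fixed ceiling
                30, TimeUnit.SECONDS,         // idle thread keep-alive time
                new LinkedBlockingQueue<>());
        pool.allowCoreThreadTimeOut(true);    // idle core threads may also exit
        return pool;
    }

    public static void main(String[] args) {
        ThreadPoolExecutor pool = newElasticPool();
        System.out.println("max threads: " + pool.getMaximumPoolSize());
        pool.shutdown();
    }
}
```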
