Idea > Load Balancing Service - On the Gateway

Premise scenario:

There is a service A that sends large amounts of data to a service B; call this action a. Service B accepts the data and forwards it; call this action b. Action a is very fast, while action b is slow. Service A is actually a cluster, and another service C needs the control information that action b produces. (As shown in the figure below.)

The problem now is that, because action a is fast and action b is slow, the pipeline is inefficient. Simply creating multiple B services would drag services A and C into the problem, since they would then have to handle load balancing and related concerns themselves, so that is not a sustainable solution. What is needed is a gateway service that manages control messages and data distribution, manages the multiple B services, and applies a load-balancing strategy.
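To make the idea of picking a "relatively free" B instance concrete, here is a minimal sketch of one possible selection policy (fewest active jobs wins). The BInstance type, the active_jobs metric, and the addresses are illustrative assumptions, not something specified in the post; a real gateway could just as well weigh queue length, CPU, or bandwidth.

```python
# Minimal sketch of a "relatively free" selection policy.
# BInstance, active_jobs, and the addresses are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class BInstance:
    address: str      # where service A should send its bulk data
    active_jobs: int  # how many slow "b" actions this instance is handling


def pick_least_loaded(instances: list[BInstance]) -> BInstance:
    """Return the B instance that is currently the least busy."""
    return min(instances, key=lambda inst: inst.active_jobs)


if __name__ == "__main__":
    cluster = [
        BInstance("10.0.0.11:9000", active_jobs=4),
        BInstance("10.0.0.12:9000", active_jobs=1),
        BInstance("10.0.0.13:9000", active_jobs=7),
    ]
    print(pick_least_loaded(cluster).address)  # -> 10.0.0.12:9000
```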


Preliminary ideas:

Service A first applies to the gateway for a resource (that is, it asks for a B service). Based on some policy, the gateway looks at all the B services under its management, decides which one is "relatively free", and returns that B service's address to service A. Service A then sends its data to that B service. Of course, the gateway must also send the corresponding control information to that B service. In the figure, the lines connecting A, the Redis table, and the gateway carry control, request, or response messages, and the blue lines mark the path of the bulk data.
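The sketch below walks through this allocation flow under a few assumptions: a plain in-memory dict stands in for the Redis table of per-instance load, and send_control() is a hypothetical placeholder for whatever RPC or message actually delivers the control information to the chosen B instance. None of these names come from the original design.

```python
# Sketch of the allocation flow: A asks the gateway for a B instance,
# the gateway picks the least-loaded one, records the job, and forwards
# the control information to that B. An in-memory dict stands in for
# the Redis table; send_control() is a hypothetical placeholder.
import threading


class Gateway:
    def __init__(self, b_addresses: list[str]):
        # Stand-in for the Redis table: B address -> current load.
        self.load_table = {addr: 0 for addr in b_addresses}
        self._lock = threading.Lock()

    def allocate(self, control_info: dict) -> str:
        """Service A calls this to request a B instance before sending data."""
        with self._lock:
            # Pick the "relatively free" B instance and record the new job.
            addr = min(self.load_table, key=self.load_table.get)
            self.load_table[addr] += 1
        self.send_control(addr, control_info)  # tell B what is coming
        return addr                            # A sends its bulk data here

    def release(self, addr: str) -> None:
        """Called when the slow 'b' action on this instance has finished."""
        with self._lock:
            self.load_table[addr] -= 1

    def send_control(self, addr: str, control_info: dict) -> None:
        # Placeholder: in a real system this would be an RPC or message to B.
        print(f"control -> {addr}: {control_info}")


if __name__ == "__main__":
    gw = Gateway(["10.0.0.11:9000", "10.0.0.12:9000"])
    target = gw.allocate({"job_id": 42, "forward_to": "service-C"})
    print("A sends data to", target)
    gw.release(target)
```

Note that in this scheme the bulk data never passes through the gateway; only the small control, request, and response messages do, which matches the separation between the blue data path and the control lines in the figure.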


This work has not been implemented yet; it will be covered in a later update.



Origin: blog.csdn.net/LucifeR_Shun/article/details/103971846