Introduction to several common usage scenarios of message queues

1. Introduction

Message queue middleware is an important component of distributed systems. It mainly solves problems such as application coupling, asynchronous messaging, and traffic peak shaving, and helps build architectures that are high-performance, highly available, scalable, and eventually consistent. The most widely used message queues are ActiveMQ, RabbitMQ, ZeroMQ, Kafka, MetaMQ, and RocketMQ.

2. Message queue application scenarios

The following introduces the common usage scenarios of message queues in practice: asynchronous processing, application decoupling, traffic peak shaving, log processing, and message communication.

1. Asynchronous processing

Scenario description: after a user registers, the system needs to send a registration email and a registration SMS. There are two traditional approaches: serial and parallel.

Serial mode: after the registration information is successfully written to the database, send the registration email, then send the registration SMS. Only after all three tasks are complete is a response returned to the client.
Parallel mode: after the registration information is successfully written to the database, the registration email and the registration SMS are sent at the same time. Once all three tasks are complete, a response is returned to the client. The difference from serial is that the parallel approach reduces the processing time.
Assuming each of the three steps takes 50 milliseconds and ignoring network and other overhead, the serial approach takes 150 milliseconds and the parallel approach takes about 100 milliseconds.

Because the number of requests a CPU can process per unit time is fixed, suppose the CPU throughput is 100 operations per second. In serial mode the CPU can then handle about 7 requests per second (1000/150), while in parallel mode it can handle 10 (1000/100).
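As a quick check, the arithmetic above can be reproduced with a few lines of Python (the 50 ms per step is the figure assumed in the text):

# Rough throughput comparison, assuming each of the three steps takes 50 ms.
STEP_MS = 50

serial_ms = 3 * STEP_MS    # write DB -> send email -> send SMS, one after another
parallel_ms = 2 * STEP_MS  # write DB first, then send email and SMS concurrently

for name, latency_ms in [("serial", serial_ms), ("parallel", parallel_ms)]:
    print(f"{name}: {latency_ms} ms per request, ~{round(1000 / latency_ms)} requests/second")
# serial: 150 ms per request, ~7 requests/second
# parallel: 100 ms per request, ~10 requests/second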

Summary: as the case above shows, the performance (concurrency, throughput, response time) of the traditional approaches hits a bottleneck. How can this problem be solved?

By introducing a message queue, the business logic that is not strictly required for registration (sending the email and SMS) is handled asynchronously. The refactored flow works as follows.
Following this flow, the user's response time is roughly the time it takes to write the registration information to the database, that is, 50 milliseconds. The registration email and SMS tasks are written to the message queue and the request returns immediately; writing to the message queue is very fast, so that cost can essentially be ignored. The user's response time is therefore about 50 milliseconds. After this change the system throughput rises to about 20 QPS: roughly three times the serial approach and twice the parallel approach.
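A minimal sketch of the refactored registration flow, using RabbitMQ via the pika client (the queue name, message format, and the save_registration stub are illustrative assumptions, not part of the original article):

import json
import pika

# Connect to a local RabbitMQ broker (assumed to be running on localhost).
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="user.registered", durable=True)

def save_registration(username, email, phone):
    # Placeholder for the ~50 ms database write described in the text.
    pass

def register_user(username, email, phone):
    save_registration(username, email, phone)
    # Publish a "user registered" event and return immediately; the email and
    # SMS workers consume it asynchronously, so the user does not wait for them.
    event = json.dumps({"username": username, "email": email, "phone": phone})
    channel.basic_publish(exchange="", routing_key="user.registered", body=event)
    return "registration accepted"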

2. Application decoupling

Scenario description: after a user places an order, the order system needs to notify the inventory system. The traditional approach is for the order system to call the inventory system's interface directly.
Disadvantages of the traditional approach:

If the inventory system is unreachable, the inventory deduction fails and therefore the order fails; the order system and the inventory system are tightly coupled.

How can this problem be solved? Introduce a message queue between the two systems:
Order system: after the user places an order, the order system persists the order, writes a message to the message queue, and immediately returns success to the user.

Inventory system: subscribes to the order messages, obtains the order information via pull or push, and performs the inventory operations based on that information.

Even if the inventory system cannot be used normally when the order is placed, the order itself is not affected: once the order system has written the message to the queue, it no longer cares about the follow-up operations. This decouples the order system from the inventory system.
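A minimal sketch of the inventory-side consumer with RabbitMQ and pika (the queue name, message format, and the reserve_stock stub are illustrative assumptions):

import json
import pika

def reserve_stock(order):
    # Placeholder for the real inventory deduction logic.
    print("reserving stock for order", order["order_id"])

def on_order(channel, method, properties, body):
    order = json.loads(body)
    reserve_stock(order)
    # Acknowledge only after the inventory operation succeeds, so the message
    # is redelivered if this consumer crashes part-way through.
    channel.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="order.created", durable=True)
channel.basic_consume(queue="order.created", on_message_callback=on_order)
channel.start_consuming()

Because the order system only writes to the queue, this consumer can be down for a while and simply work through the backlog when it comes back, which is exactly the decoupling described above.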

3. Traffic peak shaving

Traffic peak shaving is also a common scenario for message queues; it is generally used in flash-sale or rush-purchase activities.

Application scenario: a flash sale causes a surge in traffic, and the excessive load can bring the application down. To solve this, a message queue is generally added in front of the application.

This makes it possible to control the number of participants admitted to the activity and to relieve the application from the short burst of high traffic.
After receiving a user request, the server first writes it to the message queue. If the queue length exceeds the configured maximum, the request is discarded directly or the user is redirected to an error page.

The flash-sale service then performs the follow-up processing based on the request information in the message queue.
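A minimal in-process illustration of this admission control, using only Python's standard library (a real deployment would rely on a message queue with a configured length limit; the capacity here is an arbitrary assumption):

import queue

MAX_PENDING = 1000                       # assumed capacity of the flash-sale queue
pending_requests = queue.Queue(maxsize=MAX_PENDING)

def accept_request(request):
    """Admit a flash-sale request, or reject it once the queue is full."""
    try:
        pending_requests.put_nowait(request)
        return "queued"       # the flash-sale service will process it later
    except queue.Full:
        return "rejected"     # discard, or redirect the user to an error page

def flash_sale_worker():
    # Drains the queue at the pace the backend can sustain.
    while True:
        request = pending_requests.get()
        # ... deduct stock, create the order, etc. (stub)
        pending_requests.task_done()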

4. Log processing

Log processing refers to using a message queue, typically Kafka, in the log processing pipeline to solve the problem of transmitting large volumes of log data. The simplified architecture is: the log collection client collects log data and periodically writes it to the Kafka queue; the Kafka message queue is responsible for receiving, storing, and forwarding the log data; and the log processing application subscribes to and consumes the log data in the Kafka queue.
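A minimal sketch of this pipeline with the kafka-python client (the broker address, topic name, and log record format are illustrative assumptions):

import json
from kafka import KafkaProducer, KafkaConsumer

# Log collection client: pushes collected log records to the Kafka queue.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)
producer.send("app-logs", {"level": "INFO", "message": "user 42 logged in"})
producer.flush()

# Log processing application: subscribes to the topic and consumes the log data.
consumer = KafkaConsumer(
    "app-logs",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
for record in consumer:
    print(record.value)  # e.g. parse the record and index it into Elasticsearch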

The following is Sina's Kafka log processing application case:
Kafka: the message queue that receives user logs;

Logstash: parses the logs and outputs them in a unified JSON format to Elasticsearch;

Elasticsearch: the core of the real-time log analysis service; a schemaless, real-time data storage service that organizes data through indexes and provides powerful search and statistics capabilities;

Kibana: a data visualization component built on Elasticsearch; its strong visualization capability is an important reason many companies choose the ELK stack.

5. Message communication

Message communication refers to the fact that message queues generally have efficient communication mechanisms built in, so they can also be used purely for messaging, for example to implement point-to-point message queues or chat rooms.
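A minimal sketch of a chat-room style broadcast using a RabbitMQ fanout exchange via pika (the exchange name and message text are illustrative assumptions); for point-to-point messaging, both sides would instead share a single named queue:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# A fanout exchange broadcasts every message to all bound queues (all chat members).
channel.exchange_declare(exchange="chatroom", exchange_type="fanout")

# Each chat member gets its own temporary queue bound to the exchange.
result = channel.queue_declare(queue="", exclusive=True)
my_queue = result.method.queue
channel.queue_bind(exchange="chatroom", queue=my_queue)

# Send a message to the room...
channel.basic_publish(exchange="chatroom", routing_key="", body="hello everyone")

# ...and receive whatever anyone broadcasts to it.
def on_message(ch, method, properties, body):
    print("received:", body.decode())

channel.basic_consume(queue=my_queue, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()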


Source: blog.51cto.com/14475593/2561336