Using EMQ X to sink collected IoT data to Kafka

Software requirements:

EMQ X Enterprise Edition is required.

Scenario requirements:

The IoT platform needs to ingest the data collected by smart transformer collection devices. The devices report data to the server-side message broker (EMQ X) over the MQTT protocol; EMQ X forwards the data to a Kafka cluster, and Spark Streaming subscribes to Kafka for real-time analysis and processing (a consumption sketch appears at the end of this article).

Preparation:

  • Prepare the data to be uploaded (a publishing sketch in Python follows the sample):
{
  "id": "NXP-058659730253-963945118132721-22", // client identifier
  "speed": 32.12, // vehicle speed
  "direction": 198.33212, // heading
  "tachometer": 3211, // engine RPM; only values above 8000 need to be stored
  "dynamical": 8.93, // instantaneous fuel consumption
  "location": { // GPS longitude/latitude
    "lng": 116.296011,
    "lat": 40.005091
  },
  "ts": 1563268202 // reporting time
}
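
For reference, here is a minimal sketch of a simulated device publishing this payload over MQTT, using the Python paho-mqtt package. The broker address and the topic t/1 are illustrative assumptions; the rule SQL configured later matches the topic filter 't/#':

import json

import paho.mqtt.client as mqtt  # paho-mqtt 1.x style client

payload = {
    "id": "NXP-058659730253-963945118132721-22",
    "speed": 32.12,
    "direction": 198.33212,
    "tachometer": 9001,  # above 8000, so this record should be stored
    "dynamical": 8.93,
    "location": {"lng": 116.296011, "lat": 40.005091},
    "ts": 1563268202,
}

client = mqtt.Client()
client.connect("localhost", 1883)           # assumed broker address
client.publish("t/1", json.dumps(payload))  # topic matches the filter 't/#'
client.disconnect()
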
  • Create the Kafka topic (a quick existence check follows the command):
./kafka-topics.sh --create --partitions 3 --replication-factor 2 --topic emqtopic --zookeeper node01:2181,node02:2181,node03:2181
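
To double-check that the topic was created, a quick sketch with the Python kafka-python package (the broker address node01:9092 is an assumption) could look like this:

from kafka import KafkaConsumer

# List the cluster's topics and confirm emqtopic is among them
consumer = KafkaConsumer(bootstrap_servers="node01:9092")  # assumed broker
print("emqtopic" in consumer.topics())  # True if creation succeeded
consumer.close()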

Configuration instructions:

  • Create a resource
    Open the EMQ X Dashboard, go to the Resources page from the left-hand menu, click the New button, and enter the Kafka server information to create a resource.
    The network environments of the nodes in an EMQ X cluster may differ from one another. After the resource is created successfully, click the Status button in the list to check the resource's connection status on each node. If the resource is unavailable on a node, check that the configuration is correct and the network is reachable, then click the Reconnect button to reconnect manually.

  • Create rule
    Go to the rule page of the left menu and click the New button to create a rule. Here choose to trigger event message release, and trigger the rule for data processing when the message is released.
    After selecting the trigger event, we can see optional fields and sample SQL on the interface:
    Insert picture description here

  • Filter the required fields
    The rule engine uses SQL statements to process terminal messages, connection events, and the like. In this business scenario we only need to select the key fields from the payload. To do this, use the payload.<fieldName> syntax to pick fields out of the payload. In addition to the payload contents, the message id also needs to be saved, so the SQL can be written in the following form:

SELECT
  payload.id as client_id,
  payload.speed as speed,
  payload.tachometer as tachometer,
  payload.ts as ts,
  id
FROM
  "message.publish"
WHERE
  topic =~ 't/#'
  AND payload.tachometer > 8000

  • Use the SQL test feature to check the output
    With the SQL test feature we can view, in real time, the data that the current SQL statement produces. The feature requires us to supply the payload and other simulated raw data.
    The payload data is as follows; note that the tachometer value is raised so that it meets the SQL condition (tachometer > 8000):
{
  "id": "NXP-058659730253-963945118132721-22",
  "speed": 32.12,
  "direction": 198.33212,
  "tachometer": 9001,
  "dynamical": 8.93,
  "location": {
    "lng": 116.296011,
    "lat": 40.005091
  },
  "ts": 1563268202
}

Click the SQL test toggle, change the topic and payload to the values in this scenario, and click the Test button to view the data output. The test output data is:

{
  "client_id": "NXP-058659730253-963945118132721-22",
  "id": "589A429E9572FB44B0000057C0001",
  "speed": 32.12,
  "tachometer": 9001,
  "ts": 1563268202
}
  • Add a response action to bridge the messages to Kafka
    With the SQL's conditional input and output in place, we continue by adding a response action that applies the SQL statement written above and bridges the filtered results to Kafka.
    Click the Add button under Response Actions, select the "Bridge data to Kafka" action, choose the resource created earlier, and fill in the Kafka topic with the emqtopic created above.

Test:

Use the Websocket tool in the Dashboard to test.
Switch to the Tools --> Websocket page, connect to EMQ X with any client information, and after the connection succeeds send the following message in the Messages card:

{
  "id": "NXP-058659730253-963945118132721-22",
  "speed": 32.12,
  "direction": 198.33212,
  "tachometer": 8081,
  "dynamical": 8.93,
  "location": {
    "lng": 116.296011,
    "lat": 40.005091
  },
  "ts": 1563268202
}

Click the Send button, then use a Kafka command on the server to check whether the message was successfully produced.
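
As one way to do this check in code, a small consumer sketch using the Python kafka-python package (broker address assumed) prints the bridged records:

import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "emqtopic",
    bootstrap_servers="node01:9092",  # assumed broker address
    auto_offset_reset="earliest",     # read the topic from the beginning
)
for record in consumer:
    print(json.loads(record.value))   # should show the filtered fields
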
So far, we have completed the development work of bridging device messages to Kafka through the rule engine.
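
Finally, the scenario calls for Spark to consume the bridged data. The sketch below uses PySpark Structured Streaming, standing in for the classic Spark Streaming API named in the scenario; the broker address is an assumption, the schema follows the rule output shown above, and the Kafka connector must be supplied via --packages when submitting:

# Run with the Kafka connector, e.g.:
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 app.py
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, LongType, StringType, StructType

spark = SparkSession.builder.appName("emq-kafka-demo").getOrCreate()

# Schema matching the rule's output fields
schema = (StructType()
          .add("client_id", StringType())
          .add("id", StringType())
          .add("speed", DoubleType())
          .add("tachometer", LongType())
          .add("ts", LongType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "node01:9092")  # assumed broker
       .option("subscribe", "emqtopic")
       .load())

# Kafka delivers bytes; cast the value to string and parse the JSON payload
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("d"))
          .select("d.*"))

# Print each micro-batch to the console for demonstration
query = parsed.writeStream.format("console").start()
query.awaitTermination()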

Origin: blog.csdn.net/weixin_44455388/article/details/108253441