0. Architecture Overview
This article simulates an online real-time stream, such as a user operation log, to be processed after collection. For now, only data collection is covered, implemented with HTML + jQuery + Nginx + ngx_kafka_module + Kafka.
ngx_kafka_module is an open-source Nginx module written specifically to produce request data from Nginx into Kafka.
1. Requirements
1.1 Use HTML and jQuery to simulate user request logs
Each record includes the following fields:
User ID: user_id, access time: act_time, operation: action (one of click, job_collect, cv_send, cv_upload)
Enterprise code: job_code
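For reference, a single simulated record assembled from these fields might look like the following (the values are illustrative, matching the sample data used later in this article; the shell here is just to show the shape):

```shell
# Build one illustrative log record; all field values are examples
act_time=$(date '+%Y-%m-%d %H:%M:%S')
record="{\"user_id\":\"u_donald\",\"act_time\":\"$act_time\",\"action\":\"click\",\"job_code\":\"donald\"}"
echo "$record"
```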
1.2 Use Nginx to accept the requests from 1.1
1.3 After a request is received, use ngx_kafka_module to send its data to the Kafka topic tp_individual
1.4 Use a consumer to consume the Kafka topic and observe the incoming data
2. Build steps
2.1 Kafka
Since a pre-built docker-kafka image is already installed, Kafka can be started directly.
2.2 Install and start Nginx
$ cd /usr/local/src
$ git clone git@github.com:edenhill/librdkafka.git
# Enter the librdkafka directory and build it
$ cd librdkafka
$ yum install -y gcc gcc-c++ pcre-devel zlib-devel
$ ./configure
$ make && make install
$ yum -y install make zlib-devel gcc-c++ libtool openssl openssl-devel
$ cd /opt/hoult/software
# 1. Download the Nginx source
$ wget http://nginx.org/download/nginx-1.18.0.tar.gz
# 2. Extract it
$ tar -zxf nginx-1.18.0.tar.gz -C /opt/hoult/servers
# 3. Download the module source
$ cd /opt/hoult/software
$ git clone git@github.com:brg-liuwei/ngx_kafka_module.git
# 4. Build Nginx with the module
$ cd /opt/hoult/servers/nginx-1.18.0
$ ./configure --add-module=/opt/hoult/software/ngx_kafka_module/
$ make && make install
# 5. Remove the Nginx source archive
$ rm /opt/hoult/software/nginx-1.18.0.tar.gz
# 6. Start Nginx
$ cd /opt/hoult/servers/nginx-1.18.0
$ nginx
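Once Nginx is up, the endpoint can be smoke-tested from the command line before wiring up the page. A minimal sketch, assuming the address and location configured later in this article (adjust for your environment):

```shell
# Hand-built test record; field values are illustrative
payload='{"user_id":"u_test","act_time":"2020-01-01 00:00:00","action":"click","job_code":"test"}'
echo "Posting: $payload"
# POST to the Nginx location that ngx_kafka_module forwards to Kafka.
# "|| true" keeps the script going if Nginx/Kafka are not reachable yet.
curl -s --connect-timeout 2 -X POST -d "$payload" http://192.168.18.128:9090/kafka/log || true
```

If everything is wired up, the posted JSON should appear in the console consumer started in section 3.2.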
3. Related configuration
3.1 nginx configuration nginx.conf
#pid logs/nginx.pid;

events {
    worker_connections 1024;
}

http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format main '$remote_addr - $remote_user [$time_local] "$request" '
    #                '$status $body_bytes_sent "$http_referer" '
    #                '"$http_user_agent" "$http_x_forwarded_for"';
    #access_log logs/access.log main;

    sendfile on;
    #tcp_nopush on;

    #keepalive_timeout 0;
    keepalive_timeout 65;

    #gzip on;

    kafka;
    kafka_broker_list linux121:9092;

    server {
        listen      9090;
        server_name localhost;

        #charset koi8-r;
        #access_log logs/host.access.log main;

        # ---- Kafka-related configuration starts here ----
        location = /kafka/log {
            # CORS configuration
            add_header 'Access-Control-Allow-Origin' $http_origin;
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';

            kafka_topic tp_individual;
        }

        #error_page 404 /404.html;
    }
}
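After editing nginx.conf, it is worth validating the syntax and reloading before testing. A small sketch, assuming the nginx binary is on PATH as in the start step above (a default source build installs under /usr/local/nginx):

```shell
# Validate the configuration and reload the running Nginx
if command -v nginx >/dev/null 2>&1; then
    nginx -t && nginx -s reload   # check syntax, then reload workers
    status="checked"
else
    status="nginx not on PATH; run /usr/local/nginx/sbin/nginx -t manually"
    echo "$status"
fi
```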
3.2 Start Kafka producer and consumer
# Create the topic
kafka-topics.sh --zookeeper linux121:2181/myKafka --create --topic tp_individual --partitions 1 --replication-factor 1
# Start a console consumer
kafka-console-consumer.sh --bootstrap-server linux121:9092 --topic tp_individual --from-beginning
# Start a console producer for testing
kafka-console-producer.sh --broker-list linux121:9092 --topic tp_individual
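To test the Kafka side on its own, before involving Nginx, one hand-built record can be piped through the console producer; the same JSON should then show up in the consumer started above (broker address from this article, record values illustrative):

```shell
# Send one test record through the console producer.
# "|| true" keeps the script going if the broker is unreachable.
record='{"user_id":"u_test","act_time":"2020-01-01 00:00:00","action":"click","job_code":"test"}'
echo "$record" | kafka-console-producer.sh --broker-list linux121:9092 --topic tp_individual || true
```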
3.3 Write the HTML + jQuery code
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1,shrink-to-fit=no">
<title>index</title>
<!-- jQuery CDN; can be replaced with another -->
<script src="https://cdn.bootcdn.net/ajax/libs/jquery/3.5.1/jquery.js"></script>
</head>
<body>
    <input id="click" type="button" value="Click" onclick="operate('click')" />
    <input id="collect" type="button" value="Collect job" onclick="operate('job_collect')" />
    <input id="send" type="button" value="Send CV" onclick="operate('cv_send')" />
    <input id="upload" type="button" value="Upload CV" onclick="operate('cv_upload')" />
</body>
<script>
    function operate(action) {
        var json = {'user_id': 'u_donald', 'act_time': current().toString(), 'action': action, 'job_code': 'donald'};
        $.ajax({
            url: "http://192.168.18.128:9090/kafka/log",
            type: "POST",
            crossDomain: true,
            data: JSON.stringify(json),
            // The following allows cross-origin cookie access
            xhrFields: {
                withCredentials: true
            },
            success: function (data, status, xhr) {
                // console.log("Operation succeeded: " + action)
            },
            error: function (err) {
                // console.log(err.responseText);
            }
        });
    }
    function current() {
        // Zero-pad month/day/time fields so e.g. 9:5:7 becomes 09:05:07
        function pad(n) { return n < 10 ? '0' + n : '' + n; }
        var d = new Date(),
            str = '';
        str += d.getFullYear() + '-';
        str += pad(d.getMonth() + 1) + '-';
        str += pad(d.getDate()) + ' ';
        str += pad(d.getHours()) + ':';
        str += pad(d.getMinutes()) + ':';
        str += pad(d.getSeconds());
        return str;
    }
</script>
</html>
Save the page as a.html in Nginx's html directory, then access 192.168.18.128:9090 in a browser.
4. Demo
4.1 First start the ZooKeeper cluster and the Kafka cluster
4.2 Then create the topic, start a consumer and a producer, and test the topic
4.3 Start Nginx, visit the page, click the buttons, and watch the consumer's output
The whole process is as follows:
Wu Xie, Xiao San Ye, a rookie in backend development, big data, and artificial intelligence.
Follow for more.