ClickHouse: connecting to external data sources (MySQL, Kafka)

Connecting ClickHouse to a MySQL data source

  1. Create the test database and table in MySQL on linux123 and insert some data:
create database bigdataClickHouse;
use bigdataClickHouse;

create table student(
     id int,
     name varchar(40),
     age int);
insert into student values(3,'jack',12);
insert into student values(4,'jackness',120);

  2. In ClickHouse, create a table that uses the MySQL table engine:
CREATE TABLE mysql_table2
(
    `id` UInt32,
    `name` String,
    `age` UInt32
)
ENGINE = MySQL('linux123:3306', 'bigdataClickHouse', 'student', 'root', '12345678');
  3. The MySQL data can now be queried from ClickHouse.

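With the MySQL-engine table in place, ordinary SELECT statements against mysql_table2 are forwarded to the remote MySQL server. A quick check, assuming the rows inserted above:

```sql
-- Rows come live from MySQL's bigdataClickHouse.student table
SELECT * FROM mysql_table2;

-- Simple WHERE conditions are pushed down to MySQL where possible
SELECT name, age FROM mysql_table2 WHERE age < 100;
```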

Connecting ClickHouse to Kafka

1. Start Kafka and create a topic

Start Kafka:
kafka-server-start.sh -daemon config/server.properties

Create the topic:
kafka-topics.sh --zookeeper localhost:2181/myKafka --create --topic clickhouseTest --partitions 1 --replication-factor 1
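Before wiring the topic into ClickHouse, it can be worth confirming it was created with the expected partition count (host and ZooKeeper path as above):

```shell
# Show partition count and replication factor for the new topic
kafka-topics.sh --zookeeper localhost:2181/myKafka --describe --topic clickhouseTest
```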

3. Create a table with the Kafka engine in ClickHouse that reads from the topic

CREATE TABLE queue
(
    q_date String,
    level String,
    message String
)
ENGINE = Kafka
SETTINGS kafka_broker_list = 'linux122:9092',
         kafka_topic_list = 'clickhouseTest',
         kafka_group_name = 'group33',
         kafka_format = 'CSV',
         kafka_num_consumers = 1;
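Note that a Kafka-engine table is a stream, not storage: a direct SELECT consumes the fetched messages for the consumer group, and recent ClickHouse versions block direct reads unless explicitly allowed. A quick peek for debugging might look like this (the setting name applies to newer versions only):

```sql
-- Allow direct reads from stream-like engines (newer ClickHouse versions)
SET stream_like_engine_allow_direct_select = 1;

-- Warning: this consumes the read messages for consumer group 'group33'
SELECT * FROM queue LIMIT 5;
```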

4. Create the daily table

CREATE TABLE daily
(
    day Date,
    level String,
    total UInt64
)
ENGINE = SummingMergeTree()
PARTITION BY toYYYYMM(day)
ORDER BY (day, level);

5. Create a materialized view

CREATE MATERIALIZED VIEW consumer TO daily AS
SELECT q_date AS day, level, count() AS total
FROM queue
GROUP BY day, level;
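The materialized view is what actually drives consumption: while it is attached, it continuously reads batches from queue and writes aggregated rows into daily. Consumption can be paused and resumed by detaching and re-attaching the view:

```sql
-- Stop consuming from Kafka
DETACH TABLE consumer;

-- Resume; consumption continues from the consumer group's committed offsets
ATTACH TABLE consumer;
```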

6. Send test messages to the Kafka topic

kafka-console-producer.sh --topic clickhouseTest --broker-list localhost:9092
2020-02-20,level2,message2
2020-02-21,level1,message1

7. Query the data in daily
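Because SummingMergeTree collapses rows with the same sorting key only during background merges, duplicate (day, level) rows may still be present at query time; aggregate explicitly to get the final totals:

```sql
SELECT day, level, sum(total) AS total
FROM daily
GROUP BY day, level
ORDER BY day, level;
```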

Checking the ClickHouse logs, however, shows an error:

StorageKafka (queue): Can't get assignment. It can be caused by some issue with consumer group (not enough partitions?). Will keep trying.

How can this be solved?
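One way to narrow this down (a diagnostic sketch, not a confirmed fix for this log message): check whether consumer group group33 actually holds a partition assignment on the topic, and whether any other consumer is attached to the same group. The kafka-consumer-groups.sh tool ships with Kafka:

```shell
# Show members, partition assignment, and lag for the group used by the Kafka engine
kafka-consumer-groups.sh --bootstrap-server linux122:9092 --describe --group group33
```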

Origin blog.csdn.net/weixin_38813363/article/details/113863830