There is almost no example code online for wiring Flink 1.7.0 up to Kafka — even for 1.5.x and later there is barely anything. If you can read the official docs, I suggest going straight to them; this post is mostly adapted from the docs for my own convenience.
First, the dependencies:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-scala_2.12</artifactId>
    <version>1.7.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_2.12</artifactId>
    <version>1.7.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>2.1.1</version>
</dependency>
<!-- the universal Kafka connector: its version should match the Flink
     version (1.7.0) and its Scala suffix should match the other
     artifacts (_2.12), not 1.10.0 / _2.11 -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.12</artifactId>
    <version>1.7.0</version>
</dependency>
Which consumer should you use? The official English docs roughly say: for Kafka 1.0.0 and later, just use FlinkKafkaConsumer directly; for Kafka 0.8 use FlinkKafkaConsumer08, and so on:
Flink’s Kafka consumer is called FlinkKafkaConsumer08
(or 09 for Kafka 0.9.0.x versions, etc. or just FlinkKafkaConsumer
for Kafka >= 1.0.0 versions). It provides access to one or more Kafka topics.
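To make the version-to-class mapping from the docs concrete, here is a toy helper — `consumerClassFor` is not a Flink API, just an illustration of the rule quoted above:

```scala
// Toy helper (NOT part of Flink): encodes the consumer-class mapping
// described in the official docs, purely for illustration.
def consumerClassFor(kafkaVersion: String): String = {
  val parts = kafkaVersion.split("\\.")
  (parts(0).toInt, parts(1).toInt) match {
    case (0, 8)                   => "FlinkKafkaConsumer08"
    case (0, 9)                   => "FlinkKafkaConsumer09"
    case (0, 10)                  => "FlinkKafkaConsumer010"
    case (0, 11)                  => "FlinkKafkaConsumer011"
    case (major, _) if major >= 1 => "FlinkKafkaConsumer" // universal connector
    case _ => sys.error(s"unsupported Kafka version: $kafkaVersion")
  }
}
```

So for the Kafka 2.1.1 client pulled in above, `consumerClassFor("2.1.1")` gives `"FlinkKafkaConsumer"`, which is the class used in the core code below.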
Core code — producer first:
val stream: DataStream[String] = ...

// with the universal connector from the dependencies above, the producer is
// FlinkKafkaProducer; only the version-specific flink-connector-kafka-0.11
// dependency would use FlinkKafkaProducer011
val myProducer = new FlinkKafkaProducer[String](
  "localhost:9092",         // broker list
  "my-topic",               // target topic
  new SimpleStringSchema)   // serialization schema

// Kafka 0.10+ allows attaching the records' event timestamps when writing
// them to Kafka; this method is not available for earlier Kafka versions
myProducer.setWriteTimestampToKafka(true)

stream.addSink(myProducer)
val properties = new Properties()
properties.setProperty("bootstrap.servers", "localhost:9092")
// only required for Kafka 0.8
properties.setProperty("zookeeper.connect", "localhost:2181")
properties.setProperty("group.id", "test")

// with Kafka >= 1.0.0 and the universal connector, use FlinkKafkaConsumer
// (FlinkKafkaConsumer08 is only for Kafka 0.8 clusters); note that
// print() returns a sink, not a DataStream, so it must not be assigned
val stream = env
  .addSource(new FlinkKafkaConsumer[String]("topic", new SimpleStringSchema(), properties))
stream.print()
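If you read from Kafka in several jobs, it is tidy to factor the consumer properties out into a small helper. This is a sketch of my own convenience function, not a Flink API — only `java.util.Properties` from the standard library is involved:

```scala
import java.util.Properties

// Hypothetical convenience helper: bundles the consumer properties used
// above so they are not repeated in every job.
def kafkaConsumerProps(bootstrapServers: String, groupId: String): Properties = {
  val props = new Properties()
  props.setProperty("bootstrap.servers", bootstrapServers)
  props.setProperty("group.id", groupId)
  props
}
```

The consumer line above then becomes `new FlinkKafkaConsumer[String]("topic", new SimpleStringSchema(), kafkaConsumerProps("localhost:9092", "test"))`.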