Configuring SSL Authentication for the Kafka Connector Consumer

Developing the SinkConnector and SinkConnectorTask is omitted here.

Configure connect-standalone-consumer.properties

bootstrap.servers=hostname:19093

security.protocol=SSL
ssl.truststore.location=/home/xxx/kafka_ssl_key/client.truststore.jks
ssl.truststore.password=123456
ssl.keystore.location=/home/xxx/kafka_ssl_key/server.keystore.jks
ssl.keystore.password=123456
ssl.key.password=123456
# These are defaults. This file just demonstrates how to override some settings.


# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=com.xxxx.common.kafka.connector.AlreadyBytesConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=false
value.converter.schemas.enable=false

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
#plugin.path=


consumer.bootstrap.servers=hostname:19093
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/home/xxx/kafka_ssl_key/client.truststore.jks
consumer.ssl.truststore.password=123456
consumer.ssl.keystore.location=/home/xxxx/kafka_ssl_key/server.keystore.jks
consumer.ssl.keystore.password=123456
consumer.ssl.key.password=123456
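The `consumer.`-prefixed entries above pass the same SSL settings through to the consumer that the sink connector uses, in addition to the worker's own connection. The worker config also references a custom value converter, com.xxxx.common.kafka.connector.AlreadyBytesConverter, whose source is not shown. Judging by the name, a minimal pass-through implementation might look like the sketch below (package name and class body are assumptions; Kafka ships an equivalent built-in, org.apache.kafka.connect.converters.ByteArrayConverter):

```java
import java.util.Map;

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.storage.Converter;

/**
 * Hypothetical pass-through converter: hands record values to the sink
 * task as raw bytes instead of deserializing them. This is one plausible
 * reading of the AlreadyBytesConverter referenced in the worker config.
 */
public class AlreadyBytesConverter implements Converter {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // A pure pass-through needs no configuration.
    }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        // The Connect value is expected to already be a byte array.
        return (byte[]) value;
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        // Pass the raw payload through untouched, with an optional-bytes schema.
        return new SchemaAndValue(Schema.OPTIONAL_BYTES_SCHEMA, value);
    }
}
```

With a converter like this, `value.converter.schemas.enable=false` is effectively redundant for values, since no schema envelope is ever parsed.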

Configure the Connector

name=xxxx-event
connector.class=com.xxx.xxxxxx.common.kafka.connector.JDBCSinkConnector
#topics=cdh-hive-audit-logs-topic
topics=my-topic
tasks.max=3
# format to use for the date to append at the end of the index name, optional
# if empty or null, no suffix will be used
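The development of the SinkConnector and its task was skipped above. As a rough sketch, assuming com.xxx.xxxxxx.common.kafka.connector.JDBCSinkConnector follows the standard Connect sink API (the class bodies below are illustrative, not the article's actual code), the skeleton could look like:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.sink.SinkConnector;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

/** Hypothetical skeleton of the JDBC sink connector the article omits. */
public class JDBCSinkConnector extends SinkConnector {
    private Map<String, String> props;

    @Override public void start(Map<String, String> props) { this.props = props; }

    @Override public Class<? extends Task> taskClass() { return JDBCSinkTask.class; }

    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        // Each task receives a copy of the connector config;
        // tasks.max=3 in the properties file yields three entries here.
        List<Map<String, String>> configs = new ArrayList<>();
        for (int i = 0; i < maxTasks; i++) configs.add(props);
        return configs;
    }

    @Override public void stop() { }
    @Override public ConfigDef config() { return new ConfigDef(); }
    @Override public String version() { return "1.0"; }

    /** Task that would write each record to a database over JDBC. */
    public static class JDBCSinkTask extends SinkTask {
        @Override public void start(Map<String, String> props) {
            // Open the JDBC connection here.
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                // With a bytes pass-through converter, record.value()
                // is a raw byte[]; insert it into the target table here.
            }
        }

        @Override public void stop() { /* close the JDBC connection */ }
        @Override public String version() { return "1.0"; }
    }
}
```

The framework instantiates `taskClass()` up to `tasks.max` times and feeds each task batches of records through `put()`.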



Run the command

./connect-standalone.sh ../conf/connect-consumer.properties ../conf/xxxxx-event.properties

Reprinted from blog.csdn.net/Suubyy/article/details/89672709