Kafka currently supports three authentication mechanisms: SSL, SASL/Kerberos, and SASL/PLAIN.
scope of kafka authentication
between the kafka client and the kafka server (broker)
between brokers
between brokers and zookeeper
zookeeper authentication
In the conf directory under the zookeeper installation root, create a file named zk_server_jaas.conf:
Server {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="123456"
    user_admin="123456"
    user_zk001="123456"
    user_zk002="123456"
    user_zk003="123456"
    user_zk004="123456"
    user_zk005="123456";
};
# username and password are used for authentication between the brokers and zookeeper;
# the user_* entries define the accounts zk clients may use to authenticate to the zk server
# e.g. user_zk001="123456" defines a user named zk001 with password 123456
Add the following configuration to zoo.cfg:
authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
jaasLoginRenew=3600000
Copy the required jar packages from kafka's libs directory into zookeeper's lib directory:
Since authentication uses org.apache.kafka.common.security.plain.PlainLoginModule, which is provided by kafka-clients.jar, that jar and its dependencies must be copied from kafka's libs directory into the lib directory under the zookeeper installation root. Roughly these jars are needed:
kafka-clients-2.1.1.jar
lz4-java-1.5.0.jar
osgi-resource-locator-1.0.1.jar
slf4j-api-1.7.25.jar
snappy-java-1.1.7.2.jar
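The copy step can be sketched as below. The install paths are assumptions (substitute your real kafka and zookeeper locations, here made up as kafka_2.11-2.1.1 and zookeeper-3.4.13); to keep the sketch safe to run anywhere, it first simulates the two installs with empty stand-in jars in a temp directory.

```shell
BASE=$(mktemp -d)
KAFKA_HOME="$BASE/kafka_2.11-2.1.1"       # assumed kafka install root
ZK_HOME="$BASE/zookeeper-3.4.13"          # assumed zookeeper install root
mkdir -p "$KAFKA_HOME/libs" "$ZK_HOME/lib"
# Stand-ins for the jars that ship under kafka's libs directory:
for jar in kafka-clients-2.1.1 lz4-java-1.5.0 osgi-resource-locator-1.0.1 \
           slf4j-api-1.7.25 snappy-java-1.1.7.2; do
  touch "$KAFKA_HOME/libs/$jar.jar"
done
# The actual step: copy the needed jars into zookeeper's lib directory.
cp "$KAFKA_HOME"/libs/*.jar "$ZK_HOME/lib/"
ls "$ZK_HOME/lib"
```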
Modify zookeeper's startup parameters: edit bin/zkEnv.sh and append at the end of the file:
SERVER_JVMFLAGS=" -Djava.security.auth.login.config=$ZOOCFGDIR/zk_server_jaas.conf"
kafka broker authentication configuration
In the config directory, create kafka_server_jaas.conf:
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="123456"
    user_admin="123456"
    user_alice="123456"
    user_write="123456"
    user_read="123456"
    user_kafka001="123456"
    user_kafka002="123456"
    user_kafka003="123456"
    user_kafka004="123456"
    user_kafka005="123456";
};

Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="123456";
};
Modify config/server.properties:
listeners=SASL_PLAINTEXT://192.168.180.128:8123
advertised.listeners=SASL_PLAINTEXT://192.168.180.128:8123
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
#allow.everyone.if.no.acl.found=true
super.users=User:admin
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
zookeeper.set.acl=true
# listeners: the address the server actually binds to
# advertised.listeners: the address published to clients; if not set, listeners is used directly
Modify bin/kafka-server-start.sh:
if [ "x$KAFKA_OPTS" ]; then export KAFKA_OPTS="-Djava.security.auth.login.config=/export/ap-comm/server/kafka_2.11-2.1.1/config/kafka_server_jaas.conf" fi
With the above, the server-side authentication configuration is complete.
verification
Start kafka
../bin/kafka-server-start.sh server.properties
Create zk_client_jaas.conf in the config directory:
Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="123456";
};
Modify kafka-topics.sh to add the following configuration:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/export/ap-comm/server/kafka_2.11-2.1.1/config/zk_client_jaas.conf"
fi
Create a topic:
../bin/kafka-topics.sh --create --zookeeper localhost:2189 --replication-factor 1 --partitions 1 --topic test015
Modify bin/kafka-acls.sh to add the following configuration:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/export/ap-comm/server/kafka_2.11-2.1.1/config/zk_client_jaas.conf"
fi
Grant permissions to the write and read users:
../bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=localhost:2189 --add --allow-principal User:write --operation Write --topic test015

../bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=localhost:2189 --add --allow-principal User:read --operation Read --group test-group --topic test015
View all permissions
../bin/kafka-acls.sh --list --authorizer-properties zookeeper.connect=localhost:2189
kafka client authentication configuration
Create kafka_write_jaas.conf under config/:
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="write"
    password="123456";
};
Modify bin/kafka-console-producer.sh to add the following configuration:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/export/ap-comm/server/kafka_2.11-2.1.1/config/kafka_write_jaas.conf"
fi
Create producer.config under config/:
bootstrap.servers=192.168.180.128:8123
compression.type=none
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
producer start-up test
../bin/kafka-console-producer.sh --broker-list 192.168.180.128:8123 --topic test015 --producer.config producer.config
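For clients at Kafka 0.10.2 or newer, the JAAS file and the kafka-console-producer.sh edit can likewise be replaced by a sasl.jaas.config line inside producer.config itself. A sketch using the same write user:

```properties
bootstrap.servers=192.168.180.128:8123
compression.type=none
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
# Embeds the PLAIN credentials; no external JAAS file or KAFKA_OPTS needed.
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="write" password="123456";
```

The same trick works for consumer.config with the read user.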
Create kafka_read_jaas.conf under config/:
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="read"
    password="123456";
};
Modify bin/kafka-console-consumer.sh:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/export/ap-comm/server/kafka_2.11-2.1.1/config/kafka_read_jaas.conf"
fi
Create consumer.config under config/:
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
group.id=test-group
consumer start-up test
../bin/kafka-console-consumer.sh --bootstrap-server 192.168.180.128:8123 --topic test015 --from-beginning --consumer.config consumer.config
Java client authentication
Maven dependency
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.12</artifactId>
    <version>0.11.0.1</version>
</dependency>
Producing data (KafkaProducerSasl.java):
package kafka;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class KafkaProducerSasl {
    // Must match the topic the write user was granted Write on above.
    public final static String TOPIC = "test015";

    private static void producer() throws InterruptedException {
        // Point the JVM at the client JAAS file before creating the producer.
        System.setProperty("java.security.auth.login.config",
                "E:/work/saslconf/kafka_write_jaas.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.180.128:8123");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // SASL/PLAIN over plaintext, matching the broker's listener.
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");

        Producer<String, String> producer = new KafkaProducer<>(props);
        while (true) {
            long startTime = System.currentTimeMillis();
            // Send a batch of 100 messages, then pause.
            for (int i = 0; i < 100; i++) {
                producer.send(new ProducerRecord<>(TOPIC, Integer.toString(i), Integer.toString(i)));
            }
            System.out.println(System.currentTimeMillis() - startTime);
            Thread.sleep(5000);
        }
    }

    public static void main(String[] args) {
        try {
            producer();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
Consuming data (KafkaConsumerSasl.java):
package kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.util.Collections;
import java.util.Properties;

public class KafkaConsumerSasl {
    public static void consumer() throws Exception {
        // Point the JVM at the client JAAS file before creating the consumer.
        System.setProperty("java.security.auth.login.config",
                "E:/work/saslconf/kafka_read_jaas.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.180.128:8123");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        // Must match the group the read user was granted Read on above.
        props.put("group.id", "test-group");
        props.put("session.timeout.ms", "6000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        // Must match the topic the read user was granted Read on above.
        consumer.subscribe(Collections.singletonList("test015"));
        while (true) {
            long startTime = System.currentTimeMillis();
            ConsumerRecords<String, String> records = consumer.poll(1000);
            System.out.println(System.currentTimeMillis() - startTime);
            System.out.println("received message count: " + records.count());
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset = %d, key = %s, value = %s, partition = %d %n",
                        record.offset(), record.key(), record.value(), record.partition());
            }
        }
    }

    // http://www.open-open.com/lib/view/open1412991579999.html
    public static void main(String[] args) throws Exception {
        consumer();
    }
}
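In the Java clients, too, the System.setProperty call and the external JAAS file can be replaced (Kafka clients 0.10.2+) by putting sasl.jaas.config directly into the client Properties. A minimal sketch of building such properties; the helper class and method names here are our own, not part of any API:

```java
import java.util.Properties;

public class SaslClientProps {
    // Builds the SASL-related properties shared by producer and consumer,
    // embedding the PLAIN credentials instead of pointing at a JAAS file.
    public static Properties saslProps(String user, String password) {
        Properties props = new Properties();
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"" + user + "\" password=\"" + password + "\";");
        return props;
    }

    public static void main(String[] args) {
        // Merge these into the usual producer/consumer Properties before
        // constructing the KafkaProducer/KafkaConsumer.
        Properties p = saslProps("read", "123456");
        System.out.println(p.getProperty("sasl.jaas.config"));
    }
}
```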
kafka installation package download:
wget http://mirror.bit.edu.cn/apache/kafka/2.1.1/kafka_2.11-2.1.1.tgz
Reference Documents
http://kafka.apache.org/documentation/#security
https://www.jianshu.com/p/74f84fbd1f3f
https://blog.csdn.net/linux12a/article/details/77375274
https://www.cnblogs.com/smartloli/p/9191929.html
https://blog.csdn.net/bao_since/article/details/89101226
https://stackoverflow.com/questions/40196689/kafka-topic-authorization-failed
https://stackoverflow.com/questions/42635682/kafka-on-cloudera-test-topic-authorization-failed
https://www.cnblogs.com/xiao987334176/p/10077512.html
https://www.cnblogs.com/ilovena/p/10123516.html