1. IDM – create the user login myuser
2. IDM – create the user groups kafka_dev_myuser_rw_grp and kafka_dev_myuser_ro_grp, and add the login myuser to both groups
Verify the membership:

bash-4.2$ id myuser
uid=12345678(myuser) gid=12345678(myuser) groups=12345678(myuser),87654321(kafka_dev_myuser_rw_grp),87654320(kafka_dev_myuser_ro_grp)
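Before moving on, it is worth confirming the group membership actually propagated. A small sketch (a hypothetical helper, not an IDM command; it only relies on the standard `id` command and the naming convention above):

```shell
# Hypothetical sanity check: confirm the login belongs to both Kafka groups
# created in IDM, following the kafka_dev_<user>_{rw,ro}_grp convention.
check_kafka_groups() {
  user=$1
  for grp in "kafka_dev_${user}_rw_grp" "kafka_dev_${user}_ro_grp"; do
    if ! id -nG "$user" | tr ' ' '\n' | grep -qx "$grp"; then
      echo "missing: $grp"
      return 1
    fi
  done
  echo "groups ok"
}
```

Run it as check_kafka_groups myuser; it prints "groups ok" or the first missing group.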
3. Hue/Hive SQL – create the roles and grant them to the groups:
0: jdbc:hive2://host> CREATE ROLE kafka_dev_myuser_rw_role;
0: jdbc:hive2://host> GRANT ROLE kafka_dev_myuser_rw_role TO GROUP kafka_dev_myuser_rw_grp;
0: jdbc:hive2://host> SHOW ROLE GRANT GROUP kafka_dev_myuser_rw_grp;
+---------------------------+---------------+-------------+----------+--+
|           role            | grant_option  | grant_time  | grantor  |
+---------------------------+---------------+-------------+----------+--+
| kafka_dev_myuser_rw_role  | false         | NULL        | --       |
+---------------------------+---------------+-------------+----------+--+
0: jdbc:hive2://host> CREATE ROLE kafka_dev_myuser_ro_role;
0: jdbc:hive2://host> GRANT ROLE kafka_dev_myuser_ro_role TO GROUP kafka_dev_myuser_ro_grp;
0: jdbc:hive2://host> SHOW ROLE GRANT GROUP kafka_dev_myuser_ro_grp;
+---------------------------+---------------+-------------+----------+--+
|           role            | grant_option  | grant_time  | grantor  |
+---------------------------+---------------+-------------+----------+--+
| kafka_dev_myuser_ro_role  | false         | NULL        | --       |
+---------------------------+---------------+-------------+----------+--+
1 row selected (0.106 seconds)
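The role DDL above follows a fixed pattern, so for another user it can be generated instead of retyped. A sketch (the helper name is made up) that emits the statements for pasting into beeline/Hue:

```shell
# Hypothetical generator: emit the CREATE ROLE / GRANT ROLE statements for a
# user's rw and ro groups, following the naming convention from steps 2-3.
gen_role_sql() {
  user=$1
  for suffix in rw ro; do
    echo "CREATE ROLE kafka_dev_${user}_${suffix}_role;"
    echo "GRANT ROLE kafka_dev_${user}_${suffix}_role TO GROUP kafka_dev_${user}_${suffix}_grp;"
  done
}

gen_role_sql myuser
```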
4. Set up topic, producer, and consumer group access for the roles:
cd /run/cloudera-scm-agent/process/1234-kafka-KAFKA_BROKER/
kinit -kt kafka.keytab kafka/host.hadoop.com@HADOOP.COM
kafka-sentry -lp -r kafka_dev_myuser_rw_role
HOST=*->TOPIC=my-topic->action=write
HOST=*->TOPIC=my-topic->action=describe

Alternatively, action=all can be granted instead of the individual actions.
kafka-sentry -lp -r kafka_dev_myuser_ro_role
HOST=*->TOPIC=my-topic->action=describe
HOST=*->TOPIC=my-topic->action=read
HOST=*->CONSUMERGROUP=*->action=describe
HOST=*->CONSUMERGROUP=*->action=read
Optionally, host and consumer group can be restricted to specific values:

HOST=127.0.0.1->TOPIC=my-topic->action=read
HOST=10.0.0.123->CONSUMERGROUP=my-flume-group->action=read
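The -lp listings above show the end state; the privileges themselves are added with kafka-sentry's grant option (-gpr, i.e. --grant_privilege_role). A sketch (hypothetical helper; verify the flags against your kafka-sentry version) that prints the grant commands for review instead of running them:

```shell
# Hypothetical helper: print (not run) the kafka-sentry grant commands for
# the privileges listed above; topic and role prefix are parameters.
gen_sentry_grants() {
  topic=$1
  prefix=$2
  # rw role: write + describe on the topic
  for action in write describe; do
    echo "kafka-sentry -gpr -r ${prefix}_rw_role -p \"HOST=*->TOPIC=${topic}->action=${action}\""
  done
  # ro role: describe + read on the topic and on all consumer groups
  for action in describe read; do
    echo "kafka-sentry -gpr -r ${prefix}_ro_role -p \"HOST=*->TOPIC=${topic}->action=${action}\""
    echo "kafka-sentry -gpr -r ${prefix}_ro_role -p \"HOST=*->CONSUMERGROUP=*->action=${action}\""
  done
}

gen_sentry_grants my-topic kafka_dev_myuser
```

Pipe the output to sh once it looks right.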
5. Send data using Kafka console producer:
Initialize a Kerberos session from the keytab:
kinit -kt /path/to/me.keytab me@HADOOP.COM
If that fails because of an expired password, change the password and re-create the Kerberos keytab:
ipa-getkeytab -s my-kdc-server.hadoop.com -p me@HADOOP.COM -P -k me.keytab
Create a JKS truststore from the .crt certificate file:
keytool -importcert -file /path/to/gsnonpublicroot2.crt -keystore /path/to/my.jks
Enter keystore password: mypass1234
Trust this certificate? [no]: yes
Certificate was added to keystore
producer.properties:
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
ssl.truststore.location=/path/to/my.jks
ssl.truststore.password=mypass1234
export KAFKA_OPTS="-Djava.security.auth.login.config=jaas.conf"
kafka-console-producer --broker-list host1:9093,host2:9093 --topic my-topic --producer.config producer.properties
20/12/14 03:49:05 INFO producer.ProducerConfig: ProducerConfig values: ...
Debug is true storeKey true useTicketCache false useKeyTab true doNotPrompt false ticketCache is null isInitiator true KeyTab is /path/to/me.keytab refreshKrb5Config is false principal is me@HADOOP.COM tryFirstPass is false useFirstPass is false storePass is false clearPass is false
principal is me@HADOOP.COM
Will use keytab
Commit Succeeded
...
20/12/14 03:49:06 INFO authenticator.AbstractLogin: Successfully logged in.
20/12/14 03:49:06 INFO utils.AppInfoParser: Kafka version : 1.0.1-kafka-3.1.1
20/12/14 03:49:06 INFO utils.AppInfoParser: Kafka commitId : unknown
>hello from bartek
>^C
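The lines "Will use keytab" and "Commit Succeeded" in the JAAS debug output are what confirm the keytab login worked. A sketch (hypothetical helper) that scans a saved console log for them:

```shell
# Hypothetical check: scan a saved producer/consumer console log and confirm
# the JAAS keytab login succeeded ("Will use keytab" + "Commit Succeeded").
check_keytab_login() {
  log=$1
  if grep -q "Will use keytab" "$log" && grep -q "Commit Succeeded" "$log"; then
    echo "keytab login OK"
  else
    echo "keytab login FAILED"
    return 1
  fi
}
```

For example, capture the output with 2>&1 | tee producer.log and then run check_keytab_login producer.log.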
6. Read two messages using the Kafka console consumer:
A Kerberos session is not required here, because the consumer logs in with the keytab referenced in jaas.conf; klist may even report:

klist: No credentials cache found
consumer.properties:
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
sasl.mechanism=GSSAPI
#group.id=my-consumer-group
#auto.offset.reset=earliest
jaas.conf:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  debug=true
  storeKey=true
  useTicketCache=false
  renewTicket=true
  keyTab="/path/to/me.keytab"
  principal="me@HADOOP.COM";
};
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/jaas.conf"
kafka-console-consumer --bootstrap-server host1:9093,host2:9093 --consumer.config /path/to/consumer.properties --topic my-topic --from-beginning --formatter kafka.tools.DefaultMessageFormatter --property print.key=true --property print.value=true --max-messages 2
Debug is true storeKey true useTicketCache false useKeyTab true doNotPrompt false ticketCache is null isInitiator true KeyTab is /path/to/me.keytab refreshKrb5Config is false principal is me@HADOOP.COM tryFirstPass is false useFirstPass is false storePass is false clearPass is false
principal is me@HADOOP.COM
Will use keytab
Commit Succeeded
The consumer then prints the key and value of the first two messages (print.key/print.value) and exits, because --max-messages 2 was set.