Flink on YARN consuming a Kerberos-enabled Kafka


zjfplayer@hotmail.com
Hi all,
        How do you consume from a Kerberos-enabled Kafka with Flink on YARN, and how do you sink to a Kerberos-enabled HBase?

        So far I have tried adding the following to flink-conf.yaml:
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /home/zjf/zjf.keytab
security.kerberos.login.principal: zjf@TDH
security.kerberos.login.contexts: Client,KafkaClient
zookeeper.sasl.service-name: zookeeper
zookeeper.sasl.login-context-name: Client
        The keytab itself works: I can run Kafka's bundled console producer/consumer shell scripts with it. But when the Flink job actually runs, it fails with org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata, and nothing is consumed.
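A common cause of this particular TimeoutException is the Kafka client connecting in plaintext to a SASL-secured listener, so the metadata request never succeeds. As a sketch (the broker address and group id below are placeholders, and this assumes the brokers expose a SASL_PLAINTEXT listener), the properties handed to the Flink Kafka consumer would need the SASL settings too, not only the flink-conf.yaml entries:

```properties
# Sketch of consumer properties for the Flink Kafka consumer;
# broker address and group id are placeholders, not from this post.
bootstrap.servers=broker1.example.com:9092
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
group.id=test-group
```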

        I have also tried adding the following to the yaml:
env.java.opts.jobmanager: -Djava.security.auth.login.config=/home/zjf/jaas.conf -Djava.security.krb5.conf=/home/zjf/krb5.conf -Dsun.security.krb5.debug=true
env.java.opts.taskmanager: -Djava.security.auth.login.config=/home/zjf/jaas.conf -Djava.security.krb5.conf=/home/zjf/krb5.conf  -Dsun.security.krb5.debug=true
        With that, it immediately fails with a "cannot locate default realm" error.
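"cannot locate default realm" generally means the JVM never found a usable krb5.conf (for example, the configured path does not exist inside the YARN containers, or the file defines no default realm). A minimal sketch of what such a file needs, assuming from the principal zjf@TDH that the realm is named TDH (the KDC host below is a placeholder):

```ini
# Minimal krb5.conf sketch; realm name inferred from zjf@TDH,
# KDC host is a placeholder.
[libdefaults]
    default_realm = TDH

[realms]
    TDH = {
        kdc = kdc.example.com
    }
```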

        The Flink version is 1.8.1.



Re: Flink on YARN consuming a Kerberos-enabled Kafka

蒋佳成(Jiacheng Jiang)
Your krb5.conf is not taking effect. I set it through an environment variable instead: FLINK_ENV_JAVA_OPTS=-Djava.security.krb5.conf=xxxxx/krb5.conf
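One way to apply this suggestion (a sketch; the krb5.conf path is a placeholder and the submit arguments are elided) is to export the variable in the shell that submits the job, so the launched JVM picks up the realm configuration:

```shell
# Path is a placeholder; FLINK_ENV_JAVA_OPTS is read by Flink's
# shell scripts when building the JVM options.
export FLINK_ENV_JAVA_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf"
flink run -m yarn-cluster ...
```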




------------------ Original message ------------------
From: "[hidden email]" <[hidden email]>
Sent: Saturday, May 9, 2020, 4:00 PM
To: "user-zh" <[hidden email]>
Subject: flink on yarn consuming a Kerberos-enabled Kafka


