Problem connecting the Flink SQL Client to a Kerberos-authenticated Hive


john
Flink version: 1.10.1
Hive version: 2.11
Security settings in flink-conf.yaml:
# security
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /tmp/bigdata.keytab
security.kerberos.login.principal: [hidden email]

Problem description:
When I submit a session cluster with bin/yarn-session.sh, the log shows a successful login as user bigdata.
2020-05-29 11:45:05,833 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: yarn.application-attempts, 4
2020-05-29 11:45:05,833 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: security.kerberos.login.use-ticket-cache, true
2020-05-29 11:45:05,833 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: security.kerberos.login.keytab, /tmp/bigdata.keytab
2020-05-29 11:45:05,834 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: security.kerberos.login.principal, [hidden email]
2020-05-29 11:45:06,576 INFO  org.apache.hadoop.security.UserGroupInformation               - Login successful for user bigdata using keytab file /tmp/bigdata.keytab
2020-05-29 11:45:06,577 INFO  org.apache.flink.runtime.security.modules.HadoopModule        - Hadoop user set to bigdata(auth:KERBEROS), credentials check status: true

However:
When I use sql-client.sh to connect to Hive and submit a simple query, the security settings in flink-conf.yaml do not seem to be picked up.
That is, it still runs as the user I am logged in with in my shell.

How should sql-client-defaults.yaml be configured so that flink-conf.yaml is read correctly and its configuration is used when submitting jobs to YARN? Many thanks!
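For reference, in Flink 1.10 a Hive catalog is registered in sql-client-defaults.yaml roughly as sketched below; the catalog name, the hive-conf-dir path, and the hive-version value are placeholders that need to match the actual environment:

```yaml
# sql-client-defaults.yaml -- sketch only; names and paths are illustrative
catalogs:
  - name: myhive                     # hypothetical catalog name
    type: hive
    hive-conf-dir: /etc/hive/conf    # directory containing hive-site.xml
    hive-version: 2.1.1              # example; set to the actual Hive version

execution:
  planner: blink
  type: batch
```

Note that sql-client-defaults.yaml only configures catalogs and execution behavior; the Kerberos options (security.kerberos.login.*) are not read from this file and stay in flink-conf.yaml.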
Re: Problem connecting the Flink SQL Client to a Kerberos-authenticated Hive

godfrey he
hi john, are you running the SQL Client in yarn per-job or yarn session mode?
The security-related settings need to go into flink-conf.yaml, because the SQL Client is not responsible for starting a Flink cluster;
it only submits SQL jobs.
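Assuming yarn session mode, the usual flow in Flink 1.10 is to start the session first (that is where the keytab login from flink-conf.yaml happens) and then attach the SQL Client to it; the commands below are a sketch and the paths are illustrative:

```shell
# Start a detached YARN session. The session client reads the
# security.kerberos.* options from conf/flink-conf.yaml and performs the
# keytab login before talking to YARN.
./bin/yarn-session.sh -d

# yarn-session.sh writes a hidden properties file (.yarn-properties-<user>
# in the temp directory); the SQL Client picks it up and submits queries
# to that already-authenticated session.
./bin/sql-client.sh embedded -d conf/sql-client-defaults.yaml
```

In this flow the SQL Client itself never needs to authenticate against YARN; the Kerberos credentials are established by the cluster started via yarn-session.sh.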

Best,
Godfrey

john <[hidden email]> wrote on Fri, May 29, 2020, at 12:09 PM: