flink kafka SQL Connectors: passing Kerberos parameters

flink kafka SQL Connectors: passing Kerberos parameters

lydata
 Flink v1.11.1, Kafka secured with Kerberos.
Does the DDL below support the Kerberos parameters?


CREATE TABLE kafkaTable (
  ...
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'testGroup',
  'security.protocol' = 'SASL_PLAINTEXT',
  'sasl.mechanism' = 'GSSAPI',
  'sasl.kerberos.service.name' = 'kafka',
  'format' = 'csv',
  'scan.startup.mode' = 'earliest-offset'
)


Are the parameters above supported?

Re: flink kafka SQL Connectors: passing Kerberos parameters

lydata


Are these 3 parameters required, or are the parameters below supported?




'security.protocol' = 'SASL_PLAINTEXT',
'sasl.mechanism' = 'GSSAPI',
'sasl.kerberos.service.name' = 'kafka',

On 2020-07-30 16:38:11, "lydata" <[hidden email]> wrote:


Re: flink kafka SQL Connectors: passing Kerberos parameters

Leonard Xu
Hi,
Kafka client properties can be passed through with the `properties.` prefix. Try the following:

'properties.security.protocol' = 'SASL_PLAINTEXT',
'properties.sasl.mechanism' = 'GSSAPI',
'properties.sasl.kerberos.service.name' = 'kafka',

Best,
Leonard
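
Putting Leonard's suggestion together with the DDL from the first message, a complete definition might look like the sketch below. The topic, bootstrap servers, and group id are the example values from the thread; the point is that only `properties.`-prefixed keys are forwarded to the Kafka client, which is why the bare `security.protocol`-style keys in the original DDL do not take effect:

```sql
CREATE TABLE kafkaTable (
  ...
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'testGroup',
  -- forwarded to the Kafka client after the 'properties.' prefix is stripped
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.mechanism' = 'GSSAPI',
  'properties.sasl.kerberos.service.name' = 'kafka',
  'format' = 'csv',
  'scan.startup.mode' = 'earliest-offset'
)
```

Note that these options only tell the Kafka client which security protocol to use; the Kerberos credentials themselves (principal and keytab) are typically configured separately on the Flink side, e.g. via the `security.kerberos.login.keytab` and `security.kerberos.login.principal` settings in flink-conf.yaml.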


> On July 30, 2020, at 17:00, lydata <[hidden email]> wrote:


Re: Re: flink kafka SQL Connectors: passing Kerberos parameters

lydata

Thanks, I'll give it a try.

On 2020-07-30 17:34:41, "Leonard Xu" <[hidden email]> wrote:


Re: flink kafka SQL Connectors: passing Kerberos parameters

Leonard Xu
I'm not sure whether this resolves your problem.

I noticed the documentation is currently missing the part about passing through Kafka properties, so I opened an issue [1] to fill in the docs.

Best
Leonard
[1] https://issues.apache.org/jira/browse/FLINK-18768


> On July 30, 2020, at 17:52, lydata <[hidden email]> wrote: