Flink connect hive with hadoop HA

5 messages

Flink connect hive with hadoop HA

sunfulin
Hi, guys
I am using Flink 1.10 and testing functional cases with Hive integration (Hive 1.1.0-cdh5.3.0, with Hadoop HA enabled). Running the Flink job, I can see a successful connection to the Hive metastore, but I cannot read table data; the job fails with this exception:


java.lang.IllegalArgumentException: java.net.UnknownHostException: nameservice1
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:668)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:604)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2598)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)


I am running a standalone application. It looks like my Hadoop conf files are missing from my Flink job's classpath. Where should I configure this?
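For reference, this exception usually means the HDFS client does not know that `nameservice1` is an HA logical name and tries to resolve it as a hostname. A minimal sketch of the hdfs-site.xml entries the client side would need is below; the NameNode IDs and hostnames are placeholders, not this cluster's actual values:

```xml
<!-- Minimal client-side HA resolution sketch. The nn1/nn2 IDs and the
     example.com hostnames are placeholders; they must match the values
     used by the actual cluster. -->
<configuration>
  <property>
    <name>dfs.nameservices</name>
    <value>nameservice1</value>
  </property>
  <property>
    <name>dfs.ha.namenodes.nameservice1</name>
    <value>nn1,nn2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.nameservice1.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.nameservice1.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <property>
    <name>dfs.client.failover.proxy.provider.nameservice1</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>
```

With these properties visible to the client, the failover proxy provider resolves `nameservice1` to the active NameNode instead of attempting a DNS lookup.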

Re: Flink connect hive with hadoop HA

Khachatryan Roman
Hi,

Could you please provide a full stacktrace?

Regards,
Roman

Re: Flink connect hive with hadoop HA

Bowen Li
Hi sunfulin,

Sounds like you didn't configure Hadoop HA correctly on the client side; see [1]. Let us know if it helps resolve the issue.

[1]
https://stackoverflow.com/questions/25062788/namenode-ha-unknownhostexception-nameservice1

Re: Re: Flink connect hive with hadoop HA

sunfulin
Hi, guys
Thanks for the kind replies. Actually, I want to know how to change the client-side Hadoop conf while using the Table API within my program. I'd appreciate any useful suggestions.
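For illustration, a minimal sketch of registering a `HiveCatalog` from the Table API in Flink 1.10 follows; the catalog name, database, conf path, and Hive version are placeholders for this setup's actual values. Note that only the Hive conf directory is passed through this API; the Hadoop conf is picked up separately from the environment:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogSketch {
    public static void main(String[] args) {
        // Blink planner in batch mode, as recommended for Hive in Flink 1.10.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // hiveConfDir points at the directory containing hive-site.xml.
        // "/etc/hive/conf" and the catalog/database names are placeholders.
        HiveCatalog hive = new HiveCatalog(
                "myhive", "default", "/etc/hive/conf", "1.1.0");
        tableEnv.registerCatalog("myhive", hive);
        tableEnv.useCatalog("myhive");
    }
}
```

The HA nameservice resolution itself still comes from the Hadoop conf files on the client, not from this code.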

Re: Re: Flink connect hive with hadoop HA

Robert Metzger
There's a configuration value "env.hadoop.conf.dir" to set the Hadoop
configuration directory:
https://ci.apache.org/projects/flink/flink-docs-master/ops/config.html#env-hadoop-conf-dir
If the files in that directory correctly configure Hadoop HA, the client
side should pick up the config.
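As a sketch, assuming a typical Hadoop conf layout (the /etc/hadoop/conf path and the jar name are placeholders):

```shell
# Option 1: environment variable, picked up by the Flink client at startup
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Option 2: equivalent flink-conf.yaml entry
#   env.hadoop.conf.dir: /etc/hadoop/conf

# The directory should contain core-site.xml and hdfs-site.xml with the
# HA nameservice definitions, so "nameservice1" can be resolved.
./bin/flink run my-job.jar
```

Either way works; the key point is that the conf directory is visible to the client JVM before the job submits.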