[flink-1.11] Reading from Kafka and writing to Hive: runs in IDEA, fails when packaged as a jar and submitted to YARN

[flink-1.11] Reading from Kafka and writing to Hive: runs in IDEA, fails when packaged as a jar and submitted to YARN

nashcen
The code runs successfully in IDEA, but fails when packaged as a jar and submitted to YARN. At first I thought a jar was missing, so I bundled all dependency jars into the package and set the global dependency scope to compile, but the error persists.

Startup command:
nohup \
$FLINK_HOME/bin/flink run \
--class com.athub.dcpoints.scala.connector.table.hive.OdsDcpointsProdKafkaFlinkHiveApp \
--target yarn-per-job \
--jobmanager yarn-cluster \
--yarnjobManagerMemory 1024m \
--yarntaskManagerMemory  4096m \
--parallelism 4 \
/bigdata/athub/deploy/kafka-flink-hive-1.0.jar \
>/dev/null 2>/bigdata/athub/deploy/kafka-flink-hive-err.log &
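A note on the flags: --target yarn-per-job selects Flink 1.11's generic CLI, while --jobmanager yarn-cluster and the --yarnjobManagerMemory/--yarntaskManagerMemory options belong to the older -m yarn-cluster style and may be ignored once --target is set. A minimal sketch of an equivalent submission using -D configuration keys, assuming the standard Flink 1.11 memory options:

nohup \
$FLINK_HOME/bin/flink run \
--target yarn-per-job \
-Djobmanager.memory.process.size=1024m \
-Dtaskmanager.memory.process.size=4096m \
--parallelism 4 \
--class com.athub.dcpoints.scala.connector.table.hive.OdsDcpointsProdKafkaFlinkHiveApp \
/bigdata/athub/deploy/kafka-flink-hive-1.0.jar \
>/dev/null 2>/bigdata/athub/deploy/kafka-flink-hive-err.log &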


Error log:
------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unable to create a source for reading table 'hive_catalog.dc_ods.ods_dcpoints_prod_kafka_source'.

Table options are:

'connector'='kafka'
'format'='json'
'is_generic'='true'
'json.fail-on-missing-field'='false'
'json.ignore-parse-errors'='true'
'properties.bootstrap.servers'='prod-bigdata-03:9092,prod-bigdata-04:9092,prod-bigdata-05:9092,prod-bigdata-06:9092'
'scan.startup.mode'='earliest-offset'
'topic'='ods_dcpoints_prod'
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198)
        at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149)
        at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:699)
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:232)
        at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:916)
        at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:992)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:992)
Caused by: org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'hive_catalog.dc_ods.ods_dcpoints_prod_kafka_source'.
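For context, the options listed above describe a Kafka source table stored as a generic (non-Hive) table in the HiveCatalog; 'is_generic'='true' is added by the catalog itself rather than written by the user. A sketch of the DDL that would yield these options follows; the column list is hypothetical, since the actual schema does not appear in the log:

CREATE TABLE hive_catalog.dc_ods.ods_dcpoints_prod_kafka_source (
  -- hypothetical columns; only the WITH clause is taken from the error log
  point_id STRING,
  point_value DOUBLE,
  event_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'ods_dcpoints_prod',
  'properties.bootstrap.servers' = 'prod-bigdata-03:9092,prod-bigdata-04:9092,prod-bigdata-05:9092,prod-bigdata-06:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json',
  'json.fail-on-missing-field' = 'false',
  'json.ignore-parse-errors' = 'true'
);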





Re: [flink-1.11] Reading from Kafka and writing to Hive: runs in IDEA, fails when packaged as a jar and submitted to YARN

Rui Li
Hi,

You can check whether the kafka connector dependency is missing. Another possibility is that the SPI service files got overwritten when building the fat jar; in that case you can use the maven shade plugin's ServicesResourceTransformer to merge the service files from the different jars.
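For reference, a minimal sketch of that configuration in pom.xml (the plugin version is an assumption; the transformer class ships with the shade plugin itself):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Concatenate META-INF/services files instead of letting one jar's copy overwrite the others -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

Without this transformer, whichever dependency is shaded last overwrites META-INF/services/org.apache.flink.table.factories.Factory in the fat jar, so the Kafka table factory is never discovered, which matches the "Unable to create a source" error above.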



--
Best regards!
Rui Li