Help with a hive streaming error

Help with a hive streaming error

abc15606
Version: Flink 1.11.0


2020-08-24 13:33:03,019 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler   [] - Unhandled exception.
java.lang.IllegalAccessError: tried to access class org.apache.flink.streaming.api.functions.sink.filesystem.DefaultBucketFactoryImpl from class org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder
    at org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder.<init>(HadoopPathBasedBulkFormatBuilder.java:70) ~[?:?]
    at org.apache.flink.connectors.hive.HiveTableSink.consumeDataStream(HiveTableSink.java:197) ~[?:?]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:114) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:48) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:58) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlan(StreamExecLegacySink.scala:48) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:67) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:66) ~[flink-table-blink_2.11-1.11.0.jar:1.11.0]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.0.jar:1.11.0]

Re: Help with a hive streaming error

Rui Li
How did you add the hive-related dependencies? The two classes have the same package name, so in principle the access should be allowed. I'm not sure whether this is caused by the classes being loaded by different classloaders.
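
One quick way to check, for example (a rough diagnostic sketch, not something from this thread; the ClassLoaderCheck class name is made up, and it assumes it runs from the job jar's main() like the rest of the user code): resolve both classes and print their loaders.

// Hypothetical diagnostic: if the two loaders printed below differ, package-private
// access between the classes fails even though the package names are identical.
public class ClassLoaderCheck {
    public static void main(String[] args) throws Exception {
        Class<?> builder = Class.forName(
                "org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder");
        Class<?> factory = Class.forName(
                "org.apache.flink.streaming.api.functions.sink.filesystem.DefaultBucketFactoryImpl");
        System.out.println("HadoopPathBasedBulkFormatBuilder loaded by: " + builder.getClassLoader());
        System.out.println("DefaultBucketFactoryImpl loaded by:         " + factory.getClassLoader());
    }
}

If both lines print the same loader, the problem is probably somewhere else.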

--
Best regards!
Rui Li

Re: Help with a hive streaming error

liangck
In reply to this post by abc15606
I'm hitting the same problem; did you manage to solve it? I submit my job with flink-connector-hive and hive-exec packaged into the job jar. But flink-connector-hive contains the class org.apache.flink.streaming.api.functions.sink.filesystem.HadoopPathBasedBulkFormatBuilder, which references org.apache.flink.streaming.api.functions.sink.filesystem.DefaultBucketFactoryImpl from the streaming-java module. My guess is that the two classes are loaded by different classloaders, which makes the reference fail with this error.
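
That would match how the JVM defines access: a "runtime package" is the package name plus the defining classloader, so a package-private class is only visible to classes loaded by the same loader. A minimal sketch with made-up classes (nothing to do with Flink's actual code) of the same shape:

package com.example.samepkg;

// Package-private helper, playing the role of DefaultBucketFactoryImpl.
class Helper {
    static String make() {
        return "ok";
    }
}

// Public caller in the same source package, playing the role of HadoopPathBasedBulkFormatBuilder.
public class Caller {
    public static void main(String[] args) {
        // Compiles fine. At runtime this call only succeeds if Helper and Caller were
        // loaded by the same classloader; if Helper comes from the parent loader and
        // Caller from a child (user-code) loader, the JVM throws IllegalAccessError here.
        System.out.println(Helper.make());
    }
}

Compiled and run together this just prints "ok"; split the two classes across a parent and a child classloader and the call fails with the same IllegalAccessError as in the trace above.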




Re: Help with a hive streaming error

Rui Li
Hi,

If you suspect a classloading problem, you can try putting all of the dependency jars under lib/, so that they are guaranteed to be loaded by the same classloader.



--
Best regards!
Rui Li

Re: Help with a hive streaming error

liangck
In the end I added quite a few jars to flink/lib, and my job now works. However, hive-exec depends on protobuf-java 2.5.0 and bundles protobuf-java directly inside its jar. We have other jobs that depend on protobuf 3.5.1, which is incompatible, so the service fails to start... Does anyone have a good way to handle this?
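
To see which protobuf copy actually wins at runtime, one option is a small check along these lines (a rough sketch; the ProtobufOriginCheck name is made up, and com.google.protobuf.Message exists in both 2.5.0 and 3.5.1):

import java.security.CodeSource;

// Hypothetical diagnostic: print the classloader and the jar that the protobuf
// classes are actually served from (e.g. hive-exec vs. a standalone protobuf-java jar).
public class ProtobufOriginCheck {
    public static void main(String[] args) throws Exception {
        Class<?> pb = Class.forName("com.google.protobuf.Message");
        System.out.println("loaded by: " + pb.getClassLoader());
        CodeSource src = pb.getProtectionDomain().getCodeSource();
        System.out.println("from:      " + (src == null ? "(unknown)" : src.getLocation()));
    }
}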




Re: Help with a hive streaming error

liangck
In reply to this post by Rui Li
In the end I added quite a few jars to flink/lib and the job runs now. But the protobuf that hive-exec depends on is version 2.5.0 and is bundled into its jar, which is incompatible with the protobuf 3.5.1 that other jobs depend on... Any good suggestions?




Re: Help with a hive streaming error

Rui Li
Could you try not putting the hive-exec and PB 3.5.1 jars under lib/, and instead specifying those two dependencies via command-line parameters?




--
Best regards!
Rui Li