Hi,
I've run into a problem: a Flink job that consumes from Kafka compiles and runs fine in IDEA, but fails at runtime after being packaged and submitted to the cluster. I have already placed the corresponding jar in Flink's lib directory, and the job submission itself reports no error. Could someone help me analyze the cause? Thanks!

Environment:
Flink: 1.7.2
Kafka: 1.1.0
Scala: 2.11.8

Error:
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
ClassLoader info: URL ClassLoader:
file: '/tmp/blobStore-0d69900e-5299-4be9-b3bc-060d06559034/job_e8fccb398c2d9de108051beb06ec64cc/blob_p-d1e0b6ace7b204eb42f56ce87b96bff39cc58289-d0b8e666fc70af746ebbd73ff8b38354' (valid JAR)
Class not resolvable through given classloader.
        at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:236)
        at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:104)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:267)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.flink.util.InstantiationUtil$ClassLoaderObjectInputStream.resolveClass(InstantiationUtil.java:78)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:566)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:552)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:540)
        at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:501)
        at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:224)
        ... 4 more
file: '/tmp/blobStore-0d69900e-5299-4be9-b3bc-060d06559034/job_e8fccb398c2d9de108051beb06ec64cc/blob_p-d1e0b6ace7b204eb42f56ce87b96bff39cc58289-d0b8e666fc70af746ebbd73ff8b38354' (valid JAR)
Is this file perhaps different from the one on the other nodes?
That file doesn't exist! I have no idea where it came from; I resubmitted the job and still could not find the temporary file path mentioned in the error.
-----Original Message-----
From: 李军 <[hidden email]>
Sent: November 5, 2019 15:12
To: [hidden email]
Subject: Re: Flink job fails with an exception when submitted to the cluster
In reply to this post by Zhong venb
Hello, could you share the pom file of this Flink job? My guess is that you started from the official quickstart and modified it. If so, you need to activate the extra profile add-dependencies-for-IDEA, and either remove the <scope>provided</scope> tag from the flink-connector-kafka-0.?_2.11 dependency, or, following the profile's pattern, declare flink-connector-kafka-0.?_2.11 with <scope>compile</scope>. Only then will the dependency be bundled into the job jar.
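For reference, a dependency declared roughly like the sketch below ends up bundled into the job jar instead of being treated as provided. This is only a sketch based on the versions mentioned in this thread; the universal connector artifact flink-connector-kafka_2.11 for Flink 1.7.2 is assumed here because the failing class FlinkKafkaConsumer has no Kafka-version suffix, so adjust the artifactId and version to whatever your pom actually uses:

<!-- Sketch only: artifact and version assumed from the thread (Flink 1.7.2, Scala 2.11). -->
<!-- The universal Kafka connector is what provides
     org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.7.2</version>
    <!-- compile scope (the default) lets the connector be packaged into the fat jar;
         provided scope would leave it out and lead to ClassNotFoundException on the cluster -->
    <scope>compile</scope>
</dependency>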
-----Original Message-----
From: Zhong venb <[hidden email]>
Sent: November 5, 2019 15:04
To: [hidden email]
Subject: Flink job fails with an exception when submitted to the cluster
The pom file has been uploaded. Regarding the point you raised, I did check: the jar I built does include the Kafka-related classes, and I have also placed the corresponding jar under Flink's lib directory (a typical packaging setup is sketched below).
-----Original Message-----
From: 赵 恒泰 <[hidden email]>
Sent: November 5, 2019 16:35
To: [hidden email]
Subject: Re: Flink job fails with an exception when submitted to the cluster

Attachment: pom.xml (12K)
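As referenced above, a typical fat-jar build for a Flink job looks roughly like the following. This is only a sketch under assumed plugin version and exclusions (quickstart-style maven-shade-plugin configuration); it is not the poster's actual pom, which was attached separately:

<!-- Sketch of a typical fat-jar build; plugin version and excludes are assumptions. -->
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.1.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <!-- logging and shading helpers stay out of the jar;
                                     connectors such as flink-connector-kafka are NOT
                                     excluded, so they get bundled into the fat jar -->
                                <exclude>org.apache.flink:flink-shaded-force-shading</exclude>
                                <exclude>org.slf4j:*</exclude>
                                <exclude>log4j:*</exclude>
                            </excludes>
                        </artifactSet>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

With a setup like this, listing the contents of the built jar (for example with jar tf) and searching for FlinkKafkaConsumer is a quick way to confirm whether the connector classes were really bundled.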
If the kafka-connector has already been packaged into your job jar, then remove the connector jar from the flink/lib directory.
Zhong venb <[hidden email]> wrote on Tue, Nov 5, 2019 at 16:49:
> The pom file has been uploaded. Regarding the point you raised, I did check: the jar I built does include the Kafka-related classes, and I have also placed the corresponding jar under Flink's lib directory.
Hi,
I have hit the same problem. May I ask how it was resolved?

My job jar is built as a plain jar (not jar-with-dependencies), and the dependency jars are placed in a custom directory, lib1. I modified the bin/flink script so that CC_CLASSPATH also includes this custom lib1 directory, and the class can be found there:

$ grep org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer *
Binary file flink-sql-connector-kafka_2.12-1.11.1.jar matches

There are no duplicate files between flink/lib and the custom lib directory.

org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
ClassLoader info: URL ClassLoader:
file: '/tmp/blobStore-72796b24-2be1-4bc7-ac86-7acd1fe16b48/job_f532b11a7342424cdc0695126071f96e/blob_p-3dc0956f6379ef65c8f54997d7fe4a4d0918064c-b96f6980c6e06d9618abd63d25c1cee6' (valid JAR)
Class not resolvable through given classloader.
        at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperatorFactory(StreamConfig.java:288) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:126) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:453) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:522) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:721) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
        at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_251]
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_251]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_251]
        at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:61) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
I have tracked down the problem: in my case it was caused by inconsistent Scala versions.
Some of the Maven dependencies were pulled in for Scala 2.11 and others for 2.12; after unifying them on one version, the error no longer occurs.
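For anyone hitting the same thing: one common way to keep the Scala binary version consistent across all Flink artifacts is to factor it out into a Maven property and reference it in every artifactId. The snippet below is only an illustrative sketch (property names and versions are assumptions, not taken from the poster's pom):

<properties>
    <!-- pick one Scala binary version and use it everywhere -->
    <scala.binary.version>2.12</scala.binary.version>
    <flink.version>1.11.1</flink.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <!-- connector suffixed with the same Scala binary version as the Flink runtime -->
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
</dependencies>

The mismatch is in fact visible in the stack trace above: the task is running from flink-dist_2.11-1.11.1.jar while the connector jar in the custom lib directory is flink-sql-connector-kafka_2.12-1.11.1.jar, which appears to be exactly the 2.11/2.12 inconsistency described here.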