pyflink kafka connector error: ByteArrayDeserializer

pyflink kafka connector error: ByteArrayDeserializer

qianhuan
The author has deleted this message.

Re: pyflink kafka connector error: ByteArrayDeserializer

qianhuan
Could this be a connector version issue? It used to run on 1.12.2. Could anyone take a look?




Re: pyflink kafka connector error: ByteArrayDeserializer

Dian Fu
In reply to this post by qianhuan
You need to use the fat jar: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-kafka_2.11/1.13.1/flink-sql-connector-kafka_2.11-1.13.1.jar
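
For reference, a minimal sketch of the fix, assuming the fat jar from the link above has been downloaded into the same /root/flink/jars/ directory used in the original post (adjust the path to your environment):

from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment, EnvironmentSettings

env = StreamExecutionEnvironment.get_execution_environment()
env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
t_env = StreamTableEnvironment.create(
    stream_execution_environment=env, environment_settings=env_settings)

# Point pipeline.jars at the fat jar (flink-sql-connector-kafka), which bundles
# the Kafka client classes such as ByteArrayDeserializer. The thin
# flink-connector-kafka jar used in the original post does not include them,
# which is what triggers the NoClassDefFoundError.
t_env.get_config().get_configuration().set_string(
    "pipeline.jars",
    "file:///root/flink/jars/flink-sql-connector-kafka_2.11-1.13.1.jar")

With the fat jar on the pipeline classpath, the rest of the original program below can stay unchanged.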


> On Jun 2, 2021, at 2:43 PM, qianhuan <[hidden email]> wrote:
>
> Versions:
> python 3.8
> apache-flink           1.13.1
> apache-flink-libraries 1.13.1
>
> Code:
> from pyflink.datastream import StreamExecutionEnvironment
> from pyflink.table import StreamTableEnvironment, EnvironmentSettings
>
> def log_processing():
>    env = StreamExecutionEnvironment.get_execution_environment()
>    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
>    t_env = StreamTableEnvironment.create(
>        stream_execution_environment=env,
>        environment_settings=env_settings)
>    t_env.get_config().get_configuration().set_string(
>        "pipeline.jars",
>        "file:///root/flink/jars/flink-connector-kafka_2.11-1.13.1.jar")
>
>    kafka_source_ddl = f"""
>            CREATE TABLE kafka_source_table(
>                a VARCHAR
>            ) WITH (
>              'connector.type' = 'kafka',
>              'connector.version' = 'universal',
>              'connector.topic' = 'test5',
>              'connector.properties.bootstrap.servers' = 'localhost:9092',
>              'connector.properties.zookeeper.connect' = 'localhost:2181',
>              'connector.startup-mode' = 'latest-offset',
>              'format.type' = 'json'
>            )
>            """
>
>    kafka_sink_ddl = f"""
>            CREATE TABLE kafka_sink_table(
>                b VARCHAR
>            ) WITH (
>              'connector.type' = 'kafka',
>              'connector.version' = 'universal',
>              'connector.topic' = 'test6',
>              'connector.properties.bootstrap.servers' = 'localhost:9092',
>              'connector.properties.zookeeper.connect' = 'localhost:2181',
>              'format.type' = 'json'
>            )
>            """
>
>    t_env.execute_sql(kafka_source_ddl)
>    t_env.execute_sql(kafka_sink_ddl)
>    print("all_tables", t_env.list_tables())
>    t_env.sql_query("SELECT a FROM kafka_source_table") \
>        .execute_insert("kafka_sink_table").wait()
>
> if __name__ == '__main__':
>    log_processing()
>
> Error:
> py4j.protocol.Py4JJavaError: An error occurred while calling o65.executeInsert.
> : java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.setDeserializer(FlinkKafkaConsumer.java:322)
>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:223)
>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:154)
>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:139)
>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:108)
>     at org.apache.flink.streaming.connectors.kafka.KafkaTableSource.createKafkaConsumer(KafkaTableSource.java:106)
>     at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceBase.getKafkaConsumer(KafkaTableSourceBase.java:293)
>     at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceBase.getDataStream(KafkaTableSourceBase.java:194)
>     at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecLegacyTableSourceScan.translateToPlanInternal(CommonExecLegacyTableSourceScan.java:94)
>     at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134)
>     at org.apache.flink.table.planner.plan.nodes.exec.ExecEdge.translateToPlan(ExecEdge.java:247)
>     at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecLegacySink.translateToTransformation(CommonExecLegacySink.java:172)
>     at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecLegacySink.translateToPlanInternal(CommonExecLegacySink.java:112)
>     at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134)
>     at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:70)
>     at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:69)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:891)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
>     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>     at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69)
>     at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518)
>     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:740)
>     at org.apache.flink.table.api.internal.TableImpl.executeInsert(TableImpl.java:572)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>     at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>     at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>     at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
>     at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>     at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
>     at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
>     at java.base/java.lang.Thread.run(Thread.java:834)
> Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
>     at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
>     at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
>     ... 40 more