Hello,
When using Flink SQL 1.10.0 with a source table that has a complex schema, for example:

    create table xxx (
      a string,
      b row(
        c row(d string)
      )
    )

then, when c has a value, running SQL like insert into ... select * from xxx fails with the following error:

    Caused by: java.lang.ClassCastException: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.NullNode cannot be cast to org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$assembleRowConverter$dd344700$1(JsonRowSerializationSchema.java:337)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$wrapIntoNullableConverter$1fa09b5b$1(JsonRowSerializationSchema.java:189)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$assembleRowConverter$dd344700$1(JsonRowSerializationSchema.java:345)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$wrapIntoNullableConverter$1fa09b5b$1(JsonRowSerializationSchema.java:189)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$assembleRowConverter$dd344700$1(JsonRowSerializationSchema.java:345)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.lambda$wrapIntoNullableConverter$1fa09b5b$1(JsonRowSerializationSchema.java:189)
      at org.apache.flink.formats.json.JsonRowSerializationSchema.serialize(JsonRowSerializationSchema.java:138)
      ... 38 more

Best wishes.
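For context, a minimal reproduction sketch of the scenario above. Only the nested-ROW schema comes from the report; the sink table, topic names, and all WITH options are illustrative assumptions (any source/sink that uses the JSON format should hit the same code path):

    -- assumed source table, using the nested ROW schema from the report
    CREATE TABLE xxx (
      a STRING,
      b ROW<c ROW<d STRING>>
    ) WITH (
      'connector.type' = 'kafka',                                   -- assumption: a Kafka source in 1.10-style DDL
      'connector.version' = 'universal',
      'connector.topic' = 'source_topic',
      'connector.properties.bootstrap.servers' = 'localhost:9092',
      'format.type' = 'json'                                        -- the JSON format is where the exception is thrown
    );

    -- assumed sink table with the same schema, also using the JSON format
    CREATE TABLE yyy (
      a STRING,
      b ROW<c ROW<d STRING>>
    ) WITH (
      'connector.type' = 'kafka',
      'connector.version' = 'universal',
      'connector.topic' = 'sink_topic',
      'connector.properties.bootstrap.servers' = 'localhost:9092',
      'format.type' = 'json'
    );

    -- on Flink 1.10.0 this statement fails inside JsonRowSerializationSchema
    -- with the ClassCastException quoted above
    INSERT INTO yyy SELECT * FROM xxx;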
Hi Peihui,
This is a known bug [1]; it has already been fixed in 1.10.1 and 1.11.0, so please try either of those versions.

[1] https://issues.apache.org/jira/browse/FLINK-16220

Best,
Benchao Li
Hi BenChao,
I just tried Flink 1.10.1, but the problem is still there. Looking at [1] https://issues.apache.org/jira/browse/FLINK-16628, the table in that bug fix is not as complex as the one I described:

    CREATE TABLE source_kafka_sasl (
      svt STRING,
      ops ROW<a ROW(b STRING)>
    ) WITH ()

In my case there is yet another ROW nested inside ops.
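For illustration only, a hedged sketch of what that extra level of nesting inside ops might look like compared with the FLINK-16628 example; the field names are hypothetical, not the actual schema of my job, and the connector options are elided as in the original message:

    -- FLINK-16628 example:                      ops ROW<a ROW(b STRING)>
    -- hypothetical schema with one more nested ROW inside ops:
    CREATE TABLE source_kafka_sasl (
      svt STRING,
      ops ROW<a ROW<b ROW<c STRING>>>
    ) WITH ()  -- connector options elided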
Hi BenChao,
Switching to 1.10.1 works. The previous email's failure was because the flink-kafka dependency version had not been updated along with it.

Thank you.
Nice 👍 (Although this should just be a bug in the json format, unrelated to the connector.)
Best,
Benchao Li