SQL: type-conversion error when parsing a nested JSON array


SQL: type-conversion error when parsing a nested JSON array

claylin
Hi all, I have a nested JSON array here and I'm getting a type-conversion error (it fails at ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3))). Is it simply not allowed to write it this way?
create table hiido_push_sdk_mq (
datas   ARRAY<ROW<`from` string,hdid string,event string,hiido_time bigint,ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE>>
) with (
'connector' = 'kafka',
'topic' = 'hiido_pushsdk_event',
'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103',
'properties.group.id' = 'push_click_sql_version_consumer',
'scan.startup.mode' = 'latest-offset',
'format.type' = 'json');




The error is as follows:
[ERROR] 2020-07-17 20:17:50,640(562284338) --> [http-nio-8080-exec-10] com.yy.push.flink.sql.gateway.sql.parse.SqlCommandParser.parseBySqlParser(SqlCommandParser.java:77): parseBySqlParser, parse: com.yy.push.flink.sql.gateway.context.JobContext$1@5d5f32d1, stmt: create table hiido_push_sdk_mq (    datas   ARRAY<ROW<`from` string,hdid string,event string,hiido_time bigint,ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE>>) with ('connector' = 'kafka','topic' = 'hiido_pushsdk_event','properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103','properties.group.id' = 'push_click_sql_version_consumer','scan.startup.mode' = 'latest-offset','format.type' = 'json'), error info: SQL parse failed. Encountered "AS" at line 1, column 115.
Was expecting one of:
    "ROW" ...
    <BRACKET_QUOTED_IDENTIFIER> ...
    <QUOTED_IDENTIFIER> ...
    <BACK_QUOTED_IDENTIFIER> ...
    <IDENTIFIER> ...
    <UNICODE_QUOTED_IDENTIFIER> ...
    "STRING" ...
    "BYTES" ...
    "ARRAY" ...
    "MULTISET" ...
    "RAW" ...
    "BOOLEAN" ...
    "INTEGER" ...
    "INT" ...
    "TINYINT" ...
    "SMALLINT" ...
    "BIGINT" ...
    "REAL" ...
    "DOUBLE" ...
    "FLOAT" ...
    "BINARY" ...
    "VARBINARY" ...
    "DECIMAL" ...
    "DEC" ...
    "NUMERIC" ...
    "ANY" ...
    "CHARACTER" ...
    "CHAR" ...
    "VARCHAR" ...
    "DATE" ...
    "TIME" ...
    "TIMESTAMP" ...

Re: SQL: type-conversion error when parsing a nested JSON array

Benchao Li-2
Computed columns can only be declared at the outermost level of the table; you cannot put a computed column inside a nested type.




--

Best,
Benchao Li
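
For illustration, a minimal sketch of the rule above, using a hypothetical flat table (push_event_flat is an invented name; this is not the poster's nested layout): when the event fields sit at the top level of the schema, the computed column and the WATERMARK declaration are accepted, because they are declared at the table level rather than inside a ROW/ARRAY type. The connector options are copied verbatim from the original post.

-- hypothetical flat schema, for illustration only
create table push_event_flat (
  `from`     string,
  hdid       string,
  event      string,
  hiido_time bigint,
  -- computed column: allowed here because it is declared at the top level of the table
  ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),
  -- watermark on the computed column, also declared at the top level
  WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE
) with (
  'connector' = 'kafka',
  'topic' = 'hiido_pushsdk_event',
  'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103',
  'properties.group.id' = 'push_click_sql_version_consumer',
  'scan.startup.mode' = 'latest-offset',
  'format.type' = 'json'
);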

Re: SQL: type-conversion error when parsing a nested JSON array

claylin
So a nested data structure like this can't be parsed in SQL, then, with each array element turned into its own row of the table plus a watermark? Is the only option to flatten it into individual records outside Flink first and then process those with Flink?





Re: SQL: type-conversion error when parsing a nested JSON array

Benchao Li-2
So what you want is to first expand the array inside the JSON into multiple rows, and then generate the watermark based on that expanded data, right?




--

Best,
Benchao Li
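
For illustration, a sketch of the "expand the array into rows" part, assuming a source table that declares only the raw nested field (no computed column or watermark inside the ROW). Each array element then becomes one output row via CROSS JOIN UNNEST; deriving an event-time watermark from these expanded rows is the part that, per the reply below, is not yet supported. Column names and connector options follow the original post.

-- source table: nested field only, no computed column and no watermark
create table hiido_push_sdk_mq (
  datas ARRAY<ROW<`from` string, hdid string, event string, hiido_time bigint>>
) with (
  'connector' = 'kafka',
  'topic' = 'hiido_pushsdk_event',
  'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103',
  'properties.group.id' = 'push_click_sql_version_consumer',
  'scan.startup.mode' = 'latest-offset',
  'format.type' = 'json'
);

-- one output row per array element
SELECT d.`from`, d.hdid, d.event,
       CAST(FROM_UNIXTIME(d.hiido_time) AS TIMESTAMP(3)) AS ts
FROM hiido_push_sdk_mq
CROSS JOIN UNNEST(datas) AS d (`from`, hdid, event, hiido_time);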

Re: SQL: type-conversion error when parsing a nested JSON array

Benchao Li-2
If that's the case, then it indeed can't be done yet, but there is an issue [1] that is working on exactly this.

[1] https://issues.apache.org/jira/browse/FLINK-18590



--

Best,
Benchao Li

Re: SQL: type-conversion error when parsing a nested JSON array

claylin
OK, understood. Thanks a lot!



