claylin wrote:

Hi all, I have a nested JSON array, and parsing the DDL below fails on the computed column (the error points at `ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3))`). Is this syntax not allowed?
create table hiido_push_sdk_mq (
    datas ARRAY<ROW<
        `from` string,
        hdid string,
        event string,
        hiido_time bigint,
        ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),
        WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE>>
) with (
    'connector' = 'kafka',
    'topic' = 'hiido_pushsdk_event',
    'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103',
    'properties.group.id' = 'push_click_sql_version_consumer',
    'scan.startup.mode' = 'latest-offset',
    'format.type' = 'json');

The error:

[ERROR] 2020-07-17 20:17:50,640(562284338) --> [http-nio-8080-exec-10] com.yy.push.flink.sql.gateway.sql.parse.SqlCommandParser.parseBySqlParser(SqlCommandParser.java:77): parseBySqlParser, parse: com.yy.push.flink.sql.gateway.context.JobContext$1@5d5f32d1, stmt: (the CREATE TABLE statement above), error info: SQL parse failed. Encountered "AS" at line 1, column 115.
Was expecting one of:
    "ROW" ...
    <BRACKET_QUOTED_IDENTIFIER> ... <QUOTED_IDENTIFIER> ... <BACK_QUOTED_IDENTIFIER> ...
    <IDENTIFIER> ... <UNICODE_QUOTED_IDENTIFIER> ...
    "STRING" ... "BYTES" ... "ARRAY" ... "MULTISET" ... "RAW" ... "BOOLEAN" ...
    "INTEGER" ... "INT" ... "TINYINT" ... "SMALLINT" ... "BIGINT" ...
    "REAL" ... "DOUBLE" ... "FLOAT" ... "BINARY" ... "VARBINARY" ...
    "DECIMAL" ... "DEC" ... "NUMERIC" ... "ANY" ...
    "CHARACTER" ... "CHAR" ... "VARCHAR" ... "DATE" ... "TIME" ... "TIMESTAMP" ...
Benchao Li replied:

Computed columns can only be declared at the outermost level of the table schema; a nested type cannot contain a computed column. (The same applies to the WATERMARK clause.)
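For reference, a minimal sketch of the placement the parser does accept: the computed column and the watermark sit at the top level, next to the physical columns. The table name and the datagen connector here are only for illustration, and hoisting `hiido_time` to the top level is an assumption (in the real data it lives inside the array):

create table events_sketch (
    hiido_time BIGINT,
    -- computed column: legal only at the top level of the schema
    ts AS CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),
    -- watermark: likewise a top-level construct, defined on a top-level time attribute
    WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE
) with (
    'connector' = 'datagen'
);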
claylin replied:

So a nested structure like this can't be handled in SQL, then? I need each element of the array to become a row of the table, plus a watermark. Does that mean I have to flatten the data into individual records outside of Flink first, and only then process it with Flink?
Benchao Li replied:

You mean you want to first expand the array inside the JSON into multiple rows, and then generate the watermark based on the expanded data, right?
If so, that indeed isn't possible yet, but there is an issue [1] working on exactly this problem.
[1] https://issues.apache.org/jira/browse/FLINK-18590
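A sketch of a partial workaround for the flattening half: declare only the physical array column in the DDL and expand it with CROSS JOIN UNNEST in the query. What this cannot give you is a DDL-level watermark on the expanded rows, which, per the above, is exactly the missing piece (connector options copied from the original DDL):

create table hiido_push_sdk_mq (
    -- physical column only; no computed column or watermark inside the nested type
    datas ARRAY<ROW<`from` STRING, hdid STRING, event STRING, hiido_time BIGINT>>
) with (
    'connector' = 'kafka',
    'topic' = 'hiido_pushsdk_event',
    'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103',
    'properties.group.id' = 'push_click_sql_version_consumer',
    'scan.startup.mode' = 'latest-offset',
    'format.type' = 'json');

-- each array element becomes one output row
SELECT t.`from`, t.hdid, t.event,
       CAST(FROM_UNIXTIME(t.hiido_time) AS TIMESTAMP(3)) AS ts
FROM hiido_push_sdk_mq
CROSS JOIN UNNEST(datas) AS t (`from`, hdid, event, hiido_time);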
claylin replied:

Understood, thanks a lot!