For a project I need to insert data into postgres. After registering the database through a catalog, it seems an INSERT has to match the table definition exactly, and I could not find a way to insert only a subset of the columns.
I also hit an error when converting a timestamp to TIMESTAMP(6) WITH LOCAL TIME ZONE, which is how the timestamp column is defined on the postgres side:

select cast(localtimestamp as TIMESTAMP(6) WITH LOCAL TIME ZONE);

Is there a conversion that works, or a way to insert only part of the columns?
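To make this concrete, here is a rough sketch of what I am trying to do (the table, column, and source names below are placeholders rather than my real schema, and I assume the postgres catalog and database are already the current ones):

    -- postgres table behind the catalog, roughly:
    --   CREATE TABLE t_event (
    --     id          bigint,
    --     name        text,
    --     update_time timestamptz(6),  -- surfaces in Flink as TIMESTAMP(6) WITH LOCAL TIME ZONE
    --     remark      text
    --   );

    -- What I would like to run from Flink SQL, filling only some of the columns,
    -- but I have not found a partial column list syntax that works:
    INSERT INTO t_event (id, name, update_time)
    SELECT
      id,
      name,
      CAST(LOCALTIMESTAMP AS TIMESTAMP(6) WITH LOCAL TIME ZONE)
    FROM my_source;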
Error message:
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.api.TableException: Unsupported conversion from data type 'TIMESTAMP(6) WITH LOCAL TIME ZONE NOT NULL' (conversion class: java.time.Instant) to type information. Only data types that originated from type information fully support a reverse conversion.
    at org.apache.flink.table.types.utils.LegacyTypeInfoDataTypeConverter.toLegacyTypeInfo(LegacyTypeInfoDataTypeConverter.java:259)
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:546)
    at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
    at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
    at org.apache.flink.table.types.utils.LegacyTypeInfoDataTypeConverter.convertToRowTypeInfo(LegacyTypeInfoDataTypeConverter.java:329)
    at org.apache.flink.table.types.utils.LegacyTypeInfoDataTypeConverter.toLegacyTypeInfo(LegacyTypeInfoDataTypeConverter.java:237)
    at org.apache.flink.table.types.utils.TypeConversions.fromDataTypeToLegacyInfo(TypeConversions.java:49)
    at org.apache.flink.table.api.TableSchema.toRowType(TableSchema.java:271)
    at org.apache.flink.table.client.gateway.local.result.CollectStreamResult.<init>(CollectStreamResult.java:71)
    at org.apache.flink.table.client.gateway.local.result.MaterializedCollectStreamResult.<init>(MaterializedCollectStreamResult.java:101)
    at org.apache.flink.table.client.gateway.local.result.MaterializedCollectStreamResult.<init>(MaterializedCollectStreamResult.java:129)
    at org.apache.flink.table.client.gateway.local.ResultStore.createResult(ResultStore.java:83)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQueryInternal(LocalExecutor.java:608)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:465)
    at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:555)
    at org.apache.flink.table.client.cli.CliClient.callCommand(CliClient.java:311)
    at java.util.Optional.ifPresent(Optional.java:159)
    at org.apache.flink.table.client.cli.CliClient.open(CliClient.java:212)
    at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:142)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:114)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Looking at the error, you did not actually insert into postgres; you only ran a SELECT query, so its result is displayed in the SQL Client UI and nothing is written to postgres.
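For example, a minimal sketch of the INSERT direction (the table, column, and source names are made up for illustration; padding the columns you do not want to fill with NULL is just one possible workaround and assumes those columns are nullable on the postgres side):

    -- assuming the postgres catalog and database are the current ones,
    -- and the sink table has columns (id, name, update_time, remark):
    INSERT INTO t_event
    SELECT
      id,
      name,
      CAST(LOCALTIMESTAMP AS TIMESTAMP(6) WITH LOCAL TIME ZONE) AS update_time,
      CAST(NULL AS STRING) AS remark  -- column left unfilled
    FROM my_source;

Because the rows go to the JDBC sink rather than to the SQL Client's result view, the TIMESTAMP(6) WITH LOCAL TIME ZONE value does not go through the result conversion that failed in the stack trace above.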
Your query also contains a TIMESTAMP WITH LOCAL TIME ZONE, and the SQL Client does not yet support displaying query results of that type. You can follow this issue for that limitation: https://issues.apache.org/jira/browse/FLINK-17948

Best,
Jark
Thanks
Sent from my iPhone