sql-client fails to connect to Hive with TTransportException


sql-client 连接hive报错 TTransportException

hechuan
Hi, a question:
I am trying to connect to Hive from sql-client. Hive itself is working: connecting with beeline -u jdbc:hive2://x.x.x.x:10000 succeeds.


Contents of sql-client-defaults.yaml:
tables: []
functions: []
catalogs:
- name: myhive
  type: hive
  hive-conf-dir: /home/hive/flink-1.11.1/conf
  default-database: default
execution:
  planner: blink
  type: streaming
  time-characteristic: event-time
  periodic-watermarks-interval: 200
  result-mode: table
  max-table-result-rows: 1000000
  parallelism: 1
  max-parallelism: 128
  min-idle-state-retention: 0
  max-idle-state-retention: 0
  restart-strategy:
    type: fallback
deployment:
  response-timeout: 5000
  gateway-address: ""
  gateway-port: 0


Then starting sql-client fails:
$./bin/sql-client.sh embedded


The final error message:
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to determine whether database default exists or not
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
... 3 more
Caused by: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
... 15 more




Appendix: the complete error output:
Searching for '/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml
No session environment specified.
2020-10-27 09:48:14,533 INFO  org.apache.hadoop.hive.conf.HiveConf                         [] - Found configuration file file:/home/hive/flink-1.11.1/conf/hive-site.xml
2020-10-27 09:48:15,144 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Trying to connect to metastore with URI thrift://x.x.x.x:10000
2020-10-27 09:48:15,168 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Opened a connection to metastore, current connections: 1
2020-10-27 09:48:15,240 WARN  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:224) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_251]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_251]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_251]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_251]
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected to metastore.
2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-10-27 09:48:15,364 WARN  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] - MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s. getDatabase
org.apache.thrift.transport.TTransportException: null
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:16,365 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] - RetryingMetaStoreClient trying reconnect as hive (auth:SIMPLE)
2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Closed a connection to metastore, current connections: 0
2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Trying to connect to metastore with URI thrift://x.x.x.x:10000
2020-10-27 09:48:16,376 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Opened a connection to metastore, current connections: 1
2020-10-27 09:48:16,436 WARN  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:379) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient$1.run(RetryingMetaStoreClient.java:187) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_251]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_251]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:183) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:16,438 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected to metastore.




Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to determine whether database default exists or not
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
... 3 more
Caused by: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
... 15 more


Thanks!




Re: sql-client fails to connect to Hive with TTransportException

Rui Li
Hi, from the log you are connecting to port 10000, which should be the HiveServer2 (HS2) port. Flink's HiveCatalog needs to connect to the Hive Metastore (HMS) instead. Try starting an HMS and connecting to that.
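
For reference, a minimal sketch of that fix (assuming a standard Hive installation; the HMS Thrift port defaults to 9083, and `x.x.x.x` stands in for the real host):

```shell
# 1. In hive-site.xml (under the directory given as hive-conf-dir), point
#    hive.metastore.uris at the Hive Metastore service, not HiveServer2's
#    port 10000:
#
#    <property>
#      <name>hive.metastore.uris</name>
#      <value>thrift://x.x.x.x:9083</value>
#    </property>

# 2. Start a standalone metastore service if one is not already running
#    (it listens on port 9083 by default):
hive --service metastore &

# 3. Confirm the port is reachable, then retry sql-client:
nc -z x.x.x.x 9083 && ./bin/sql-client.sh embedded
```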

On Tue, Oct 27, 2020 at 9:57 AM RS <[hidden email]> wrote:

> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> 2020-10-27 09:48:15,247 INFO
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected
> to metastore.
> 2020-10-27 09:48:15,247 INFO
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
> RetryingMetaStoreClient proxy=class
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hive (auth:SIMPLE)
> retries=1 delay=1 lifetime=0
> 2020-10-27 09:48:15,364 WARN
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
> MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s.
> getDatabase
> org.apache.thrift.transport.TTransportException: null
> at
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[?:1.8.0_251]
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> ~[?:1.8.0_251]
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> ~[?:1.8.0_251]
> at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
> at
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> 2020-10-27 09:48:16,365 INFO
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
> RetryingMetaStoreClient trying reconnect as hive (auth:SIMPLE)
> 2020-10-27 09:48:16,375 INFO
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Closed a
> connection to metastore, current connections: 0
> 2020-10-27 09:48:16,375 INFO
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Trying to
> connect to metastore with URI thrift://x.x.x.x:10000
> 2020-10-27 09:48:16,376 INFO
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Opened a
> connection to metastore, current connections: 1
> 2020-10-27 09:48:16,436 WARN
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - set_ugi()
> not successful, Likely cause: new client talking to old server. Continuing
> without it.
> org.apache.thrift.transport.TTransportException: null
> at
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:379)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient$1.run(RetryingMetaStoreClient.java:187)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at java.security.AccessController.doPrivileged(Native Method)
> ~[?:1.8.0_251]
> at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_251]
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
> ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:183)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
> at
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> ~[flink-table_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> 2020-10-27 09:48:16,438 INFO
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected
> to metastore.
>
>
>
>
> Exception in thread "main"
> org.apache.flink.table.client.SqlClientException: Unexpected exception.
> This is a bug. Please consider filing an issue.
> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
> Caused by: org.apache.flink.table.client.gateway.SqlExecutionException:
> Could not create execution context.
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
> at
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException:
> Failed to determine whether database default exists or not
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
> at
> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> at
> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> at java.util.HashMap.forEach(HashMap.java:1289)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> at
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> ... 3 more
> Caused by: org.apache.thrift.transport.TTransportException
> at
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> at
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
> at
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
> at
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
> at
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
> ... 15 more
>
>
> Thanks!
>
>
>
>

--
Best regards!
Rui Li

Re:Re: sql-client connecting to Hive fails with TTransportException

hechuan
Hi,

Thanks — it does look like an HMS problem: I needed to configure a remote HMS, whereas previously everything ran in local (embedded) mode.
I went through the following steps:
1. Cleaned up the old database and data directories
2. Re-initialized the schema: schematool -dbType mysql -initSchema
3. Started hive --service metastore, which now listens on port 9083
4. Started hiveserver2, but it keeps retrying and never listens on port 10000
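A quick way to verify which of the two services is actually listening (steps 3 and 4 above) is a small bash probe. This is only a sketch: `check_port` is a hypothetical helper name, and the hosts/ports shown are taken from the log.

```shell
# Probe whether a TCP port accepts connections, to tell a running
# HMS (thrift, default 9083) apart from HiveServer2 (default 10000).
check_port() {
  local host=$1 port=$2
  # /dev/tcp is a bash feature; the connection attempt is bounded by `timeout`.
  if timeout 2 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

check_port 127.0.0.1 9083    # HMS
check_port 127.0.0.1 10000   # HiveServer2
```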


HiveServer2 then fails to start (Hive version 3.1.2). How can this be fixed?


2020-10-29T18:53:35,602  WARN [main] server.HiveServer2: Error starting HiveServer2 on attempt 1, will retry in 60000ms
java.lang.RuntimeException: Error initializing notification event poll
        at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:275) ~[hive-service-3.1.2.jar:3.1.2]
        at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1036) [hive-service-3.1.2.jar:3.1.2]
        at org.apache.hive.service.server.HiveServer2.access$1600(HiveServer2.java:140) [hive-service-3.1.2.jar:3.1.2]
        at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1305) [hive-service-3.1.2.jar:3.1.2]
        at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1149) [hive-service-3.1.2.jar:3.1.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323) [hadoop-common-3.3.0.jar:?]
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236) [hadoop-common-3.3.0.jar:?]
Caused by: java.io.IOException: org.apache.thrift.TApplicationException: Internal error processing get_current_notificationEventId
        at org.apache.hadoop.hive.metastore.messaging.EventUtils$MSClientNotificationFetcher.getCurrentNotificationEventId(EventUtils.java:75) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.metadata.events.NotificationEventPoll.<init>(NotificationEventPoll.java:103) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.metadata.events.NotificationEventPoll.initialize(NotificationEventPoll.java:59) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:273) ~[hive-service-3.1.2.jar:3.1.2]
        ... 10 more
Caused by: org.apache.thrift.TApplicationException: Internal error processing get_current_notificationEventId
        at org.apache.thrift.TApplicationException.read(TApplicationException.java:111) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_current_notificationEventId(ThriftHiveMetastore.java:5575) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_current_notificationEventId(ThriftHiveMetastore.java:5563) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getCurrentNotificationEventId(HiveMetaStoreClient.java:2723) ~[hive-exec-3.1.2.jar:3.1.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212) ~[h








On 2020-10-27 19:58:32, "Rui Li" <[hidden email]> wrote:

>Hi — from the log you are connecting to port 10000, which should be the HiveServer2 port. Flink's HiveCatalog needs to connect to the HMS instead; try starting an HMS and connecting to that.
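The suggestion above amounts to pointing the hive-site.xml that Flink reads (the hive-conf-dir from sql-client-defaults.yaml) at a standalone metastore rather than at HiveServer2. A minimal sketch, assuming the default HMS thrift port 9083 and a placeholder host:

```xml
<!-- In the hive-site.xml under flink's hive-conf-dir.
     Host is a placeholder; 9083 is the default HMS thrift port. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://x.x.x.x:9083</value>
</property>
```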
>> at
>> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> 2020-10-27 09:48:15,247 INFO
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected
>> to metastore.
>> 2020-10-27 09:48:15,247 INFO
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
>> RetryingMetaStoreClient proxy=class
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hive (auth:SIMPLE)
>> retries=1 delay=1 lifetime=0
>> 2020-10-27 09:48:15,364 WARN
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
>> MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s.
>> getDatabase
>> org.apache.thrift.transport.TTransportException: null
>> at
>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> ~[?:1.8.0_251]
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> ~[?:1.8.0_251]
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> ~[?:1.8.0_251]
>> at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
>> at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
>> at
>> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
>> ~[flink-table_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
>> ~[flink-table_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
>> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
>> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> 2020-10-27 09:48:16,365 INFO
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient     [] -
>> RetryingMetaStoreClient trying reconnect as hive (auth:SIMPLE)
>> 2020-10-27 09:48:16,375 INFO
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Closed a
>> connection to metastore, current connections: 0
>> 2020-10-27 09:48:16,375 INFO
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Trying to
>> connect to metastore with URI thrift://x.x.x.x:10000
>> 2020-10-27 09:48:16,376 INFO
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Opened a
>> connection to metastore, current connections: 1
>> 2020-10-27 09:48:16,436 WARN
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - set_ugi()
>> not successful, Likely cause: new client talking to old server. Continuing
>> without it.
>> org.apache.thrift.transport.TTransportException: null
>> at
>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:379)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient$1.run(RetryingMetaStoreClient.java:187)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at java.security.AccessController.doPrivileged(Native Method)
>> ~[?:1.8.0_251]
>> at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_251]
>> at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>> ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
>> at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:183)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
>> at
>> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
>> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
>> ~[flink-table_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
>> ~[flink-table_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
>> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
>> ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at
>> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
>> [flink-sql-client_2.12-1.11.1.jar:1.11.1]
>> 2020-10-27 09:48:16,438 INFO
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] - Connected
>> to metastore.
>>
>>
>>
>>
>> Exception in thread "main"
>> org.apache.flink.table.client.SqlClientException: Unexpected exception.
>> This is a bug. Please consider filing an issue.
>> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
>> Caused by: org.apache.flink.table.client.gateway.SqlExecutionException:
>> Could not create execution context.
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
>> at
>> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
>> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
>> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
>> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException:
>> Failed to determine whether database default exists or not
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
>> at
>> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
>> at
>> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
>> at java.util.HashMap.forEach(HashMap.java:1289)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
>> at
>> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
>> ... 3 more
>> Caused by: org.apache.thrift.transport.TTransportException
>> at
>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
>> at
>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
>> at
>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
>> at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
>> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
>> at
>> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
>> at
>> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
>> ... 15 more
>>
>>
>> Thanks!
>>
>>
>>
>>
>
>--
>Best regards!
>Rui Li

Re: Re: sql-client reports TTransportException when connecting to Hive

nobleyd
Hi, good to hear the problem is solved. I am not sure why you need to start hiveServer2 at all, though? It does not seem necessary; Flink should only need the HMS.
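For reference, pointing the Flink HiveCatalog at the metastore service rather than HiveServer2 comes down to the `hive.metastore.uris` entry in the hive-site.xml under the catalog's `hive-conf-dir`. A minimal sketch; `x.x.x.x` and 9083 (the default HMS thrift port) are placeholders for your environment:

```xml
<!-- hive-site.xml fragment: the host and port below are placeholders;
     9083 is the Hive metastore's default thrift port, NOT the 10000
     port that HiveServer2/beeline uses. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://x.x.x.x:9083</value>
  </property>
</configuration>
```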

RS <[hidden email]> wrote on Friday, Oct 30, 2020 at 9:57 AM:

> Hi,
>
> Thanks, it was indeed an HMS problem: I needed to configure a remote HMS, whereas previously everything ran in local (embedded) mode.
> I went through the following steps:
> 1. Cleaned up the old database and data directories
> 2. Re-initialized the schema: schematool -dbType mysql -initSchema
> 3. Started hive --service metastore, which now successfully listens on port 9083
> 4. Started hiveserver2, but it keeps retrying and never listens on port 10000
>
>
> So hiveserver2 fails to start (Hive version 3.1.2). How can I resolve this?
>
>
> 2020-10-29T18:53:35,602  WARN [main] server.HiveServer2: Error starting HiveServer2 on attempt 1, will retry in 60000ms
> java.lang.RuntimeException: Error initializing notification event poll
>         at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:275) ~[hive-service-3.1.2.jar:3.1.2]
>         at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1036) [hive-service-3.1.2.jar:3.1.2]
>         at org.apache.hive.service.server.HiveServer2.access$1600(HiveServer2.java:140) [hive-service-3.1.2.jar:3.1.2]
>         at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1305) [hive-service-3.1.2.jar:3.1.2]
>         at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1149) [hive-service-3.1.2.jar:3.1.2]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
>         at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:323) [hadoop-common-3.3.0.jar:?]
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:236) [hadoop-common-3.3.0.jar:?]
> Caused by: java.io.IOException: org.apache.thrift.TApplicationException: Internal error processing get_current_notificationEventId
>         at org.apache.hadoop.hive.metastore.messaging.EventUtils$MSClientNotificationFetcher.getCurrentNotificationEventId(EventUtils.java:75) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hadoop.hive.ql.metadata.events.NotificationEventPoll.<init>(NotificationEventPoll.java:103) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hadoop.hive.ql.metadata.events.NotificationEventPoll.initialize(NotificationEventPoll.java:59) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:273) ~[hive-service-3.1.2.jar:3.1.2]
>         ... 10 more
> Caused by: org.apache.thrift.TApplicationException: Internal error processing get_current_notificationEventId
>         at org.apache.thrift.TApplicationException.read(TApplicationException.java:111) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_current_notificationEventId(ThriftHiveMetastore.java:5575) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_current_notificationEventId(ThriftHiveMetastore.java:5563) ~[hive-exec-3.1.2.jar:3.1.2]
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getCurrentNotificationEventId(HiveMetaStoreClient.java:2723) ~[hive-exec-3.1.2.jar:3.1.2]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
>         at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212) ~[h
>
>
>
>
>
>
>
>
> On 2020-10-27 19:58:32, "Rui Li" <[hidden email]> wrote:
> >Hi, from the log you are connecting to port 10000, which should be the HS2 port. Flink's HiveCatalog needs to connect to the HMS; you can start an HMS and try again.
> >
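A quick way to check Rui Li's point above (10000 is HiveServer2's port, while the metastore thrift service usually listens on 9083) is to probe both ports from the client host before editing the catalog config. A minimal sketch, assuming Python is available; the host and ports in the comments are placeholders for your environment:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection raises OSError on refusal or timeout.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholders: replace x.x.x.x with your metastore host.
# port_open("x.x.x.x", 9083)   # HMS thrift port -- what HiveCatalog needs
# port_open("x.x.x.x", 10000)  # HiveServer2 port -- what beeline uses
```

This only tells you a listener is up, not which service it is, but it quickly distinguishes "HMS not started" from "wrong port configured".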
> >On Tue, Oct 27, 2020 at 9:57 AM RS <[hidden email]> wrote:
> >
> >> Hi, 请教下
> >> 我尝试使用sql-client连接hive,  hive正常, 使用beeline -u jdbc:hive2://x.x.x.x:10000
> >> 可以正常连接
> >>
> >>
> >> sql-client-defaults.yaml配置内容:
> >> tables: []
> >> functions: []
> >> catalogs:
> >> - name: myhive
> >>   type: hive
> >>   hive-conf-dir: /home/hive/flink-1.11.1/conf
> >>   default-database: default
> >> execution:
> >>   planner: blink
> >>   type: streaming
> >>   time-characteristic: event-time
> >>   periodic-watermarks-interval: 200
> >>   result-mode: table
> >>   max-table-result-rows: 1000000
> >>   parallelism: 1
> >>   max-parallelism: 128
> >>   min-idle-state-retention: 0
> >>   max-idle-state-retention: 0
> >>   restart-strategy:
> >>     type: fallback
> >> deployment:
> >>   response-timeout: 5000
> >>   gateway-address: ""
> >>   gateway-port: 0
> >>
> >>
> >> 然后启动sql-client报错
> >> $./bin/sql-client.sh embedded
> >>
> >>
> >> 最后的报错信息:
> >> Exception in thread "main"
> >> org.apache.flink.table.client.SqlClientException: Unexpected exception.
> >> This is a bug. Please consider filing an issue.
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
> >> Caused by: org.apache.flink.table.client.gateway.SqlExecutionException:
> >> Could not create execution context.
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> >> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> >> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException:
> >> Failed to determine whether database default exists or not
> >> at
> >>
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
> >> at
> >>
> org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
> >> at
> >>
> org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> >> at
> >>
> org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> >> at java.util.HashMap.forEach(HashMap.java:1289)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> >> at
> >>
> org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> >> ... 3 more
> >> Caused by: org.apache.thrift.transport.TTransportException
> >> at
> >>
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >> at
> >>
> org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> >> at
> >>
> org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> >> at
> >>
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> >> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >> at
> >>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
> >> at
> >>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
> >> at
> >>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
> >> at
> >>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
> >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> at java.lang.reflect.Method.invoke(Method.java:498)
> >> at
> >>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
> >> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
> >> at
> >>
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
> >> at
> >>
> org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
> >> ... 15 more
> >>
> >>
> >>
> >>
> >> 附录完整错误信息:
> >> Searching for
> >> '/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml'...found.
> >> Reading default environment from:
> >> file:/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml
> >> No session environment specified.
> >> 2020-10-27 09:48:14,533 INFO  org.apache.hadoop.hive.conf.HiveConf
> >>                  [] - Found configuration file
> >> file:/home/hive/flink-1.11.1/conf/hive-site.xml
> >> 2020-10-27 09:48:15,144 INFO
> >> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] -
> Trying to
> >> connect to metastore with URI thrift://x.x.x.x:10000
> >> 2020-10-27 09:48:15,168 INFO
> >> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] -
> Opened a
> >> connection to metastore, current connections: 1
> >> 2020-10-27 09:48:15,240 WARN
> >> org.apache.hadoop.hive.metastore.HiveMetaStoreClient         [] -
> set_ugi()
> >> not successful, Likely cause: new client talking to old server.
> Continuing
> >> without it.
> >> org.apache.thrift.transport.TTransportException: null
> >> at
> >>
> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:224)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >> ~[?:1.8.0_251]
> >> at
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >> ~[?:1.8.0_251]
> >> at
> >>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >> ~[?:1.8.0_251]
> >> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >> ~[?:1.8.0_251]
> >> at
> >>
> org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
> >> ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at
> >>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
> >> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
> >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
> >> at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> >> at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> 2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
> >> 2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=1 delay=1 lifetime=0
> >> 2020-10-27 09:48:15,364 WARN  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s. getDatabase
> >> org.apache.thrift.transport.TTransportException: null
> >> at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
> >> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
> >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
> >> at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> >> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> 2020-10-27 09:48:16,365 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient trying reconnect as hive (auth:SIMPLE)
> >> 2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Closed a connection to metastore, current connections: 0
> >> 2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Trying to connect to metastore with URI thrift://x.x.x.x:10000
> >> 2020-10-27 09:48:16,376 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Opened a connection to metastore, current connections: 1
> >> 2020-10-27 09:48:16,436 WARN  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> >> org.apache.thrift.transport.TTransportException: null
> >> at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:379) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient$1.run(RetryingMetaStoreClient.java:187) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_251]
> >> at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_251]
> >> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
> >> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:183) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
> >> 2020-10-27 09:48:16,438 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
> >>
> >> Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
> >> Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
> >> at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
> >> at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
> >> at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> >> Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to determine whether database default exists or not
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
> >> at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
> >> at java.util.HashMap.forEach(HashMap.java:1289)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
> >> at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
> >> ... 3 more
> >> Caused by: org.apache.thrift.transport.TTransportException
> >> at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >> at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >> at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> >> at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> >> at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> >> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
> >> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
> >> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
> >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> at java.lang.reflect.Method.invoke(Method.java:498)
> >> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
> >> at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
> >> at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
> >> at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
> >> ... 15 more
> >>
> >> Thanks!
> >>
> >
> >--
> >Best regards!
> >Rui Li
>
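One detail worth noting from the log above: the client reports "Trying to connect to metastore with URI thrift://x.x.x.x:10000". Port 10000 is the default HiveServer2 port (the one beeline connects to), not the Hive metastore service port, which defaults to 9083. That would also explain the "set_ugi() not successful, Likely cause: new client talking to old server" warning and the TTransportException: the Flink HiveCatalog speaks the metastore thrift protocol directly, not the HiveServer2/JDBC protocol. If this is the cause, the hive-site.xml under `hive-conf-dir` likely contains the first setting below; a sketch of the change (host unchanged, 9083 assumed to be the metastore port in this setup):

```xml
<!-- hive-site.xml under the directory configured as hive-conf-dir -->
<!-- Suspected misconfiguration: port 10000 is HiveServer2, not the metastore.
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://x.x.x.x:10000</value>
</property>
-->
<!-- Point Flink's HiveCatalog at the metastore service itself
     (9083 is the default metastore thrift port): -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://x.x.x.x:9083</value>
</property>
```

A successful beeline connection on port 10000 only verifies HiveServer2; it would also be worth confirming that a standalone metastore service is actually running (it can be started with `hive --service metastore`).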