Flink 1.12.0 sql-client error when connecting to Hive

Flink 1.12.0 sql-client error when connecting to Hive

yujianbo
I. Environment
1. Flink 1.12.0
2. Hive 2.1.1
3. Using the jars built from the release-1.12 branch, and connecting to the Hadoop cluster with export HADOOP_CLASSPATH=`hadoop classpath`.
4. Flink's lib directory contains the following jars (do I need to add any other jars?):
    flink-csv-1.12.jar
    flink-dist_2.11-1.12.jar
    flink-json-1.12.jar
    flink-shaded-zookeeper-3.4.14.jar
    flink-table_2.11-1.12.jar
    flink-table-blink_2.11-1.12.jar
    log4j-1.2-api-2.12.1.jar
    log4j-api-2.12.1.jar
    log4j-core-2.12.1.jar
    log4j-slf4j-impl-2.12.1.jar
 
5. The only change in Flink's conf/sql-client-defaults.yaml:
    catalogs: #[] # empty list
       - name: myhive
         type: hive
         hive-conf-dir: /etc/hive/conf


II. Startup:
     export HADOOP_CLASSPATH=`hadoop classpath`
     /tmp/flink-1.12.0/bin/sql-client.sh embedded


III. Error:
[yujianbo@qzcs86 conf]$ /tmp/flink-1.12.0/bin/sql-client.sh embedded
Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/flink-1.12.0/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
No default environment specified.
Searching for '/tmp/flink-1.12.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/tmp/flink-1.12.0/conf/sql-client-defaults.yaml
No session environment specified.


Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:208)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878)
        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226)
        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.CatalogFactory' in the classpath.

Reason: Required context properties mismatch.

The following properties are requested:
hive-conf-dir=/etc/hive/conf
type=hive

The following factories have been considered:
org.apache.flink.table.catalog.GenericInMemoryCatalogFactory
        at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
        at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
        at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
        at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:113)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:383)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634)
        at java.util.HashMap.forEach(HashMap.java:1280)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867)
        ... 3 more

Re: Flink 1.12.0 sql-client error when connecting to Hive

fanrui
Hi yujianbo,

These two articles should solve your problem; they describe which jars are needed:

Quick start for integrating Flink with Hive, using Flink 1.12 as an example:

https://mp.weixin.qq.com/s/99ehmNzJVwW3cOrw_UkGsg

Hive Catalog and Hive Dialect for the Flink-Hive integration, using Flink 1.12 as an example:

https://mp.weixin.qq.com/s/YuR-s5zCtBz_5ku_bttbaw

The corresponding official documentation:
https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/
https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/hive_catalog.html
https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/hive_dialect.html
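
In short: the line "The following factories have been considered: org.apache.flink.table.catalog.GenericInMemoryCatalogFactory" in your stack trace means no Hive catalog factory is on the SQL client's classpath. A minimal sketch of the usual fix, assuming Scala 2.11 and the flink-sql-connector-hive-2.2.0 bundle (which, per the docs above, covers Hive 2.0.0 through 2.2.0 and so should match Hive 2.1.1; verify the exact jar against those docs):

     # Sketch only -- the jar coordinates are an assumption; check them
     # against the documentation linked above.
     cd /tmp/flink-1.12.0/lib
     wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-2.2.0_2.11/1.12.0/flink-sql-connector-hive-2.2.0_2.11-1.12.0.jar

     # Restart the SQL client and check that the catalog loads:
     export HADOOP_CLASSPATH=`hadoop classpath`
     /tmp/flink-1.12.0/bin/sql-client.sh embedded
     # In the client:  SHOW CATALOGS;       -- 'myhive' should now be listed
     #                 USE CATALOG myhive;

Once the bundle is in lib, the Hive catalog factory becomes discoverable via Java SPI and the catalogs entry in sql-client-defaults.yaml can be resolved.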


Best,
fanrui



Re: Flink 1.12.0 sql-client error when connecting to Hive

yujianbo
1. The SQL client can now submit to the YARN session, but it fails immediately with:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

It cannot find the Hadoop dependencies, even though I have already added export HADOOP_CLASSPATH=`hadoop classpath` to /etc/profile on all three test machines.

2. My per-job tasks work fine, and so does submitting a job to this session with flink run -yid.

3. A friend gave me the shaded jar flink-shaded-hadoop-2-uber-2.7.5-8.0.jar; putting it in the lib directory fixed it.

4. The strange part: without that shaded jar, why isn't setting the global variable via export HADOOP_CLASSPATH=`hadoop classpath` enough for the session to work??? (See the sketch below.)
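
One plausible explanation, offered as an assumption rather than a verified diagnosis of this cluster: /etc/profile is only sourced by login shells, so HADOOP_CLASSPATH must be exported in the environment of the very shell that launches the YARN session and the SQL client, otherwise neither process inherits it. A minimal sketch, using the paths from this thread:

     # Sketch: export HADOOP_CLASSPATH in the *same* shell that launches
     # the processes; a non-login shell will not have read /etc/profile.
     export HADOOP_CLASSPATH=`hadoop classpath`
     echo $HADOOP_CLASSPATH        # sanity check: should print the Hadoop jars

     # Start the YARN session from this shell so it inherits the variable:
     /tmp/flink-1.12.0/bin/yarn-session.sh -d

     # Launch the SQL client from the same (or an equally prepared) shell:
     /tmp/flink-1.12.0/bin/sql-client.sh embedded

The flink-shaded-hadoop-2-uber jar from point 3 sidesteps this entirely by putting the Hadoop classes directly on Flink's classpath, which would explain why adding it to lib also works.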


