<http://apache-flink.147419.n8.nabble.com/file/t1162/taskmanager.png>
Hello everyone! As the screenshot above shows, after upgrading Flink I can no longer see the TaskManager logs. In Stdout I can still see the log statements from my own code, but Flink's own logs and the Spring Boot related logs are no longer visible. What could be the cause? Does the logging system need to be reconfigured after an upgrade?

Thanks!
Jacob

--
Sent from: http://apache-flink.147419.n8.nabble.com/
Hi Jacob
1. Could you share the configuration you are using?
2. Check the jobmanager.err log to see whether the logging binding is correct.
Thanks for the reply!
1. I found the following logging binding in jobmanager.err, and there is a conflict:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/hadoop/dn/sdc/yarn/nm/usercache/***/appcache/application_1603495749855_62368/filecache/24/test_job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop/dn/sdc/yarn/nm/usercache/***/appcache/application_1603495749855_62368/filecache/32/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p0.2/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]

Does this multiple-binding situation cause problems?

2. This version uses the following configuration:

env.java.home: /usr/java/jdk1.8.0_162
yarn.taskmanager.env.JAVA_HOME: /usr/java/jdk1.8.0_162
containerized.master.env.JAVA_HOME: /usr/java/jdk1.8.0_162
containerized.taskmanager.env.JAVA_HOME: /usr/java/jdk1.8.0_162

jobmanager.rpc.address: localhost
jobmanager.rpc.port: 6123
jobmanager.memory.process.size: 3072m
taskmanager.memory.process.size: 3072m
taskmanager.numberOfTaskSlots: 4

yarn.application-attempts: 10
state.backend: filesystem
state.checkpoints.dir: hdfs://nameservice1/prd/website/flink_checkpoint
state.savepoints.dir: hdfs://nameservice1/prd/website/flink_checkpoint
state.backend.incremental: false
state.backend.fs.memory-threshold: 1024
state.checkpoints.num-retained: 3

restart-strategy: fixed-delay
restart-strategy.fixed-delay.attempts: 1000
restart-strategy.fixed-delay.delay: 30 s

jobmanager.execution.failover-strategy: region
classloader.resolve-order: parent-first

3. Job deployment mode: on YARN

4. Hadoop version: 2.6

Thanks!
Jacob
It looks like SLF4J ended up bound to the logback implementation. Is your job actually configured to use logback? If not, you need to exclude the logback dependency.
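When logback-classic arrives transitively rather than as a direct dependency, it cannot simply be deleted from the pom; the usual alternative is a Maven `<exclusions>` entry on the dependency that pulls it in. A minimal sketch, where `some.vendor:some-library` is a hypothetical placeholder for whichever dependency drags logback onto the classpath:

```xml
<dependency>
  <groupId>some.vendor</groupId>
  <artifactId>some-library</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <!-- Keep logback's StaticLoggerBinder off the classpath so the
         log4j-slf4j-impl binding that ships with Flink wins instead -->
    <exclusion>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```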
Yes, I found a logback.xml configuration in the project, and the pom also has a logback-classic dependency:
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
</dependency>

After removing this dependency, I can see the logs in the UI. Thank you!

Thanks!
Jacob
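For context on why the logs reappear: once the log4j-slf4j-impl binding wins, logging flows through the Log4j 2 properties file that Flink ships in its conf/ directory, whose file appender writes to the per-container log file that the web UI reads back. The sketch below approximates that default configuration (the exact contents vary by Flink version; this is not the verbatim file):

```properties
# Route everything through one file appender at INFO level
rootLogger.level = INFO
rootLogger.appenderRef.file.ref = MainAppender

# Flink passes -Dlog.file=.../taskmanager.log to each container;
# the web UI displays this same file
appender.main.name = MainAppender
appender.main.type = File
appender.main.append = false
appender.main.fileName = ${sys:log.file}
appender.main.layout.type = PatternLayout
appender.main.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
```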