Hi guys,
I run an ingestion job with Druid + Hadoop 2.7.1 (a custom version, declared in the common config via druid.extensions.hadoopDependenciesDir=/druid/hadoop-dependencies).
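For reference, the relevant part of the config looks roughly like this (the dependency dir line is exactly what I have; the second line is a sketch of the standard Druid property for pinning the Hadoop version, included as an assumption about my exact setup):

```properties
# common.runtime.properties -- relevant lines
# (second property is a sketch / assumption, not copied verbatim from my file)
druid.extensions.hadoopDependenciesDir=/druid/hadoop-dependencies
druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.7.1"]
```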
The job finishes successfully, but at the very end I get the following exception:
…
…
2017-01-13T09:44:48,064 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
"id" : "index_hadoop_impression_segments5_2017-01-13T09:42:05.410Z",
"status" : "SUCCESS",
"duration" : 159082
}
…
…
2017-01-13 09:44:48,136 Thread-2 ERROR Unable to register shutdown hook because JVM is shutting down. java.lang.IllegalStateException: Not started
at io.druid.common.config.Log4jShutdown.addShutdownCallback(Log4jShutdown.java:45)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.addShutdownCallback(Log4jContextFactory.java:273)
at org.apache.logging.log4j.core.LoggerContext.setUpShutdownHook(LoggerContext.java:256)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:216)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:284)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:273)
at org.apache.hadoop.hdfs.LeaseRenewer.<clinit>(LeaseRenewer.java:72)
at org.apache.hadoop.hdfs.DFSClient.getLeaseRenewer(DFSClient.java:699)
at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:859)
at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:853)
at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2407)
at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2424)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
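My reading of the trace: Hadoop's shutdown hook (FileSystem$Cache$ClientFinalizer) closes the DFS client after Druid's Log4jShutdown lifecycle has already stopped; the class initialization of LeaseRenewer then creates a logger, which starts a new Log4j context and tries to register another shutdown callback, and that registration is refused. Here is a minimal sketch of the same kind of race using only the JDK, outside Druid/Hadoop (class name is mine; the JDK throws "Shutdown in progress" where Druid's Log4jShutdown throws "Not started"):

```java
// Sketch: registering a shutdown hook from inside a running shutdown hook.
// The JDK refuses with IllegalStateException, analogous to what the
// stack trace above shows for Log4jShutdown.addShutdownCallback.
public class ShutdownHookRace {
    public static void main(String[] args) {
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                // The JVM is already shutting down here, so this call throws.
                Runtime.getRuntime().addShutdownHook(new Thread(() -> { }));
            } catch (IllegalStateException e) {
                System.err.println("Too late to register a hook: " + e.getMessage());
            }
        }, "first-hook"));
        System.out.println("main exiting; shutdown hooks run next");
    }
}
```

If that's the same mechanism here, the ERROR would be cosmetic, since the task itself reports SUCCESS.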
Does anybody have an idea what the reason could be, and whether this error is safe to ignore?