Druid stream ingestion task doesn't print any logs

I am using the MiddleManager to ingest Kafka data. The task completes successfully, but it doesn't print any logs, only this:
Thread-2 ERROR Unable to register shutdown hook because JVM is shutting down. java.lang.IllegalStateException: Not started
    at io.druid.common.config.Log4jShutdown.addShutdownCallback(Log4jShutdown.java:45)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.addShutdownCallback(Log4jContextFactory.java:273)
    at org.apache.logging.log4j.core.LoggerContext.setUpShutdownHook(LoggerContext.java:256)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:216)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
    at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:253)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:273)
    at org.apache.hadoop.hdfs.LeaseRenewer.<clinit>(LeaseRenewer.java:72)
    at org.apache.hadoop.hdfs.DFSClient.getLeaseRenewer(DFSClient.java:699)
    at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:859)
    at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:853)
    at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2407)
    at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2424)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

I want to get more ingestion information from the logs.

Hi,
This is an error that occurs semi-frequently when Druid servers shut down. It prevents a few additional shutdown logs from printing, but it is mostly cosmetic and shouldn't lose much real information. It is described in this issue, https://github.com/apache/incubator-druid/issues/5568, and should be fixed in the upcoming 0.14 release by this patch: https://github.com/apache/incubator-druid/pull/6864.

The complete task log (ending with that error) should be available in task log deep storage after the task completes, and additional details of the ingestion task can be found via the ingestion reporting feature, if it is enabled on the task, described here: http://druid.io/docs/latest/ingestion/reports.html.

It's also possible that log4j is not configured correctly and is only printing error-level messages by default.
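
As a sanity check, a minimal log4j2.xml with the root logger set to info level would look something like this (the appender name, pattern, and target are illustrative, not from your setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Plain console appender; Druid task logs are captured from stdout -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- If this level is "error", info/debug ingestion messages are dropped -->
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```

If the `Root` level here is set to `error`, that alone would explain seeing nothing but the shutdown-hook error above.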

Thanks, I have found the reason. I modified the log4j2 file to use a RollingFileAppender.

But now I still don't know how to configure it. I want to use a RollingFileAppender to deal with large log files.
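
For large logs, a size- and time-based RollingFile appender in log4j2.xml is a common approach. A sketch, assuming illustrative file paths, a 100 MB size trigger, and keeping up to 10 compressed archives (all of these values are placeholders to adjust for your environment):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Rolls the log on each new day or once the file exceeds 100 MB,
         compressing rolled files to .gz and keeping at most 10 of them -->
    <RollingFile name="FileAppender"
                 fileName="log/druid-task.log"
                 filePattern="log/druid-task-%d{yyyy-MM-dd}-%i.log.gz">
      <PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/>
      <Policies>
        <TimeBasedTriggeringPolicy/>
        <SizeBasedTriggeringPolicy size="100 MB"/>
      </Policies>
      <DefaultRolloverStrategy max="10"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="FileAppender"/>
    </Root>
  </Loggers>
</Configuration>
```

Note that Druid normally collects task logs from the task's stdout into task log deep storage, so if you redirect everything to a file like this, check that your log collection still picks up what you need.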