Apache Kafka Avro ingestion reports success, but there is no data in any segment and no datasource is visible.

What should I do?

2019-06-23T14:58:31,049 INFO [task-runner-0-priority-0] org.apache.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_kafka_tdc_d314c8b3c49e91a_bebmfiod",
  "status" : "SUCCESS",
  "duration" : 901225,
  "errorMsg" : null
}

2019-06-23T14:58:30,959 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Ingestion loop resumed
2019-06-23T14:58:30,959 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Finished reading partition[0].
2019-06-23T14:58:30,961 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - All partitions have been fully read
2019-06-23T14:58:30,961 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Persisting all pending data
2019-06-23T14:58:30,962 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.StreamAppenderatorDriver - Persisting data.
2019-06-23T14:58:30,969 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Submitting persist runnable for dataSource[tdc]
2019-06-23T14:58:30,972 INFO [tdc-incremental-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Committing metadata[AppenderatorDriverMetadata{segments={}, lastSegmentIds={}, callerMetadata={nextPartitions=SeekableStreamEndSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}}}}] for sinks[].
2019-06-23T14:58:30,985 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.StreamAppenderatorDriver - Persisted pending data in 23ms.
2019-06-23T14:58:30,986 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Publishing segments for sequence [SequenceMetadata{sequenceId=0, sequenceName='index_kafka_tdc_d314c8b3c49e91a_0', assignments=[], startOffsets={0=42669}, exclusiveStartPartitions=[], endOffsets={0=43589}, sentinel=false, checkpointed=true}]
2019-06-23T14:58:30,995 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Pushing segments in background: []
2019-06-23T14:58:30,995 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Submitting persist runnable for dataSource[tdc]
2019-06-23T14:58:30,996 INFO [tdc-incremental-persist] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Committing metadata[AppenderatorDriverMetadata{segments={}, lastSegmentIds={}, callerMetadata={nextPartitions=SeekableStreamStartSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}, exclusivePartitions=[]}, publishPartitions=SeekableStreamEndSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}}}}] for sinks[].
2019-06-23T14:58:31,007 INFO [publish-0] org.apache.druid.segment.realtime.appenderator.BaseAppenderatorDriver - Nothing to publish, skipping publish step.
2019-06-23T14:58:31,010 INFO [publish-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Published segments [] with metadata[AppenderatorDriverMetadata{segments={}, lastSegmentIds={}, callerMetadata={nextPartitions=SeekableStreamStartSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}, exclusivePartitions=[]}, publishPartitions=SeekableStreamEndSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}}}}].
2019-06-23T14:58:31,010 INFO [publish-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Persisting Sequences Metadata [[]]
2019-06-23T14:58:31,013 INFO [task-runner-0-priority-0] org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner - Handoff completed for segments [] with metadata[{nextPartitions=SeekableStreamStartSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}, exclusivePartitions=[]}, publishPartitions=SeekableStreamEndSequenceNumbers{stream='tdc', partitionSequenceNumberMap={0=43589}}}].
2019-06-23T14:58:31,013 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.appenderator.AppenderatorImpl - Shutting down...
2019-06-23T14:58:31,018 INFO [task-runner-0-priority-0] org.apache.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Unregistering chat handler[index_kafka_tdc_d314c8b3c49e91a_bebmfiod]
2019-06-23T14:58:31,018 INFO [task-runner-0-priority-0] org.apache.druid.curator.discovery.CuratorDruidNodeAnnouncer - Unannouncing [DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/overlord', host='res-006.dnx.com', bindOnHost=false, port=-1, plaintextPort=8100, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='PEON', services={dataNodeService=DataNodeService{tier='_default_tier', maxSize=0, type=indexer-executor, priority=0}, lookupNodeService=LookupNodeService{lookupTier='__default'}}}].
2019-06-23T14:58:31,018 INFO [task-runner-0-priority-0] org.apache.druid.curator.announcement.Announcer - unannouncing [/dnx/druid/internal-discovery/PEON/res-006.dnx.com:8100]
2019-06-23T14:58:31,037 INFO [task-runner-0-priority-0] org.apache.druid.curator.discovery.CuratorDruidNodeAnnouncer - Unannounced [DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/overlord', host='res-006.dnx.com', bindOnHost=false, port=-1, plaintextPort=8100, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='PEON', services={dataNodeService=DataNodeService{tier='_default_tier', maxSize=0, type=indexer-executor, priority=0}, lookupNodeService=LookupNodeService{lookupTier='__default'}}}].
2019-06-23T14:58:31,037 INFO [task-runner-0-priority-0] org.apache.druid.server.coordination.CuratorDataSegmentServerAnnouncer - Unannouncing self[DruidServerMetadata{name='res-006.dnx.com:8100', hostAndPort='res-006.dnx.com:8100', hostAndTlsPort='null', maxSize=0, tier='_default_tier', type=indexer-executor, priority=0}] at [/dnx/druid/announcements/res-006.dnx.com:8100]
2019-06-23T14:58:31,037 INFO [task-runner-0-priority-0] org.apache.druid.curator.announcement.Announcer - unannouncing [/dnx/druid/announcements/res-006.dnx.com:8100]
2019-06-23T14:58:31,046 INFO [task-runner-0-priority-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_kafka_tdc_d314c8b3c49e91a_bebmfiod] status changed to [SUCCESS]

Hi Strinix,

It looks like the indexing task was able to read from Kafka (offsets 42669 through 43589 on partition 0) but did not produce any segments. This is typically caused by rows failing to parse, most often a timestamp parsing error. You can log the rows that fail to parse by adding "logParseExceptions": true to the tuningConfig of your supervisor spec; the task log will then show each rejected row and the reason, which should tell you what needs to change in your spec.
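As a sketch, the flag goes in the tuningConfig section of the Kafka supervisor spec. The related maxParseExceptions and maxSavedParseExceptions values shown here are illustrative, not required; the defaults are fine for a quick diagnosis:

```json
{
  "type": "kafka",
  "dataSchema": { "...": "your existing dataSchema" },
  "ioConfig": { "...": "your existing ioConfig" },
  "tuningConfig": {
    "type": "kafka",
    "logParseExceptions": true,
    "maxParseExceptions": 100,
    "maxSavedParseExceptions": 10
  }
}
```

After resubmitting the supervisor, check the task log for "Encountered parse exception" entries to see which field (usually the timestamp column) is failing.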

If you still need assistance after enabling parse exception logging, please post your supervisor spec and a sample event so that we can check that your parse spec matches your data.