I enabled the HTTP metrics emitter on all Druid nodes and have a REST endpoint that collects these metrics and pushes them into Druid using the Tranquility API.
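For reference, the emitter settings I used look roughly like this (the recipient URL is a placeholder for my collector endpoint, not the actual value):

```properties
# Emit metrics over HTTP to an external collector
druid.emitter=http
druid.emitter.http.recipientBaseUrl=http://metrics-collector.example.com/druid
```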
I can see that tasks are successfully submitted to the Overlord, the middle manager and peons are working on them, and the tasks complete with SUCCESS status.
However, I notice a few strange things:
1 - I use S3 as deep storage. As described in the production-cluster document, I set the following on the Overlord node:

# Upload all task logs to deep storage
druid.indexer.logs.type=s3
druid.indexer.logs.s3Bucket=druid
druid.indexer.logs.s3Prefix=prod/logs/v1
I can see the task log object being created inside the druid bucket, but its content is empty (0 bytes), both before and after the task completes.
2 - The segment datasource and the segments themselves are never created in S3 after the task succeeds.
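In case it matters, my understanding is that segment pushes to S3 are controlled by a separate group of properties from the task-log ones above, something along these lines (the bucket/prefix values here are placeholders mirroring my log config, and the credential values are redacted):

```properties
# Push segments to S3 deep storage (requires the druid-s3-extensions extension)
druid.storage.type=s3
druid.storage.bucket=druid
druid.storage.baseKey=prod/segments/v1
druid.s3.accessKey=<access-key>
druid.s3.secretKey=<secret-key>
```

I'd also appreciate confirmation on which nodes these need to be set (common runtime.properties vs. just the middle managers/peons).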
Note: I see no errors in the Overlord, middle-manager, or peon logs. Of course, I had to juggle the peon memory settings to get them right.
What am I missing? Any advice?