Data is automatically deleted

Hi All,

We have Druid installed on a single machine with the default configurations, and we are trying to load data into Druid from Kafka. After loading, I can see the datasource, but it shows a size of 0 MB with 1 row. At the same time, I am able to query the data in SQL. After some time, however, the data is deleted automatically and it disappears from the datasource/segment view and from SQL. Could you please advise what could have gone wrong? The supervisor spec follows:

{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "cei_rt",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": {
          "column": "ensDate",
          "format": "YYYY-MM-DD"
        },
        "dimensionsSpec": {
          "dimensions": [
            "hashedD",
            "channel",
            "page",
            "userID",
            "employeeID",
            "employeename",
            "linkType",
            "locationID",
            "branchID",
            "menuID",
            "menuName",
            "sectionID",
            "sectionName",
            "siteSection",
            "midValue",
            "ensTime"
          ],
          "dimensionExclusions": []
        }
      }
    },
    "metricsSpec": [],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "fifteen_minute",
      "queryGranularity": "NONE"
    }
  },
  "tuningConfig": {
    "type": "kafka",
    "maxRowsPerSegment": 5000000
  },
  "ioConfig": {
    "topic": "cms_rt",
    "consumerProperties": {
      "bootstrap.servers": "11.111.114.11:9092"
    },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT1H"
  }
}

Hi Soumya,

Can you check the following:

  1. Do you have any drop rules configured? You can view the drop rules in the Druid web/coordinator console.

  2. Did you observe any errors/exceptions in the Historical logs or in the logs of any other process?
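Retention rules can also be inspected over HTTP from the Coordinator (GET /druid/coordinator/v1/rules). As an illustrative sketch only — the period and tier values below are made up, not taken from this thread — a rule set like the following would keep one day of data loaded and drop everything older, which produces exactly this "segments disappear after a while" behaviour:

```json
[
  {
    "type": "loadByPeriod",
    "period": "P1D",
    "tieredReplicants": { "_default_tier": 1 }
  },
  {
    "type": "dropForever"
  }
]
```

If no datasource-specific rules exist, the cluster-wide _default rules apply, so those are worth checking as well.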

Thanks,

Sashi

Hi Sashidhar,

We have not configured any drop rules. And where do we find the Historical logs? I have only been checking the indexing logs.