Hi,
I’m trying to stream/read data from a Kafka topic in Avro format that was serialized with the Hortonworks Schema Registry through NiFi, but it doesn’t seem to work for me. It looks like Druid 0.17 only supports the Confluent Schema Registry, via
…
"avroBytesDecoder" : {
"type" : “schema_registry”,
"url" :
}
…
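For what it’s worth, here is a minimal sketch I used to peek at the raw bytes on the topic (this assumes kafka-python is installed and uses the broker/topic from my spec below). As far as I understand, the Confluent serializer prepends a 0x00 magic byte plus a 4-byte schema id, which is what Druid’s schema_registry decoder expects, while the Hortonworks registry serializer writes its own header, so the decoder can’t read it:

# Sketch only: inspect the first bytes of a record to see which wire format it uses.
from kafka import KafkaConsumer  # assumes kafka-python

consumer = KafkaConsumer(
    "location",                        # topic from my ingestion spec
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,
)

for record in consumer:
    header = record.value[:5]
    if header and header[0] == 0x00:
        # Confluent wire format: 0x00 magic byte + 4-byte big-endian schema id
        print("Confluent wire format, schema id:", int.from_bytes(header[1:5], "big"))
    else:
        # Anything else (e.g. the Hortonworks registry header) won't match what
        # Druid's schema_registry avroBytesDecoder expects
        print("Not Confluent wire format; first byte is", header[0] if header else None)
    break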
My Kafka JSON ingestion spec:
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "location",
    "parser": {
      "type": "avro_stream",
      "avroBytesDecoder": {
        "type": "schema_registry",
        "url": "http://127.0.0.1:9090/api/v1"
      },
      "parseSpec": {
        "format": "avro",
        "timestampSpec": {
          "column": "timestamp",
          "format": "auto"
        },
        "dimensionsSpec": {
          "dimensions": [
            "view"
          ]
        }
      }
    },
    "metricsSpec": [
      {"type": "count", "name": "countagg"}
    ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "HOUR",
      "queryGranularity": "HOUR",
      "rollup": false,
      "intervals": null
    }
  },
  "tuningConfig": {
    "type": "kafka",
    "reportParseExceptions": false,
    "offsetFetchPeriod": "PT120S",
    "logParseExceptions": true
  },
  "ioConfig": {
    "useEarliestOffset": true,
    "topic": "location",
    "replicas": 1,
    "taskDuration": "PT120M",
    "completionTimeout": "PT240M",
    "consumerProperties": {
      "bootstrap.servers": "localhost:9092"
    }
  }
}
Attached is the error I’m getting.
Regards,
/
error.log (34.8 KB)