Schema registry in Druid | Using Kafka as an ingestion source

Hi Team,

I’m trying to ingest data into Druid using Kafka as the ingestion source. The Kafka topics I’m using have a schema registry URL with a specific username and password associated with it. How can I specify these details in the consumer properties? If possible, please add examples showing where the schema registry URL and its username and password should go.

Thanks in advance.
Nikhil

Welcome Nikhil! Maybe EnvironmentVariableDynamicConfigProvider can help? It can be configured to

store passwords or other sensitive information using system environment variables instead of plain text configuration.
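
In the Kafka supervisor spec that could look roughly like this (a sketch based on the Druid docs for the environment variable dynamic config provider – the broker address, property names, and environment variable names here are just placeholders):

"consumerProperties": {
  "bootstrap.servers": "localhost:9092",
  "druid.dynamic.config.provider": {
    "type": "environment",
    "variables": {
      "sasl.jaas.config": "KAFKA_JAAS_CONFIG",
      "ssl.key.password": "SSL_KEY_PASSWORD"
    }
  }
}

Druid then reads the actual secrets from the KAFKA_JAAS_CONFIG and SSL_KEY_PASSWORD environment variables at runtime instead of keeping them in the spec.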

Hey @N_K – Druid uses the standard consumerProperties definition – so I think (having not done this myself!) you will need something like:

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="(username)" password="(password)";
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN

I got this from java - Kafka "Login module not specified in JAAS config" - Stack Overflow
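
In the supervisor spec those properties would sit inside consumerProperties under ioConfig, roughly like this (untested sketch – the topic name and broker address are placeholders):

"ioConfig": {
  "type": "kafka",
  "topic": "your-topic",
  "consumerProperties": {
    "bootstrap.servers": "your-broker:9092",
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.mechanism": "PLAIN",
    "sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"(username)\" password=\"(password)\";"
  }
}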

OH WAIT
To the schema registry?

Oh…
ermmm…
Of that I am not sure… @Hellmar_Becker is much more experienced in these things – I hope he can give you some pointers.

Thanks much Peter.

@Hellmar_Becker – I’d appreciate your help on this, please.

@N_K it’s actually not in the consumer properties but in the avroBytesDecoder/protoBytesDecoder. Here are some examples:

https://blog.hellmar-becker.de/2022/05/26/ingesting-protobuf-messages-into-apache-druid/
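
For an Avro topic with a secured Confluent schema registry, the decoder part of the spec would look roughly like this (a sketch based on the Druid Avro extension docs – the registry URL and credentials are placeholders):

"inputFormat": {
  "type": "avro_stream",
  "avroBytesDecoder": {
    "type": "schema_registry",
    "url": "https://your-schema-registry:8081",
    "config": {
      "basic.auth.credentials.source": "USER_INFO",
      "basic.auth.user.info": "(username):(password)"
    }
  }
}

The config map is passed through to the schema registry client, so any other registry client settings would also go there.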


Thanks @Hellmar_Becker, let me try to implement the same. Could you please tell me whether the same schema registry functionality exists for JSON-format data, like the avroBytesDecoder/protoBytesDecoder we have for Avro and Protobuf?

Not to my knowledge…