Is the Docker installation a must for Hadoop-based ingestion?

Hi everyone, I wanted to know whether the Docker installation is a must for loading data from HDFS into Druid.

When I specify the HDFS paths in the Paths tab in Druid to load the data, it throws an error. Is that because I haven’t installed Docker, or because of some other issue?

Please help me understand Hadoop-based ingestion. Your help is much appreciated; thanks in advance.

Welcome to the community @Nalina.
The Docker installation is mainly for the quickstart; it is not mandatory.
Can you post your ingestion spec (mainly the ioConfig) and also the error you get?
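
For reference, the ioConfig for Hadoop-based ingestion with an HDFS input usually looks roughly like the sketch below; the namenode host, port, and path are placeholders, not values from your cluster:

```json
"ioConfig": {
  "type": "hadoop",
  "inputSpec": {
    "type": "static",
    "paths": "hdfs://namenode-host:8020/path/to/input/data.json"
  }
}
```

The paths value is a plain comma-separated string of URIs, so the whole string has to parse cleanly as URIs (no stray spaces).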

Thanks!

Hi, can you help me with how to define the path?

The error I get is:
Error: HTML Error:java.lang.IllegalArgumentException: java.net.URISyntaxException: Illegal Character in scheme name at index 0:

I defined the path as hdfs://nn1:8088/full path of the file

Did you follow these steps in Druid?
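
Roughly, those steps amount to loading the HDFS extension and placing the Hadoop client configuration XMLs on Druid's classpath. A sketch, assuming a standard Druid layout (the exact directory can differ in your deployment):

```properties
# In common.runtime.properties -- append to your existing loadList rather than replacing it
druid.extensions.loadList=["druid-hdfs-storage"]
```

Then copy core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml from the Hadoop cluster into Druid's _common configuration directory and restart the Druid services so the hdfs:// scheme can be resolved.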

No Senthil, I haven’t followed those steps. I will include those configurations. Thank you very much for your help.

Hi, we have all the above configurations in place but are still getting the same error I mentioned above. Is the path definition format correct? Your help is very much appreciated.

Hi Nalina, please make sure there is no space in the path when you define it.
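
To make that concrete: the "Illegal character in scheme name at index 0" message from java.net.URI is typically what you get when there is a stray leading space (or other character) before hdfs://, and spaces inside the path break URI parsing as well. A hypothetical before/after (the file name here is made up):

```
# Likely to fail: leading space before the scheme and a space inside the path
"paths": " hdfs://nn1:8088/data/my events.json"

# Should parse: no spaces anywhere in the URI
"paths": "hdfs://nn1:8020/data/my_events.json"
```

Also double-check the port: 8088 is usually the YARN ResourceManager web UI, while the HDFS NameNode RPC port is commonly 8020 or 9000, depending on your distribution.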