I used HDFS as deep storage and created an external table named “contacts_info” via dsql. But when I go to my namenode, the table is not found at the exact location where it should exist.
I’m not sure if I’m following you correctly, but dsql is used for querying data from Druid, not for creating dataSources or adding records. To get data into Druid, you would use an indexing task.
Have you tried the Load a file tutorial here: https://druid.apache.org/docs/latest/tutorials/tutorial-batch.html ? That would be a good place to start learning how to ingest data into Druid.
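To give a sense of what an indexing task looks like, here is a minimal native batch ingestion spec, a sketch only: the dataSource name comes from your question, but the input path, column names, and timestamp format are assumptions you’d replace with your own (and the exact spec fields vary by Druid version; recent releases use `inputSource`/`inputFormat` as below):

```json
{
  "type": "index_parallel",
  "spec": {
    "ioConfig": {
      "type": "index_parallel",
      "inputSource": { "type": "local", "baseDir": "quickstart/tutorial", "filter": "contacts.json" },
      "inputFormat": { "type": "json" }
    },
    "dataSchema": {
      "dataSource": "contacts_info",
      "timestampSpec": { "column": "timestamp", "format": "iso" },
      "dimensionsSpec": { "dimensions": ["name", "email"] },
      "granularitySpec": { "segmentGranularity": "day", "queryGranularity": "none", "rollup": false }
    },
    "tuningConfig": { "type": "index_parallel" }
  }
}
```

You would then submit it to the Overlord’s task endpoint, e.g. `curl -X POST -H 'Content-Type: application/json' -d @spec.json http://OVERLORD_HOST:8090/druid/indexer/v1/task` (8090 is the Overlord’s default port). Once the task succeeds, the dataSource becomes queryable from dsql.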
Hi David, I’m trying to set up a Druid cluster deployed across 3 machines: ZooKeeper, the Coordinator, the Overlord, and MySQL as the metadata store on machine1; the Historical and MiddleManager on machine2; and the Broker on machine3. I started ZooKeeper and then tried to start the Coordinator and Overlord with the command below, but it gives an error: “Could not find or load main class io.druid.cli.Main”
java `cat conf/druid/coordinator/jvm.config | xargs` -cp "conf/druid/_common:conf/druid/coordinator:lib/*" org.apache.druid.cli.Main server coordinator
Which version of Druid are you using?
Have you tried using supervise or the other shell scripts available under the bin directory to start the services?
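For reference, the bundled scripts assemble the classpath and config paths for you, which avoids this class of error. A sketch of what that looks like for a clustered deployment, assuming a recent Druid release (the exact script names and supervise config paths vary by version, so check your distribution’s `bin/` and `conf/supervise/` directories):

```shell
# On machine1 (master): starts the Coordinator and Overlord
# (use the master-with-zk variant if this node also runs ZooKeeper)
bin/supervise -c conf/supervise/cluster/master-no-zk.conf

# On machine2 (data): starts the Historical and MiddleManager
bin/supervise -c conf/supervise/cluster/data.conf

# On machine3 (query): starts the Broker (and Router)
bin/supervise -c conf/supervise/cluster/query.conf
```

Recent releases also ship wrapper scripts such as `bin/start-cluster-master-no-zk-server`, `bin/start-cluster-data-server`, and `bin/start-cluster-query-server` that invoke the corresponding supervise configs.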