I am facing the error below.
0: jdbc:hive2://ip-10-230-245-117.ec2.interna> CREATE EXTERNAL TABLE druid_table_1 STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "metrics-kafka");
Error: Error while compiling statement: FAILED: SemanticException Cannot find class 'org.apache.hadoop.hive.druid.DruidStorageHandler' (state=42000,code=40000)
Also, what value are we supposed to put in druid.datasource?
The title says "Migrating the data from Hive to Druid", but the query you are using actually does roughly the opposite: it creates an external table in Hive that is a view over a datasource that already exists in Druid. So it is not what you want to use here.
If you want to write data from Hive to Druid, you need to create the table as in this example: https://github.com/cartershanklin/hive-druid-ssb/blob/master/queries.druid/index_ssb.sql#L12
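The pattern in that example is a CREATE TABLE AS SELECT (CTAS) using the Druid storage handler. A minimal sketch is below; the table name, column names, and source table are placeholders for your own schema, and the granularity values are just illustrative. The only hard requirement is that the SELECT produces a timestamp column aliased as `__time`, which Druid uses as its time dimension.

```sql
-- Sketch: push Hive data into a new Druid datasource via CTAS.
-- druid_metrics, hive_metrics, and the columns are hypothetical names.
CREATE TABLE druid_metrics
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.segment.granularity" = "DAY",    -- how Druid partitions segments
  "druid.query.granularity"   = "MINUTE"  -- finest query-time rollup
)
AS
SELECT
  CAST(event_time AS timestamp) AS `__time`,  -- required Druid time column
  host,
  metric_value
FROM hive_metrics;
```

Note there is no "druid.datasource" property here: for a CTAS, Hive creates a new Druid datasource named after the table. You only set "druid.datasource" when creating an external table over a datasource that already exists in Druid, which is the opposite direction.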
As an aside, the class-not-found exception occurs because you are connecting to Hive1. The Druid-Hive integration is part of Hive Interactive (Hive2).
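In practice that means pointing beeline at the Hive Interactive (LLAP) JDBC endpoint rather than the classic HiveServer2 one. A sketch, assuming an HDP-style setup where the interactive server listens on port 10500; the hostname and port are placeholders you should take from your own cluster configuration:

```shell
# Connect to the Hive Interactive (Hive2/LLAP) endpoint, not classic Hive1.
# llap-host.example.com:10500 is a placeholder; check hive.server2.thrift.port
# on your interactive HiveServer2 instance.
beeline -u "jdbc:hive2://llap-host.example.com:10500/default"
```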
Since this is a Hive-related question, I would recommend using this platform: https://community.hortonworks.com/index.html — there are more Hive experts over there.
Hope this helps.