FAILED: ParseException line 1:27 cannot recognize input near ''org.apache.hadoop.hive.druid.DruidStorageHandler'' 'TBLPROPERTIES' '(' in create table statement hive>

Hi guys!

I'm creating a new Druid datasource from Hive, and I want to populate it from an existing Hive table (sale_orc).

as mentioned in the docs: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.0/using-druid/content/druid_anatomy_of_hive_to_druid.html

Here is the statement, following the docs above:

CREATE TABLE druid_table_4 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.segment.granularity"="MONTH","druid.query.granularity"="DAY")
AS SELECT
cast(saledate as timestamp) as __time,
cast(distcode_locationiD_key as string) distloc_key,
cast(doc_no as string) docno,
cast(skucode_key as string) skucode_key,
cast(order_type as string) ordertype,
cast(doc_date as string) docdate,
cast(ex_factory_value as string) exfact_value,
cast(shopcode_without_dist as string) shopwith_dist,
cast(lfl_ly as string) lfl_ly,
cast(lfl_ty as string) lfl_ty,
cast(distributorcode_dsr_code as string) distcode_dsrcode,
cast(doc_no_key as string) docno_key,
cast(dist_section_key as string) distsec_key,
cast(distributorcode_dsrcode_pjp_sell_category as string) ddpsc,
cast(shopcode_without_dist_saledate as string) shopwith_distsaledate,
cast(cartons as double) cartons,
cast(units as double) units,
cast(net_amount as double) net_amount,
cast(gross_amount as double) gross_amount,
cast(foreign_amount as double) foreign_amount
FROM sales.sale_orc;

When I submit the query, it shows an error: FAILED: ParseException line 1:27 cannot recognize input near ''org.apache.hadoop.hive.druid.DruidStorageHandler'' 'TBLPROPERTIES' '(' in create table statement

I don't know why it is showing this error. Is this due to Hive version compatibility with Druid, or does Hive not support this type of CTAS statement?

I've just successfully created a table over an existing Druid datasource (wikipedia) in Hive with this statement: create external table druid_table1 stored by 'org.apache.hadoop.hive.druid.DruidStorageHandler' tblproperties ("druid.datasource"="wikipedia");

Hi Umar,

It seems your create statement is wrong: "CREATE TABLE druid_table_4 'org.apache.hadoop.hive.druid.DruidStorageHandler'". You are missing the STORED BY keyword before 'org.apache.hadoop.hive.druid.DruidStorageHandler'.

E.g.:

CREATE TABLE ssb_druid_hive
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES

Could you correct it and try again?
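Putting the fix into the statement from the question, the corrected CTAS would look like this (a sketch: the only change is adding STORED BY; the table, column, and property names are taken from the original post and have not been verified against any cluster):

```sql
-- Corrected version of the CTAS from the question.
-- The only change is the STORED BY keyword before the handler class.
CREATE TABLE druid_table_4
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity"   = "DAY"
)
AS SELECT
  cast(saledate as timestamp) as __time,
  cast(distcode_locationiD_key as string) distloc_key,
  -- ... remaining column casts exactly as in the original query ...
  cast(foreign_amount as double) foreign_amount
FROM sales.sale_orc;
```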

Thanks and Regards,

Vaibhav

Thanks vaibhav!
I tried it and it solved that issue, but another error occurred: FAILED: SemanticException No column with timestamp with local time-zone type on query result; one column should be of timestamp with local time-zone type

Can you help me figure this out?

It seems you are hitting:
https://issues.apache.org/jira/browse/HIVE-18156

You may need to check whether your HDP version includes a fix for this, or get it backported. Hope this helps.
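One possible workaround suggested by the error text itself: on Hive versions that support the `timestamp with local time zone` type, casting the time column to that type instead of plain `timestamp` may satisfy the check. This is a sketch based on the SemanticException message and HIVE-18156, not verified against any specific HDP build:

```sql
-- Sketch: make the __time column a "timestamp with local time zone",
-- which is the type the SemanticException asks for (assumes a Hive
-- version that supports this type; column names are from the question).
CREATE TABLE druid_table_4
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity"   = "DAY"
)
AS SELECT
  cast(saledate as timestamp with local time zone) as __time,
  -- ... other columns as in the original query ...
  cast(units as double) units
FROM sales.sale_orc;
```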

Thanks and Regards,

Vaibhav

Hi vaibhav!
I could not understand it. I checked the link you provided, but I haven't found a solution yet.

Can you please explain it for me?

Thanks.

Hi Umar,

It seems you are hitting the Hive issue described in my last update. I think the right place to ask about your last error is the Hive/HDP community.

Thanks and regards,

Vaibhav

Thanks Vaibhav!
I'll check on that soon.