Hive to Druid connectivity

There is no direct connectivity from our Power BI app to Druid, so we are trying to go through Hive.
I set up a Hive Docker instance and configured hive.druid.broker.address.default=ip:8082.
When I run a CREATE EXTERNAL TABLE statement I get the error below. Any ideas?
This is a POC Druid cluster, so I am using Derby as the metastore.
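For reference, this is the shape of the broker setting described above as it would appear in hive-site.xml (a minimal sketch; "ip" is a placeholder for the actual broker host from my setup):

```xml
<!-- hive-site.xml: point Hive's Druid storage handler at the Druid broker.
     "ip" is a placeholder for the real broker hostname or address. -->
<property>
  <name>hive.druid.broker.address.default</name>
  <value>ip:8082</value>
</property>
```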

Error: org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.util.ArrayList out of VALUE_STRING token
at [Source: org.apache.hive.druid.com.metamx.http.client.io.AppendableByteArrayInputStream@60840f41; line: -1, column: 0]
    at org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:857)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:853)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.handleNonArray(CollectionDeserializer.java:292)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:227)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2803)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.submitMetadataRequest(DruidSerDe.java:278)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.initialize(DruidSerDe.java:182)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:533)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:442)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:429)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:281)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:263)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
)
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.util.ArrayList out of VALUE_STRING token
at [Source: org.apache.hive.druid.com.metamx.http.client.io.AppendableByteArrayInputStream@60840f41; line: -1, column: 0]
    at org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:857)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:853)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.handleNonArray(CollectionDeserializer.java:292)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:227)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2803)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.submitMetadataRequest(DruidSerDe.java:278)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.initialize(DruidSerDe.java:182)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:533)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:442)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:429)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:281)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:263)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:862)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
    ... 11 more
Caused by: java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.util.ArrayList out of VALUE_STRING token
at [Source: org.apache.hive.druid.com.metamx.http.client.io.AppendableByteArrayInputStream@60840f41; line: -1, column: 0]
    at org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:857)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:853)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.handleNonArray(CollectionDeserializer.java:292)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:227)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2803)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.submitMetadataRequest(DruidSerDe.java:278)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.initialize(DruidSerDe.java:182)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:533)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:442)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:429)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:281)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:263)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:283)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:263)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833)
    ... 22 more
Caused by: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.util.ArrayList out of VALUE_STRING token
at [Source: org.apache.hive.druid.com.metamx.http.client.io.AppendableByteArrayInputStream@60840f41; line: -1, column: 0]
    at org.apache.hive.druid.com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:857)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:853)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.handleNonArray(CollectionDeserializer.java:292)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:227)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:25)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
    at org.apache.hive.druid.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2803)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.submitMetadataRequest(DruidSerDe.java:278)
    at org.apache.hadoop.hive.druid.serde.DruidSerDe.initialize(DruidSerDe.java:182)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:533)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:442)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:429)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:281)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:263)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:867)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:450)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:429)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:281)
    ... 24 more (state=08S01,code=1)

Relates to Apache Druid

Hi @ranjan,

Can you share the CREATE EXTERNAL TABLE command?

Thanks,
Sergio

create external table hive_table_name stored by 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource"="druid table name");
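Not an authoritative diagnosis, but the trace points at DruidSerDe.submitMetadataRequest: Hive issues a segmentMetadata query to the broker during table creation, and Jackson's CollectionDeserializer.handleNonArray fails on a VALUE_STRING token, i.e. the broker replied with a JSON string (often an error message) where an array was expected. A small sketch for checking what the broker actually returns; the datasource name is the placeholder from the DDL above, and the broker host is whatever you set in hive.druid.broker.address.default:

```python
import json

# The query Hive's DruidStorageHandler sends at CREATE TABLE time (sketch).
# POST json.dumps(query) to http://<broker>:8082/druid/v2/ with
# Content-Type: application/json (e.g. via curl) and inspect the raw body.
query = {
    "queryType": "segmentMetadata",
    "dataSource": "druid table name",  # placeholder from the DDL above
    "merge": True,
}

def is_expected_shape(body: str) -> bool:
    """DruidSerDe parses the broker response as a JSON array; anything else
    (e.g. a quoted error string) raises the JsonMappingException seen above."""
    try:
        return isinstance(json.loads(body), list)
    except ValueError:
        return False

print(is_expected_shape('[{"id": "seg1", "columns": {}}]'))  # healthy array response
print(is_expected_shape('"unknown datasource"'))             # shape that breaks Jackson
```

If the raw body is not a JSON array, the problem is on the broker side (wrong datasource name, wrong port, or an error reply) rather than in the Hive DDL itself.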

The Docker image has Hive 2. Do you think Hive 3 would solve the issue?