sandeep

Hi Guys,

We are using Hortonworks Ambari, and I have installed Druid on it. I would like to know how to open the Druid GUI and how to test and access Druid. Could you please help me?

Regards,

Sandeep

Hi

This page has docs about how to install it from Ambari: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_data-access/content/ch_using-druid.html

Then, from the Ambari UI, you will have links to the various Druid UIs.

PastedGraphic-1.pdf (129 KB)

Hi,

I am unable to view the Druid Coordinator page after clicking on the Druid Coordinator Console link in Ambari; I get a "page cannot be loaded" error.

I am also stuck on sending data from Hive to Druid.

Could you please send me sample data for the Hive-to-Druid workflow?

I am confused; please help me out.

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_data-access/content/ch_using-druid.html

I got stuck at this step:

Access Druid with a curl command. When you run the command, you must include the SPNEGO protocol --negotiate argument. (Note that this argument has double hyphens.) The following is an example command. Replace anyUser, cookies.txt, and endpoint with your real values.

curl --negotiate -u:anyUser -b ~/cookies.txt -c ~/cookies.txt -X POST -H 'Content-Type: application/json' http://endpoint

I am unable to execute this.

Submit a query to Druid in the following example curl command format:

curl --negotiate -u:anyUser -b ~/cookies.txt -c ~/cookies.txt -X POST -H 'Content-Type: application/json' http://broker-host:port/druid/v2/?pretty -d @query.json

I am unable to execute this.
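For what it's worth, a minimal query.json payload for that command could look like the following sketch (a Druid native timeseries query; the dataSource name and interval are placeholders you would replace with your own):

```json
{
  "queryType": "timeseries",
  "dataSource": "druid_table_1",
  "granularity": "all",
  "intervals": ["2018-01-01/2019-01-01"],
  "aggregations": [
    { "type": "count", "name": "rows" }
  ]
}
```

A query like this simply counts rows per interval, which is a quick way to confirm the broker can see your datasource at all.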

Put all the Hive data to undergo ETL in an external table. I am unable to execute this.

Run a CREATE TABLE AS SELECT statement to create a new Druid datasource. The following is an example of a statement pushing Hive data to Druid. I am unable to execute this.
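Since the example statement from the docs did not come through above, here is a rough sketch of what that flow typically looks like in Hive (all table names, column names, and paths are illustrative placeholders, not the actual docs example):

```sql
-- Sketch only: names and paths below are placeholders.

-- 1. Stage the source data in an external Hive table.
CREATE EXTERNAL TABLE source_data (
  `event_time` TIMESTAMP,
  `dimension1` STRING,
  `dimension2` STRING,
  `metric1`    INT,
  `metric2`    FLOAT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/tmp/source_data';

-- 2. Create a Druid datasource via CREATE TABLE AS SELECT.
--    The `__time` column is required and must be the first column.
CREATE TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.segment.granularity" = "MONTH")
AS
SELECT CAST(`event_time` AS TIMESTAMP) AS `__time`,
       `dimension1`, `dimension2`, `metric1`, `metric2`
FROM source_data;
```

Note that the CTAS step is what actually pushes segments to Druid, so it is also the step that fails if the storage handler cannot reach the Druid metadata store (as in the error further down this thread).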

Regards,
Sandeep

Hi Team,

I am facing an error while creating a table. Could you please help us?

[root@ip-10-230-245-117 hive]# beeline -u
"jdbc:hive2://ip-10-230-245-117.ec2.internal:2181,ip-10-230-245-1.ec2.internal:2181,ip-10-230-245-193.ec2.internal:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2"
Connecting to jdbc:hive2://ip-10-230-245-117.ec2.internal:2181,ip-10-230-245-1.ec2.internal:2181,ip-10-230-245-193.ec2.internal:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2
Connected to: Apache Hive (version 2.1.0.2.6.3.0-235)
Driver: Hive JDBC (version 1.2.1000.2.6.3.0-235)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1000.2.6.3.0-235 by Apache Hive
0: jdbc:hive2://ip-10-230-245-117.ec2.interna> CREATE TABLE druid_table_1
0: jdbc:hive2://ip-10-230-245-117.ec2.interna> (`__time` TIMESTAMP,
`dimension1` STRING, `dimension2` STRING, `metric1` INT, `metric2`
FLOAT)
0: jdbc:hive2://ip-10-230-245-117.ec2.interna> STORED BY
'org.apache.hadoop.hive.druid.DruidStorageHandler';
INFO : Compiling
command(queryId=hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff):
CREATE TABLE druid_table_1
(`__time` TIMESTAMP, `dimension1` STRING, `dimension2` STRING,
`metric1` INT, `metric2` FLOAT)
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
INFO : We are setting the hadoop caller context from
HIVE_SSN_ID:4b1c47c9-e2e1-4085-ab04-f2165b7eb056 to
hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling
command(queryId=hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff);
Time taken: 0.008 seconds
INFO : We are resetting the hadoop caller context to
HIVE_SSN_ID:4b1c47c9-e2e1-4085-ab04-f2165b7eb056
INFO : Setting caller context to query id
hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff
INFO : Executing
command(queryId=hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff):
CREATE TABLE druid_table_1
(`__time` TIMESTAMP, `dimension1` STRING, `dimension2` STRING,
`metric1` INT, `metric2` FLOAT)
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException:
java.sql.SQLException: Cannot create PoolableConnectionFactory
(Communications link failure

The last packet sent successfully to the server was 0 milliseconds
ago. The driver has not received any packets from the server.)
INFO : Resetting the caller context to
HIVE_SSN_ID:4b1c47c9-e2e1-4085-ab04-f2165b7eb056
INFO : Completed executing
command(queryId=hive_20180419060747_4845a8b1-02e1-4f1b-bdf2-77c2ce6d94ff);
Time taken: 253.44 seconds
Error: Error while processing statement: FAILED: Execution Error,
return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException:
java.sql.SQLException: Cannot create PoolableConnectionFactory
(Communications link failure

The last packet sent successfully to the server was 0 milliseconds
ago. The driver has not received any packets from the server.)
(state=08S01,code=1)
0: jdbc:hive2://ip-10-230-245-117.ec2.interna>
0: jdbc:hive2://ip-10-230-245-117.ec2.interna>

Regards,
Sandeep

This seems to be a Hive issue; it would be better to send such emails to the Hive user group, or you can post your question here: https://community.hortonworks.com/topics/hcc.html

Looking at the error, it seems you are missing configuration for the Druid-Hive integration: you need to set the Druid metadata URI, username, and password.

Something like this: https://github.com/cartershanklin/hive-druid-ssb/blob/master/queries.druid/index_ssb.sql#L1-L3
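The relevant Hive settings are likely along these lines (a sketch: the host, port, database name, and credentials below are placeholders you must replace with your own Druid metadata store values):

```sql
-- Point Hive's Druid storage handler at the Druid metadata store.
-- Host, port, database, and credentials here are placeholders.
SET hive.druid.metadata.uri=jdbc:mysql://metadata-host:3306/druid;
SET hive.druid.metadata.username=druid;
SET hive.druid.metadata.password=druid_password;
```

If these are unset or point at the wrong host, the CTAS fails with exactly the "Cannot create PoolableConnectionFactory (Communications link failure)" error shown in the beeline log above, because Hive cannot open a JDBC connection to the metadata database.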

Hi Sandeep,

I am facing the exact same issue as you.

Sorry for this out-of-the-blue message, but I just can't seem to make it work, and there seems to be no support from Hortonworks. I hope you'll be able to help.

Looking forward to your response.