org.apache.druid.cli.Main tools insert-segment-to-db error

Hi,
2019-04-25T16:17:59,284 INFO [main] org.apache.druid.server.emitter.EmitterModule - Underlying emitter for ServiceEmitter: org.apache.druid.java.util.emitter.core.NoopEmitter@1352434e

2019-04-25T16:17:59,284 INFO [main] org.apache.druid.server.emitter.EmitterModule - Extra service dimensions: {version=0.13.0-incubating}

2019-04-25T16:17:59,285 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.query.ExecutorServiceMonitor@3b545206]

2019-04-25T16:17:59,286 INFO [main] org.apache.druid.server.metrics.MetricsModule - Adding monitor[org.apache.druid.server.initialization.jetty.JettyServerModule$JettyMonitor@26586b74]

2019-04-25T16:17:59,307 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.metadata.MetadataStorageTablesConfig] from props[druid.metadata.storage.tables.] as [org.apache.druid.metadata.MetadataStorageTablesConfig@7f5538a1]

2019-04-25T16:17:59,313 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.metadata.MetadataStorageConnectorConfig] from props[druid.metadata.storage.connector.] as [DbConnectorConfig{createTables=true, connectURI='jdbc:mysql://172.19.104.253:3306/druid-oplate', user='druid', passwordProvider=org.apache.druid.metadata.DefaultPasswordProvider}]

2019-04-25T16:17:59,319 INFO [main] org.apache.druid.guice.JsonConfigurator - Loaded class[class org.apache.druid.metadata.storage.mysql.MySQLConnectorConfig] from props[druid.metadata.mysql.ssl.] as [MySQLConnectorConfig{useSSL='false', clientCertificateKeyStoreUrl='null', clientCertificateKeyStoreType='null', verifyServerCertificate='false', trustCertificateKeyStoreUrl='null', trustCertificateKeyStoreType='null', enabledSSLCipherSuites=null, enabledTLSProtocols=null}]

2019-04-25T16:17:59,375 INFO [main] org.apache.druid.metadata.storage.mysql.MySQLConnector - Configured MySQL as metadata storage

2019-04-25T16:17:59,376 INFO [main] org.apache.druid.cli.InsertSegment - Start searching segments under [hdfs:///hadoop/druid/oplate/segments/wikipedia1]

2019-04-25T16:18:00,315 INFO [main] org.apache.druid.storage.hdfs.HdfsDataSegmentFinder - hdfs

2019-04-25T16:18:00,316 INFO [main] org.apache.druid.storage.hdfs.HdfsDataSegmentFinder - FileSystem URI:hdfs://hadoop

2019-04-25T16:18:20,368 INFO [main] org.apache.hadoop.ipc.Client - Retrying connect to server: hstore/220.250.64.26:8020. Already tried 0 time(s); maxRetries=45

These connection retries occur when running insert-segment-to-db, even though deep storage itself is working normally.

Is insert-segment-to-db not loading the Hadoop core-site.xml configuration?

Command executed:

java -Ddruid.metadata.storage.type=mysql -Ddruid.metadata.storage.connector.connectURI= -Ddruid.metadata.storage.connector.user=druid -Ddruid.metadata.storage.connector.password= -Ddruid.extensions.loadList=["mysql-metadata-storage","druid-hdfs-storage"] -Ddruid.storage.type=hdfs -cp "/home/ant/druid/apache-druid-0.13.0-incubating/lib/*" org.apache.druid.cli.Main tools insert-segment-to-db --workingDir hdfs://hdfshost:port//druid/oplate/segments/wikipedia1 --updateDescriptor true

Hm, since you’re reading from HDFS, I think you may need to put a directory containing your hadoop .xml configs in the classpath when you run the insert-segment-to-db tool.
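For example, the command above could be rerun with the Hadoop conf directory prepended to the classpath. This is only a sketch: the path /etc/hadoop/conf is an assumption (use wherever your cluster's core-site.xml and hdfs-site.xml actually live), and the blank connectURI/password values are carried over from the original command.

```shell
# Sketch: same command as above, but with the Hadoop conf directory
# (path /etc/hadoop/conf is an assumption) prepended to -cp so the
# HDFS client can find core-site.xml and hdfs-site.xml.
java -Ddruid.metadata.storage.type=mysql \
  -Ddruid.metadata.storage.connector.connectURI= \
  -Ddruid.metadata.storage.connector.user=druid \
  -Ddruid.metadata.storage.connector.password= \
  -Ddruid.extensions.loadList='["mysql-metadata-storage","druid-hdfs-storage"]' \
  -Ddruid.storage.type=hdfs \
  -cp "/etc/hadoop/conf:/home/ant/druid/apache-druid-0.13.0-incubating/lib/*" \
  org.apache.druid.cli.Main tools insert-segment-to-db \
  --workingDir hdfs://hdfshost:port//druid/oplate/segments/wikipedia1 \
  --updateDescriptor true
```

With the conf directory on the classpath, the tool should resolve the fs.defaultFS namenode address from core-site.xml instead of falling back to a stale or default one.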

Thank you very much for your reply. Following your suggestion, the problem has been solved.

On Friday, April 26, 2019 at 4:34:36 AM UTC+8, Jonathan Wei wrote: