Hadoop index task successful, but no segments created

Hi,

The job impression-determine_partitions_hashed-Optional.of([2010-08-31T00:00:00.000Z/2019-09-01T00:00:00.000Z]) is successful.

In HDFS I only see old segments, no new ones.

Please advise,

Nicu

The job was started via the overlord.
The task reports a failure with what looks like a connectivity problem to YARN / the job history server. The error occurred after the determine_partitions job completed, which probably stopped the next steps, including the metadata update/create in MySQL and HDFS. Are other jobs needed as well? (Judging by the name, the first job just computes partitions for a second one.)
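To sanity-check the connectivity theory, I can run a small TCP probe against the host/port from the error in the trace below. This is just a throwaway sketch, not part of Druid or Hadoop; the host name in the example is hypothetical:

```python
import socket

def port_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError, timeouts, and DNS failures.
        return False

# Example: port 10020 is the default mapreduce.jobhistory.address port.
# Replace the (hypothetical) host with the machine that should be
# running the Job History Server:
# port_reachable("historyserver.example.com", 10020)
```

If this returns False for the history-server host, the refused call in the trace is expected and the problem is on the Hadoop side, not in the Druid task itself.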

So, what I am giving below is the stack trace from the Hadoop index task log, started through the overlord. The first job ran fine, but the workflow could not continue and the data was not imported into Druid.

Please advise! :frowning:

java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.google.api.client.repackaged.com.google.common.base.Throwables.propagate(Throwables.java:160) ~[google-http-client-1.15.0-rc.jar:?]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:132) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:173) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235) [druid-indexing-service-0.8.1.jar:0.8.1]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214) [druid-indexing-service-0.8.1.jar:0.8.1]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_66]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_66]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_66]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_66]
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    ... 7 more
Caused by: java.lang.RuntimeException: java.io.IOException: java.net.ConnectException: Call From druid1.adswizz.com/10.0.60.211 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:201) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:182) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:84) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:289) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    ... 7 more
Caused by: java.io.IOException: java.net.ConnectException: Call From druid1.adswizz.com/10.0.60.211 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:334) ~[?:?]
    at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:419) ~[?:?]
    at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:524) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:314) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:311) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_66]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_66]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:311) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.isSuccessful(Job.java:611) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1318) ~[?:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:114) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:182) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:84) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:289) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    ... 7 more
Caused by: java.net.ConnectException: Call From druid1.adswizz.com/10.0.60.211 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_66]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_66]
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783) ~[?:?]
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1410) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1359) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) ~[?:?]
    at com.sun.proxy.$Proxy193.getJobReport(Unknown Source) ~[?:?]
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getJobReport(MRClientProtocolPBClientImpl.java:133) ~[?:?]
    at sun.reflect.GeneratedMethodAccessor60.invoke(Unknown Source) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:320) ~[?:?]
    at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:419) ~[?:?]
    at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:524) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:314) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:311) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_66]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_66]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:311) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.isSuccessful(Job.java:611) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1318) ~[?:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:114) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:182) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:84) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:289) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    ... 7 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:1.8.0_66]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:1.8.0_66]
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[?:?]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529) ~[?:?]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493) ~[?:?]
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:601) ~[?:?]
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:696) ~[?:?]
    at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367) ~[?:?]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1377) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1359) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) ~[?:?]
    at com.sun.proxy.$Proxy193.getJobReport(Unknown Source) ~[?:?]
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getJobReport(MRClientProtocolPBClientImpl.java:133) ~[?:?]
    at sun.reflect.GeneratedMethodAccessor60.invoke(Unknown Source) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:320) ~[?:?]
    at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:419) ~[?:?]
    at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:524) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:314) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:311) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_66]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_66]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:311) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.isSuccessful(Job.java:611) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1318) ~[?:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:114) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:182) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:84) ~[druid-indexing-hadoop-0.8.1.jar:0.8.1]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:289) ~[druid-indexing-service-0.8.1.jar:0.8.1]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_66]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_66]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_66]
    at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_66]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:129) ~[druid-indexing-service-0.8.1.jar:0.8.1]

Hello, no answer? :frowning:

Hi, does the advice on http://wiki.apache.org/hadoop/ConnectionRefused help at all?
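For reference: the 0.0.0.0:10020 target in the trace is the default value of mapreduce.jobhistory.address, which usually means the client never learned the real Job History Server host. A sketch of the property that typically needs to be set in mapred-site.xml on the machine submitting the job, assuming a standard Hadoop 2.x setup (the host name below is a placeholder):

```xml
<!-- mapred-site.xml (client side): point job-status lookups at the real
     Job History Server instead of the 0.0.0.0 default. Replace the
     placeholder host with your actual history-server machine. -->
<property>
  <name>mapreduce.jobhistory.address</name>
  <value>historyserver.example.com:10020</value>
</property>
```

Even with this set, the history server (and the YARN ResourceManager) must of course actually be running for the lookup to succeed.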

Hi, indeed, the YARN resource manager was dead. Now I hope it will work, thanks!