[PROBLEM] Cluster Configuration

Hi everyone!

I have a cluster running on CentOS with two data servers, A and B (everything is as described in the tutorial).

I deleted the var folder of every server before starting the cluster. When I start it, everything looks fine: I have my two Historicals, etc.

Then I try to load data (the Wikipedia JSON folder provided in the quickstart tutorial), and only Historical A remains.

So I go into the Historical logs on B and see:

2019-07-04T07:46:14,482 ERROR [ZKCoordinator-1] org.apache.druid.server.coordination.SegmentLoadDropHandler - Failed to load segment for dataSource: {class=org.apache.druid.server.coordination.SegmentLoadDropHandler, exceptionType=class org.apache.druid.segment.loading.SegmentLoadingException, exceptionMessage=Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-07-04T07:38:23.161Z], segment=DataSegment{size=4322529, shardSpec=NumberedShardSpec{partitionNum=0, partitions=0}, metrics=[count, sum_added, sum_deleted, sum_delta, sum_metroCode], dimensions=[channel, cityName, comment, countryIsoCode, countryName, isAnonymous, isMinor, isNew, isRobot, isUnpatrolled, namespace, page, regionIsoCode, regionName, user], version='2019-07-04T07:38:23.161Z', loadSpec={type=>local, path=>/home/clement/Documents/apache-druid-0.15.0-incubating/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-07-04T07:38:23.161Z/0/index.zip}, interval=2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}}
org.apache.druid.segment.loading.SegmentLoadingException: Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-07-04T07:38:23.161Z]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:268) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.addSegment(SegmentLoadDropHandler.java:312) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.coordination.SegmentChangeRequestLoad.go(SegmentChangeRequestLoad.java:47) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.coordination.ZkCoordinator.lambda$childAdded$0(ZkCoordinator.java:152) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: java.lang.IllegalArgumentException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/home/clement/Documents/apache-druid-0.15.0-incubating/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-07-04T07:38:23.161Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3459) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
… 8 more
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/home/clement/Documents/apache-druid-0.15.0-incubating/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-07-04T07:38:23.161Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.wrapException(StdValueInstantiator.java:399) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:231) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:135) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:442) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3454) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating.jar:0.15.0-incubating]
at org.apache.druid.server.coordination.

Hello,

Can you tell me how your deep storage is configured? Using 'local' for deep storage only works for a single node; as soon as you go to a clustered setup, you will need to use S3, HDFS, or an NFS mount for your deep storage directory. If node A loads the data to that path, Druid expects all Historical nodes to have access to the data at the same path.
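For reference, deep storage is configured in conf/druid/cluster/_common/common.runtime.properties on every node. A sketch of what the change might look like, assuming an HDFS deep storage directory at /druid/segments (the extension name and property keys are standard Druid ones; the directory is a placeholder, not taken from your setup):

```properties
# Comment out the single-node local deep storage:
# druid.storage.type=local
# druid.storage.storageDirectory=var/druid/segments

# Load the HDFS deep storage extension (it must also be present under extensions/)
druid.extensions.loadList=["druid-hdfs-storage"]

# Point deep storage at a location that every data node can reach
druid.storage.type=hdfs
druid.storage.storageDirectory=/druid/segments
```

An NFS mount works the same way, except you keep druid.storage.type=local and set druid.storage.storageDirectory to the mounted path, since every node then genuinely sees the same files.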

Let me know if this doesn’t make sense.

Thanks,

Ben

Hi Benjamin, in fact you may be right. I thought I could use local storage even with the clustered setup.

You say "If node A is loading the data to that path, it expects all historical nodes to have access to the data at the same path", but using local storage my two Historical nodes do have access to the data at the same path.
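One thing worth checking here: with druid.storage.type=local, the path is the same string on both machines, but it refers to each machine's own local disk, so a segment pushed by node A never appears on node B. A quick sketch to verify (the directory below is taken from the error log; run this on each host):

```shell
# With local deep storage, this directory lives on each node's own disk.
# The segment pushed at ingestion time exists only on the node that ran
# the ingestion task, so node B is expected to report it missing.
SEGMENT_DIR=/home/clement/Documents/apache-druid-0.15.0-incubating/var/druid/segments/wikipedia
if [ -d "$SEGMENT_DIR" ] && [ -n "$(ls -A "$SEGMENT_DIR" 2>/dev/null)" ]; then
  echo "segments present on this node"
else
  echo "segments missing on this node"
fi
```

If node B prints "segments missing on this node" while node A prints "segments present", that confirms the two paths are not actually shared storage.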

Thanks !

On Fri, Jul 5, 2019 at 02:27, Benjamin Hopp benjamin.hopp@imply.io wrote: