Error: Cannot inherit from final class while doing a POC of Druid 0.13

Hi,
I have a strange batch processing issue that only shows up when I POC the new Imply 2.8.13, which is based on Druid 0.13. My payload is simple and the data lives in S3, ingested through EMR; the EMR cluster is running Hadoop 2.8.3.

Payload:

{
  "type": "index_hadoop",
  "hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.8.3", "org.apache.hadoop:hadoop-aws:2.8.3"],
  "spec": {
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "inputFormat": "org.apache.druid.data.input.parquet.DruidParquetInputFormat",
        "paths": "s3n://partition_dt=20180101/"
      }
    },
    "dataSchema": {
      "dataSource": "no_metrics",
      "parser": {
        "type": "parquet",
        "parseSpec": {
          "format": "timeAndDims",
          "timestampSpec": {
            "column": "startmeasurement",
            "format": "auto"
          },
          "dimensionsSpec": {
            "dimensions": [
              "code"
            ],
            "dimensionExclusions": [],
            "spatialDimensions": []
          }
        }
      },
      "metricsSpec": [{
        "type": "count",
        "name": "count"
      }],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "ALL",
        "intervals": ["2018-01-01/2018-01-02"]
      }
    },
    "tuningConfig": {
      "type": "hadoop",
      "partitionsSpec": {
        "targetPartitionSize": 5000000
      },
      "jobProperties": {
        "fs.s3.awsAccessKeyId": "",
        "fs.s3.awsSecretAccessKey": "",
        "fs.s3.impl": "org.apache.hadoop.fs.s3native.NativeS3FileSystem",
        "fs.s3n.awsAccessKeyId": "",
        "fs.s3n.awsSecretAccessKey": "",
        "fs.s3n.impl": "org.apache.hadoop.fs.s3native.NativeS3FileSystem",
        "io.compression.codecs": "org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.SnappyCodec",
        "mapreduce.job.user.classpath.first": "true",
        "mapreduce.job.classloader.system.classes": "-javax.validation.,java.,javax.,org.apache.commons.logging.,org.apache.log4j.,org.apache.hadoop.",
        "parquet.avro.add-list-element-records": "false"
      },
      "leaveIntermediate": true
    }
  }
}
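
For completeness, I submit the spec to the Overlord task endpoint with a plain HTTP POST, roughly like this (the file name and host below are placeholders for my setup):

    # spec above saved as parquet_index_task.json; OVERLORD_HOST is a placeholder
    curl -X POST -H 'Content-Type: application/json' \
         -d @parquet_index_task.json \
         http://OVERLORD_HOST:8090/druid/indexer/v1/task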

All of my extensions are loaded properly:

from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='dist/druid/extensions', useExtensionClassloaderFirst=false, hadoopDependenciesDir='dist/druid/hadoop-dependencies', hadoopContainerDruidClasspath='null', addExtensionsToHadoopContainer=false, loadList=[druid-histogram, druid-datasketches, druid-kafka-indexing-service, druid-parser-route, imply-utility-belt, druid-avro-extensions, druid-parquet-extensions, druid-s3-extensions]}]

and EMR is picking up the job.
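
For reference, the extension settings in my common.runtime.properties that produce the ExtensionsConfig above look like this (values copied from the log line; the file location is the standard Druid/Imply one, so take that part as an assumption):

    druid.extensions.directory=dist/druid/extensions
    druid.extensions.hadoopDependenciesDir=dist/druid/hadoop-dependencies
    druid.extensions.loadList=["druid-histogram", "druid-datasketches", "druid-kafka-indexing-service", "druid-parser-route", "imply-utility-belt", "druid-avro-extensions", "druid-parquet-extensions", "druid-s3-extensions"]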

But the index job gives me:
2019-02-07T13:30:00,758 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_1549537550942_0009 running in uber mode : false
2019-02-07T13:30:00,759 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - map 0% reduce 0%
2019-02-07T13:30:08,808 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000000_0, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:08,822 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000001_0, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:09,828 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000002_0, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:10,840 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000003_0, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:15,857 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000000_1, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:17,865 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000002_1, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:18,874 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000001_1, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:18,875 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000003_1, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:21,885 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000000_2, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:25,898 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000002_2, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:25,899 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000001_2, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:26,904 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1549537550942_0009_m_000003_2, Status : FAILED
Error: Cannot inherit from final class
2019-02-07T13:30:30,915 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - map 100% reduce 100%

And the Hadoop task log fails with this error:

2019-02-07T11:11:55,399 ERROR [main] org.apache.hadoop.mapred.YarnChild - Error running child : java.lang.VerifyError: Cannot inherit from final class
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.jets3t.service.utils.RestUtils.createDefaultHttpParams(RestUtils.java:574)
	at org.jets3t.service.utils.RestUtils.initHttpConnection(RestUtils.java:298)
	at org.jets3t.service.impl.rest.httpclient.RestStorageService.initHttpConnection(RestStorageService.java:209)
	at org.jets3t.service.impl.rest.httpclient.RestStorageService.initializeDefaults(RestStorageService.java:166)
	at org.jets3t.service.StorageService.<init>(StorageService.java:125)
	at org.jets3t.service.impl.rest.httpclient.RestStorageService.<init>(RestStorageService.java:153)
	at org.jets3t.service.S3Service.<init>(S3Service.java:91)
	at org.jets3t.service.impl.rest.httpclient.RestS3Service.<init>(RestS3Service.java:157)
	at org.jets3t.service.impl.rest.httpclient.RestS3Service.<init>(RestS3Service.java:131)
	at org.jets3t.service.impl.rest.httpclient.RestS3Service.<init>(RestS3Service.java:109)
	at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.initialize(Jets3tNativeFileSystemStore.java:85)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346)
	at org.apache.hadoop.fs.s3native.$Proxy31.initialize(Unknown Source)
	at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:335)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2859)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2896)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2878)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:392)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
	at org.apache.parquet.hadoop.util.HadoopInputFile.fromPath(HadoopInputFile.java:38)
	at org.apache.parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:163)
	at org.apache.parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:140)
	at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:84)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:557)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:795)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)

Does anyone have a lead for me? What am I doing wrong?

Alon

Tested on a local Hadoop cluster as well: same issue.
If I remove the "hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.8.3", "org.apache.hadoop:hadoop-aws:2.8.3"] line, I get:

Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3native.NativeS3FileSystem not found
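
(I assume NativeS3FileSystem is supposed to come from the hadoop-aws artifact that those coordinates pull in, so as a sanity check I can list the jar contents; the path below is guessed from the pull-deps hadoop-dependencies layout:

    jar tf dist/druid/hadoop-dependencies/hadoop-aws/2.8.3/hadoop-aws-2.8.3.jar | grep NativeS3FileSystem
)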

I also played around and trimmed the payload down to a minimum, and got a new and unclear message:

Error in custom provider, java.lang.NoSuchFieldError: INSTANCE

Now, how do I know whether this comes from the changes I made, or whether it is actually an improvement?

2019-02-07 17:05:56,663 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.ExceptionInInitializerError
	at org.apache.druid.data.input.parquet.simple.DruidParquetReadSupport.getPartialReadSchema(DruidParquetReadSupport.java:51)
	at org.apache.druid.data.input.parquet.simple.DruidParquetReadSupport.init(DruidParquetReadSupport.java:92)
	at org.apache.parquet.hadoop.InternalParquetRecordReader.initialize(InternalParquetRecordReader.java:199)
	at org.apache.parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:182)
	at org.apache.parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:140)
	at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:84)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:557)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:795)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) Error in custom provider, java.lang.NoSuchFieldError: INSTANCE
  at org.apache.druid.storage.s3.S3StorageDruidModule.getAmazonS3Client(S3StorageDruidModule.java:131) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.s3.S3StorageDruidModule)
  at org.apache.druid.storage.s3.S3StorageDruidModule.getAmazonS3Client(S3StorageDruidModule.java:131) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.s3.S3StorageDruidModule)
  while locating org.apache.druid.storage.s3.ServerSideEncryptingAmazonS3
    for the 1st parameter of org.apache.druid.storage.s3.S3DataSegmentPusher.<init>(S3DataSegmentPusher.java:57)
  while locating org.apache.druid.storage.s3.S3DataSegmentPusher
  at org.apache.druid.storage.s3.S3StorageDruidModule.configure(S3StorageDruidModule.java:108) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.s3.S3StorageDruidModule)
  while locating org.apache.druid.segment.loading.DataSegmentPusher annotated with @com.google.inject.multibindings.Element(setName=,uniqueId=79, type=MAPBINDER, keyType=java.lang.String)
  at org.apache.druid.guice.PolyBind.createChoice(PolyBind.java:70) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> org.apache.druid.guice.LocalDataStorageDruidModule)
  while locating org.apache.druid.segment.loading.DataSegmentPusher

1 error
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1028)
	at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1054)
	at org.apache.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:124)
	... 14 more
Caused by: java.lang.NoSuchFieldError: INSTANCE
	at org.apache.http.impl.io.DefaultHttpRequestWriterFactory.<init>(DefaultHttpRequestWriterFactory.java:52)
	at org.apache.http.impl.io.DefaultHttpRequestWriterFactory.<init>(DefaultHttpRequestWriterFactory.java:56)
	at org.apache.http.impl.io.DefaultHttpRequestWriterFactory.<clinit>(DefaultHttpRequestWriterFactory.java:46)
	at org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.<init>(ManagedHttpClientConnectionFactory.java:83)
	at org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.<init>(ManagedHttpClientConnectionFactory.java:96)
	at org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.<init>(ManagedHttpClientConnectionFactory.java:105)
	at org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.<clinit>(ManagedHttpClientConnectionFactory.java:63)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager$InternalConnectionFactory.<init>(PoolingHttpClientConnectionManager.java:586)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.<init>(PoolingHttpClientConnectionManager.java:180)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.<init>(PoolingHttpClientConnectionManager.java:164)
	at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:73)
	at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:58)
	at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:50)
	at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:38)
	at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:315)
	at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:299)
	at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:172)
	at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:627)
	at com.amazonaws.services.s3.AmazonS3Builder$1.apply(AmazonS3Builder.java:35)
	at com.amazonaws.services.s3.AmazonS3Builder$1.apply(AmazonS3Builder.java:32)
	at com.amazonaws.services.s3.AmazonS3ClientBuilder.build(AmazonS3ClientBuilder.java:64)
	at com.amazonaws.services.s3.AmazonS3ClientBuilder.build(AmazonS3ClientBuilder.java:28)
	at com.amazonaws.client.builder.AwsSyncClientBuilder.build(AwsSyncClientBuilder.java:46)
	at org.apache.druid.storage.s3.S3StorageDruidModule.getAmazonS3Client(S3StorageDruidModule.java:147)
	at org.apache.druid.storage.s3.S3StorageDruidModule$$FastClassByGuice$$3f7cfccc.invoke(<generated>)
	at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.doProvision(ProviderMethod.java:264)
	at com.google.inject.internal.ProviderMethod$Factory.provision(ProviderMethod.java:401)
	at com.google.inject.internal.ProviderMethod$Factory.get(ProviderMethod.java:376)
	at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
	at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
	at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:194)
	at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
	at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38)
	at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62)
	at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:110)
	at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:90)
	at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:268)
	at com.google.inject.internal.FactoryProxy.get(FactoryProxy.java:56)
	at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
	at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
	at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:194)
	at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:1019)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1015)
	at com.google.inject.spi.ProviderLookup$1.get(ProviderLookup.java:104)
	at com.google.inject.spi.ProviderLookup$1.get(ProviderLookup.java:104)
	at com.google.inject.multibindings.MapBinder$RealMapBinder$ValueProvider.get(MapBinder.java:821)
	at org.apache.druid.guice.PolyBind$ConfiggedProvider.get(PolyBind.java:204)
	at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:81)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:53)
	at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:61)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:45)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:1019)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1085)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1015)
	... 16 more