I’m seeing an annoying error with a Hadoop task after enabling monitoring metrics; the only new addition in this change is the **sigar jar (sigar-184.108.40.206.jar)**.
I have tried removing all jars from /tmp/druid-indexing/classpath and adding `"mapreduce.job.user.classpath.first": "true"` to the task spec, but nothing seems to help.
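For reference, in a Druid Hadoop index task that property belongs under `tuningConfig.jobProperties`. A minimal sketch of where it sits (the surrounding fields are placeholders, not my actual spec):

```json
{
  "type": "index_hadoop",
  "spec": {
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "mapreduce.job.user.classpath.first": "true"
      }
    }
  }
}
```

Some setups use `"mapreduce.job.classloader": "true"` instead, which isolates the job's classloader from Hadoop's rather than just reordering the classpath.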
I have been working with the same setup for a few months now without any issue.
Druid version - 0.83
**Hadoop - 2.7.2**
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e41
Compiled by jenkins on 2016-01-26T00:08Z
Compiled with protoc 2.5.0
From source with checksum d0fda26633fa762bff87ec759ebe689c
This command was run using /root/ephemeral-hdfs/share/hadoop/common/hadoop-common-2.7.2.jar
**Error: class com.fasterxml.jackson.datatype.guava.deser.HostAndPortDeserializer overrides final method deserialize.(Lcom/fasterxml/jackson/core/JsonParser;Lcom/fasterxml/jackson/databind/DeserializationContext;)Ljava/lang/Object;**
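From what I understand, an `overrides final method` error like this usually means two incompatible Jackson versions ended up on the classpath at once, e.g. an older `jackson-databind` bundled by Hadoop or pulled in alongside the sigar jar clashing with the `jackson-datatype-guava` that Druid ships. As a sketch of how one might locate the offending jars, here is a small helper (the directory path and class name below are illustrative, not from my setup):

```python
import pathlib
import sys
import zipfile


def jars_containing(class_path, lib_dir):
    """Return names of jars under lib_dir that bundle the given class file."""
    hits = []
    for jar in pathlib.Path(lib_dir).glob("*.jar"):
        with zipfile.ZipFile(jar) as zf:
            if class_path in zf.namelist():
                hits.append(jar.name)
    return hits


if __name__ == "__main__":
    # Hypothetical usage: scan a Hadoop lib dir for duplicate Jackson classes.
    lib_dir = sys.argv[1] if len(sys.argv) > 1 else "."
    print(jars_containing(
        "com/fasterxml/jackson/databind/JsonDeserializer.class", lib_dir))
```

If more than one jar shows up, those are the candidates for the version clash.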
Thank you.