Annoying error with Hadoop task

Hello,

I’m seeing an annoying error with a Hadoop task after enabling monitoring metrics. The only new addition in this change is the **sigar jar (sigar-1.6.5.132.jar)**.
I have tried removing all jars from /tmp/druid-indexing/classpath and adding `"mapreduce.job.user.classpath.first": "true"` to the task spec, but nothing seems to help.
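
For reference, the property was added under `tuningConfig.jobProperties` in the index_hadoop task spec, roughly like the abridged sketch below (the datasource name and input path are placeholders, not my real spec):

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "example_datasource"
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "hdfs://namenode:8020/path/to/input"
      }
    },
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "mapreduce.job.user.classpath.first": "true"
      }
    }
  }
}
```

From what I understand, that property is supposed to make the task's own jars take precedence over the Hadoop-provided ones on the job classpath, which is why I expected it to work around the conflict.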

I have been working with the same setup for a few months now without any issues.

Druid version - 0.8.3

**Hadoop - 2.7.2**

hadoop version

Hadoop 2.7.2

Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e41

Compiled by jenkins on 2016-01-26T00:08Z

Compiled with protoc 2.5.0

From source with checksum d0fda26633fa762bff87ec759ebe689c

This command was run using /root/ephemeral-hdfs/share/hadoop/common/hadoop-common-2.7.2.jar

**Error: class com.fasterxml.jackson.datatype.guava.deser.HostAndPortDeserializer overrides final method deserialize.(Lcom/fasterxml/jackson/core/JsonParser;Lcom/fasterxml/jackson/databind/DeserializationContext;)Ljava/lang/Object;**

Thank you.

Do things work if you disable metrics monitoring?
It looks like a configuration error, probably not related to enabling monitoring.

Can you also share your task spec and the MiddleManager runtime.properties?

Hi Nishant,

Thank you for the mail. I was able to find the issue: we have central provisioning via Salt, and due to a bug it updated `yarn.application.classpath` for all Spark/Hadoop clusters.
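
In case anyone else runs into this: the fix was simply restoring the stock classpath value. As far as I know, the usual Apache Hadoop 2.x default looks roughly like the sketch below in yarn-site.xml, but please verify against the yarn-default.xml shipped with your distribution rather than copying this verbatim:

```xml
<!-- Restored yarn.application.classpath; the entries below are the usual
     Apache Hadoop 2.x defaults and may differ per distribution -
     check yarn-default.xml for your build -->
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
  </value>
</property>
```

The bad value is, I assume, what pulled the conflicting Jackson jars onto the job classpath and triggered the error above.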