Spark Tranquility 0.8.2: Caused by: java.lang.ClassNotFoundException: SmileFactory

Hi guys,

I am using tranquility-spark 0.8.2 for a runtime task, with Spark version spark-2.1.0-bin-hadoop2.7.

When I submit the jar to Spark, I get the following exception:

17/03/06 11:20:26 WARN TaskSetManager: Lost task 11.0 in stage 23.0 (TID 2490, 172.30.9.174, executor 9): java.lang.NoClassDefFoundError: com/fasterxml/jackson/dataformat/smile/SmileFactory
	at com.spark.report.dal.dao.SimpleEventBeamFactory$.<init>(SimpleEventBeamFactory.scala:37)
	at com.spark.report.dal.dao.SimpleEventBeamFactory$.<clinit>(SimpleEventBeamFactory.scala)
	at com.spark.report.dal.dao.SimpleEventBeamFactory.makeBeam(SimpleEventBeamFactory.scala:23)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:44)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:43)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:99)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.dataformat.smile.SmileFactory
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 15 more

I have configured the jar path in the Spark config properties. This is how I submit the job:

/opt/spark-2.1.0-bin-hadoop2.7/bin/spark-submit --class com.spark.druid.MainApp --master spark://xxx.xxx.xxx.xxx:7777 --executor-memory 6g --total-executor-cores 44 --deploy-mode client /home/ubuntu/runnable_main.jar
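The missing class lives in the jackson-dataformat-smile artifact, which does not seem to be on the executor classpath. One variant I have been considering is shipping that artifact explicitly at submit time with `--packages` (the Jackson version 2.4.6 below is my guess, not something I have confirmed against tranquility's dependency tree):

```shell
# Same submit command, plus --packages so spark-submit resolves the
# Smile dataformat jar from Maven and adds it to the driver and
# executor classpaths. The version should match the Jackson version
# that tranquility/Spark pull in -- 2.4.6 here is an assumption.
/opt/spark-2.1.0-bin-hadoop2.7/bin/spark-submit \
  --class com.spark.druid.MainApp \
  --master spark://xxx.xxx.xxx.xxx:7777 \
  --executor-memory 6g \
  --total-executor-cores 44 \
  --deploy-mode client \
  --packages com.fasterxml.jackson.dataformat:jackson-dataformat-smile:2.4.6 \
  /home/ubuntu/runnable_main.jar
```

The other option I can think of is building runnable_main.jar as a fat jar (e.g. with sbt-assembly) so the Smile classes are bundled inside it, but I am not sure which approach is recommended here.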

I have spent a lot of time trying to figure this out but have not been able to find a solution. Can someone please help me resolve this?

Thanks

Jitesh