Use Spark to push/pull data to/from Druid

Hi Guys,

Are there Spark connectors that can ingest data into Druid?

I went through some of them available online such as

  1)  [https://github.com/metamx/druid-spark-batch](https://github.com/metamx/druid-spark-batch)

  2)  [https://www.linkedin.com/pulse/combining-druid-spark-interactive-flexible-analytics-scale-butani/](https://www.linkedin.com/pulse/combining-druid-spark-interactive-flexible-analytics-scale-butani/)   - SparkBI

  3)  [https://github.com/metamx/tranquility/blob/0.8.2-mmx/docs/spark.md](https://github.com/metamx/tranquility/blob/0.8.2-mmx/docs/spark.md)

Options 1) and 2) seem a bit immature from an adoption perspective.

Is 3) the only way to go then? Are there any other connectors?

My intention is to assess the feasibility of a Spark connector for:

a. reading from Druid

b. ingesting into Druid
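For the reading side (a), one option besides the connectors above is Druid's SQL endpoint, which speaks the Avatica JDBC protocol, so Spark's generic JDBC data source can query it. Below is a minimal PySpark sketch of that idea; the broker host/port, table name, and the helper function names are my own illustration, and it assumes the Avatica JDBC driver jar is on the Spark classpath and a Druid broker is reachable:

```python
def druid_jdbc_url(broker_host: str, broker_port: int = 8082) -> str:
    """Build the Avatica JDBC URL for Druid's SQL endpoint on a broker."""
    return (f"jdbc:avatica:remote:url="
            f"http://{broker_host}:{broker_port}/druid/v2/sql/avatica/")

def read_druid_datasource(spark, broker_host: str, datasource: str):
    """Sketch: load a Druid datasource into a Spark DataFrame via JDBC.

    `spark` is an existing SparkSession; nothing here is executed against
    a real cluster in this snippet.
    """
    return (spark.read
            .format("jdbc")
            .option("url", druid_jdbc_url(broker_host))
            .option("driver", "org.apache.calcite.avatica.remote.Driver")
            .option("dbtable", datasource)  # hypothetical datasource name
            .load())

if __name__ == "__main__":
    # Just show the URL shape; no cluster needed for this part.
    print(druid_jdbc_url("localhost"))
</imports>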

Best Regards

Varaga