[Please Read] Migration to the new extension loading mechanism in 0.9.0

In Druid 0.9, we have refactored the extension loading mechanism. The main reason behind this change is to make Druid load extensions from the local file system without having to download anything from the internet at runtime (see here for more details).

To learn all about the new extension loading mechanism, see Include extensions and Include Hadoop Dependencies. If you are impatient, here is the summary.

The following properties have been deprecated:

**druid.extensions.coordinates**
**druid.extensions.localRepository**
**druid.extensions.remoteRepositories**
Instead, specify **druid.extensions.loadList**, **druid.extensions.directory** and **druid.extensions.hadoopDependenciesDir**.

**druid.extensions.loadList** specifies the list of extensions that Druid will load at runtime. An example would be **druid.extensions.loadList=["druid-datasketches", "mysql-metadata-storage"]**.

**druid.extensions.directory** specifies the directory where all the extensions live. An example would be **druid.extensions.directory=/xxx/extensions**.

**druid.extensions.hadoopDependenciesDir** specifies the directory where all the Hadoop dependencies live. An example would be **druid.extensions.hadoopDependenciesDir=/xxx/hadoop-dependencies**. Note: we did not change the way of specifying which Hadoop version to use, so you just need to make sure the Hadoop version you want exists underneath /xxx/hadoop-dependencies.
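Putting the three properties together, a common.runtime.properties snippet might look like this (the /opt/druid paths are illustrative placeholders, not required locations):

```properties
# Extensions to load at startup; each name must match a folder under druid.extensions.directory
druid.extensions.loadList=["druid-datasketches", "mysql-metadata-storage"]
# Root directory holding one sub-folder per extension
druid.extensions.directory=/opt/druid/extensions
# Root directory holding the Hadoop client jars, one sub-folder per version
druid.extensions.hadoopDependenciesDir=/opt/druid/hadoop-dependencies
```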

You might now wonder whether you have to manually put extensions inside **/xxx/extensions** and **/xxx/hadoop-dependencies**. The answer is no; we have already created them for you. Download the latest Druid tarball from http://druid.io/downloads.html, unpack it, and you will see the **extensions** and **hadoop-dependencies** folders inside. Simply copy them to **/xxx/extensions** and **/xxx/hadoop-dependencies** respectively, and you are all set!
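After copying, the on-disk layout should look roughly like this (the extension and version folder names below are examples; yours will match whatever ships in the tarball you downloaded):

```text
/xxx/extensions/
    druid-datasketches/        <- one folder per extension, containing its jars
    druid-hdfs-storage/
/xxx/hadoop-dependencies/
    hadoop-client/
        2.3.0/                 <- one folder per Hadoop version
```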

If the extension or Hadoop dependency you want to load is not included in the core extensions, you can use pull-deps to download it into your extensions directory.
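For example, a pull-deps invocation to fetch an extension might look like the following (run from the root of your Druid install so lib/ is on the classpath; the extension coordinate shown is just an example):

```shell
java -classpath "lib/*" io.druid.cli.Main tools pull-deps \
  --no-default-hadoop \
  -c io.druid.extensions:mysql-metadata-storage:0.9.0
```

The --no-default-hadoop flag skips downloading the default Hadoop dependency when you only need the extension itself.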

If you want to load your own extension, you can first run **mvn install** to install it into your local Maven repository, and then use pull-deps to download it into your extensions directory.
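Concretely, assuming your extension's Maven coordinates are com.example:my-druid-extension:1.0 (a hypothetical name for illustration), the two steps look like:

```shell
# 1) From your extension's project directory: install it into the local Maven repo (~/.m2)
mvn install

# 2) From the Druid install directory: copy it (and its transitive
#    dependencies) into Druid's extensions directory
java -classpath "lib/*" io.druid.cli.Main tools pull-deps \
  --no-default-hadoop \
  -c com.example:my-druid-extension:1.0
```

pull-deps resolves from the local repository as well as remote ones, so the locally installed artifact is picked up. Remember to add the extension's folder name to **druid.extensions.loadList** afterwards.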

Please feel free to leave any questions regarding the migration.

Thanks Bingkun!


As a side note, we still use the pull-deps tool to set up our deployment tarball (on druid-0.9.0), and it is working fine.

Our options to pull-deps include:

-clean --no-default-hadoop -r XXXXX -h XXXXX -c XXXX


with -r, -h, or -c specified multiple times.

We pull in both Spark and Hadoop with the -h options:

-h org.apache.hadoop:hadoop-client:2.4.0-mmx1 -h org.apache.spark:spark-core_2.10:1.5.2-mmx1

which gives us those artifacts under hadoop-dependencies in our deployment tarball.

The -c options include -c io.druid.extensions:druid-spark-batch_2.10:0.0.31, which gives us extensions/druid-spark-batch_2.10 in the tarball.

For the MiddleManager config we have druid.indexer.task.defaultHadoopCoordinates=["org.apache.spark:spark-core_2.10:1.5.2-mmx1"]

(Hadoop is deployed but not currently enabled.)

And druid.extensions.loadList includes druid-spark-batch_2.10
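Taken together, the relevant MiddleManager settings would look roughly like this (the -mmx1 version strings are from Charles's internal builds, and loadList may well include other extensions besides the one shown):

```properties
druid.extensions.loadList=["druid-spark-batch_2.10"]
druid.indexer.task.defaultHadoopCoordinates=["org.apache.spark:spark-core_2.10:1.5.2-mmx1"]
```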

So the extension stuff seems to be working as intended and is pretty extensible.

Using the pull-deps tool during the packaging of our deployments has helped simplify managing dependencies and extensions. The changes from the 0.9.0 extensions work also helped pinpoint a lot of library conflicts and misconfigured POMs.


Charles Allen

I think users do need to download the mysql extension from druid.io and put it manually inside the extensions directory. It's not included in the distribution anymore.

The release notes need to be updated; they falsely claim that the user does not have to download it.

– Himanshu

Updated the release notes.


The mysql extension link in the release notes seems to be broken.


Can you please fix it, or provide an alternate source to download the 0.9.0 version?


Hey Jagadeesh,

This link is now fixed.

That was quick! Thanks!