Writing data to Druid from Spark using Python

I am new to Druid and would appreciate help writing to Druid from Spark (specifically PySpark). I already have a small number of tables in Druid that I would like to populate with the results of PySpark SQL queries. Any pointers would be greatly appreciated.
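For context, here is a rough sketch of the approach I have been considering, since Druid has no built-in Spark writer that I know of: dump the query results from PySpark to newline-delimited JSON, then submit a native batch ingestion task to the Druid Overlord. All names here (the datasource, paths, columns, and Overlord URL) are hypothetical placeholders, and I am not sure this is the right approach.

```python
import json

# Step 1 (PySpark, shown as comments since it needs a running Spark session):
# df = spark.sql("SELECT __time, channel, added FROM staging_events")  # hypothetical table
# df.write.mode("overwrite").json("/tmp/druid-staging/events")

def build_ingestion_spec(datasource, base_dir, dimensions):
    """Build a minimal Druid native batch (index_parallel) ingestion spec
    that reads newline-delimited JSON files from a local directory."""
    return {
        "type": "index_parallel",
        "spec": {
            "dataSchema": {
                "dataSource": datasource,
                "timestampSpec": {"column": "__time", "format": "auto"},
                "dimensionsSpec": {"dimensions": dimensions},
                "granularitySpec": {
                    "segmentGranularity": "day",
                    "queryGranularity": "none",
                },
            },
            "ioConfig": {
                "type": "index_parallel",
                "inputSource": {
                    "type": "local",
                    "baseDir": base_dir,
                    "filter": "*.json",
                },
                "inputFormat": {"type": "json"},
            },
        },
    }

spec = build_ingestion_spec("events", "/tmp/druid-staging/events", ["channel"])
print(json.dumps(spec, indent=2))

# Step 2: POST the spec to the Overlord's task endpoint, e.g. with requests:
# requests.post("http://localhost:8081/druid/indexer/v1/task", json=spec)
```

Is this a sensible pattern, or is there a more direct way to push data from PySpark into existing Druid datasources?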