Using the Graphite emitter extension

Hi all,

I’m trying out the Graphite emitter extension in Druid to collect certain Druid metrics in Graphite during performance tests.

The intent is then to query these metrics through Graphite’s REST API in order to characterize the performance of the deployment.

However, the numbers returned by Graphite don’t make sense to me, so I wanted to check whether I’m interpreting the results correctly.


  • The Kafka indexing service is used to ingest data from Kafka into Druid.

  • I’ve enabled the Graphite emitter and provided a whitelist of metrics to collect.

  • Then I pushed 5000 events to the Kafka topic being indexed. Using Kafka-related tools, I confirmed that the messages are indeed stored in the Kafka logs.

  • Next, I retrieved the ingest.rows.output metric from Graphite using the following call:

    curl "http://<Graphite_IP>:/render/?target=druid.test.ingest.rows.output&format=csv"


  • Following are the results I got:

druid.test.ingest.rows.output,2017-02-22 01:11:00,0.0
druid.test.ingest.rows.output,2017-02-22 01:12:00,152.4
druid.test.ingest.rows.output,2017-02-22 01:13:00,97.0
druid.test.ingest.rows.output,2017-02-22 01:14:00,0.0
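
For reference, this is roughly how I’m consuming the CSV response (a minimal sketch: the function name is mine, and it only assumes the three-column metric,timestamp,value layout shown above — whether summing the values is even meaningful is part of my question):

```python
import csv
import io

# Sample CSV rows as returned by Graphite's /render endpoint with format=csv
# (copied from the results above; the value column may be empty for gaps).
sample = """druid.test.ingest.rows.output,2017-02-22 01:11:00,0.0
druid.test.ingest.rows.output,2017-02-22 01:12:00,152.4
druid.test.ingest.rows.output,2017-02-22 01:13:00,97.0
druid.test.ingest.rows.output,2017-02-22 01:14:00,0.0
"""

def parse_render_csv(text):
    """Parse Graphite CSV output into (metric, timestamp, value) tuples.

    Empty value cells (no datapoint for that interval) become None.
    """
    rows = []
    for metric, ts, value in csv.reader(io.StringIO(text)):
        rows.append((metric, ts, float(value) if value else None))
    return rows

rows = parse_render_csv(sample)
print(len(rows))  # 4 datapoints, one per one-minute interval
```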


I’m not sure how these numbers should be interpreted:


(i) What do the numbers 152.4 and 97.0 in the output indicate?

(ii) How can the ‘number of rows’ be a floating-point value like 152.4?

(iii) How do these numbers relate to the ‘5000’ messages I pushed to Kafka?

Thanks in advance,