When I'm querying, I can't obtain data from my metrics.

Hi and thank you in advance!

I've ingested data in CSV format, like this:

2015-07-20T00:00:00Z,"30","18","14","enab","item",25,4

Leaving the other fields aside, here is how I am specifying the indexing task:

  "parser": {
    "type": "string",
    "parseSpec": {
      "format": "csv",
      "timestampSpec": {
        "column": "timestamp"
      },
      "columns": ["timestamp","campID","adID","keyID","State","key"],
      "dimensionsSpec": {
        "dimensions": ["timestamp","campID","adID","keyID","State","key"]
      }
    }
  },

  "metricsSpec": [
    {
      "type": "count",
      "name": "count"
    },
    {
      "type": "longSum",
      "name": "impresions",
      "fieldName": "impresions"
    },
    {
      "type": "longSum",
      "name": "CPC",
      "fieldName": "CPC"
    }
  ],

And the query is a topN:

  {
    "queryType": "topN",
    "dataSource": "rocket",
    "granularity": "all",
    "dimension": "keyword",
    "metric": "count",
    "threshold": 1000,
    "aggregations": [
      {
        "type": "doubleSum",
        "name": "count",
        "fieldName": "count"
      },
      {
        "fieldName": "impresions",
        "name": "impresions_longMin",
        "type": "longMin"
      },
      {
        "type": "longMin",
        "name": "CPC",
        "fieldName": "CPC"
      }
    ],
    "intervals": ["2012-10-01T00:00/2020-01-01T00"]
  }

I obtain the count value properly, but the other metrics appear as 0. For example:

  "result": [ {
    "CPC": 0,
    "count": 8.0,
    "impresions_longMin": 0,
    "keyword": "apartamentos kensington"
  },

If someone could help me, it would be much appreciated!

Juan!

In your indexing, you have identified the columns:

"timestamp","campID","adID","keyID","State","key"

But your aggregators are looking for

"fieldName": "impresions"
"fieldName": "CPC"

I.e., Druid is trying to aggregate the "impresions" and "CPC" columns
out of the data. But you apparently haven't declared an impresions or a CPC
column, so it is just aggregating 0s.
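To illustrate what Eric describes, here is a sketch in plain Python (not Druid's actual parser): pairing the declared column names against the CSV row shows how the two trailing metric values never receive a name, so lookups for them come up empty.

```python
# Sketch (plain Python, not Druid's parser) of why the aggregators see 0s:
# the declared "columns" list is shorter than the CSV row, so the trailing
# metric values never receive a name and lookups for them find nothing.
row = "2015-07-20T00:00:00Z,30,18,14,enab,item,25,4".split(",")
declared = ["timestamp", "campID", "adID", "keyID", "State", "key"]

# zip() stops at the shorter sequence, dropping the trailing 25 and 4
parsed = dict(zip(declared, row))

print(parsed.get("impresions"))  # None -> summed as 0
print(parsed.get("CPC"))         # None -> summed as 0
```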

--Eric

Also, timestamp likely shouldn't be a dimension.

Hi Eric!

Thank you very much for your advice.
I see what you mean about not identifying the columns (impresions and CPC) in my indexing. However, I followed the documentation (http://druid.io/docs/latest/ingestion/data-formats.html), and there the metric columns (57, 200, -143) are not added to the indexing either:

CSV line used for ingestion in the documentation:

2013-08-31T01:02:33Z,"Gypsy Danger","en","nuclear","true","true","false","false","article","North America","United States","Bay Area","San Francisco",57,200,-143

parseSpec used in the example, with no metric columns (added, deleted, delta) included:

"parseSpec":{
    "format" : "csv",
    "timestampSpec" : {
      "column" : "timestamp"
    },
    "columns" : ["timestamp","page","language","user","unpatrolled","newPage","robot","anonymous","namespace","continent","country","region","city"],
    "dimensionsSpec" : {
      "dimensions" : ["page","language","user","unpatrolled","newPage","robot","anonymous","namespace","continent","country","region","city"]
    }
  }
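For comparison — and as the reply below notes, "columns" must name every field — a complete columns list for that documentation dataset would also name the trailing metric values. Assuming those are the Wikipedia example's added, deleted, and delta, it would read:

```json
"columns": ["timestamp","page","language","user","unpatrolled","newPage","robot","anonymous","namespace","continent","country","region","city","added","deleted","delta"]
```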

On the other hand, I've now seen my error of treating timestamp as a dimension.

Thanks again!

Juan

Hi Juan, your "columns" field should include every column in your data set, so Druid can parse the fields and determine what is a timestamp, dimension, or metric.
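Putting both corrections together — every CSV field named in "columns", and timestamp dropped from the dimensions — the fixed parseSpec would look roughly like this (assuming the trailing 25 and 4 in the sample row are impresions and CPC):

```json
"parseSpec": {
  "format": "csv",
  "timestampSpec": {
    "column": "timestamp"
  },
  "columns": ["timestamp","campID","adID","keyID","State","key","impresions","CPC"],
  "dimensionsSpec": {
    "dimensions": ["campID","adID","keyID","State","key"]
  }
}
```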

Yeah!!! It's working perfectly, Fangjin! Thank you!!