metrics and dimensions supported

I was wondering how many metrics and dimensions Druid supports.
I ask because I am getting this error:

java.lang.Exception: java.lang.ArrayIndexOutOfBoundsException: 81

This happens although I only picked 2 records in my file for batch ingestion, and the partition size is 500000. Still . . .
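(For reference, that partition size comes from the tuningConfig; a rough sketch, assuming the hashed partitioner and otherwise default settings:)

```json
{
  "tuningConfig": {
    "type": "hadoop",
    "partitionsSpec": {
      "type": "hashed",
      "targetPartitionSize": 500000
    }
  }
}
```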

Is there any specific reason for this? I see the map phase completing, but what is it in the reduce phase of ingestion that fails?

Thanks,
Sumit

Well, it looks like something to do with hyperUnique; as soon as I commented those out, it succeeded.

I wonder if you had multiple aggregators with duplicate names, which could cause this?
FWIW, it should not be an issue with the hyperUniques themselves, but rather something wrong with the ingestion spec.
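For example, a conflicting spec would have two aggregators sharing one output "name"; a hypothetical fragment (field names made up for illustration):

```json
"metricsSpec": [
  { "type": "count", "name": "count" },
  { "type": "hyperUnique", "name": "unique_users", "fieldName": "user_id" },
  { "type": "hyperUnique", "name": "unique_users", "fieldName": "user_email" }
]
```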

Nishant, the ingestion spec looks OK as far as naming conflicts are concerned. I shall look at it again.

Do you have a full stack trace for that error? It might be available in the Hadoop map/reduce logs.

There it is: http://pastebin.com/S5sye10j

Thanks Sumit. Could you please also attach the ingestion spec for this job?

Here it is: http://pastebin.com/huz0DuLc (I have commented out the path name, though) :slight_smile: Thanks.

Hey gian,

Could you look at the ingestion spec? I think one of the hyperUnique metrics, say buyer_name, was duplicated. Let me try removing the duplicate one. Thanks!

Hi Sumit,

I think you found the issue: having two aggregators with the same "name" will cause that ArrayIndexOutOfBoundsException when persisting a segment. I just replicated that behavior locally.
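For reference, the fix is just to give every aggregator a unique "name"; a hypothetical corrected fragment (the field names here are only for illustration):

```json
"metricsSpec": [
  { "type": "hyperUnique", "name": "unique_buyers", "fieldName": "buyer_name" },
  { "type": "hyperUnique", "name": "unique_buyer_emails", "fieldName": "buyer_email" }
]
```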

Thanks,

Jon

Yes, thanks a lot, guys. But still, I believe file/batch ingestion on a single-node cluster shouldn't handle much more than, say, 10 GB of data (though that might be a different topic altogether).