Storing and analyzing JSON with 15k+ keys

Hello all,

I am looking at Druid as a way to store JSON content that has a large number of keys, on the order of 15-20k per document. Each JSON document will be < 5 MB in size, and total data size will grow by around 50 GB per month.
Has anyone used Druid for this kind of content?
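To illustrate the shape of the data being described, here is a minimal sketch (not from the thread; the helper and the sample keys are hypothetical) of flattening a nested JSON document into (path, value) pairs, which is one common way to reason about how many distinct keys a document actually contains:

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested JSON into (dotted_path, value) pairs."""
    rows = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            rows.extend(flatten(v, f"{prefix}.{k}" if prefix else k))
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            rows.extend(flatten(v, f"{prefix}[{i}]"))
    else:
        rows.append((prefix, obj))
    return rows

# Hypothetical document; real documents would have 15-20k such paths.
doc = json.loads('{"user": {"id": 7, "tags": ["a", "b"]}, "score": 0.9}')
print(flatten(doc))
# → [('user.id', 7), ('user.tags[0]', 'a'), ('user.tags[1]', 'b'), ('score', 0.9)]
```

With 15-20k leaf paths per document, the key question for any store is whether those paths become columns (a very wide schema) or rows in a narrow key-value layout.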


Hi, what kind of workflows and queries are you trying to run against the data? If you haven’t done so already, it may make sense to read the Druid documentation to get a sense of what Druid is good for.