We’ve been trying to POST some large queries to Druid’s brokers, but beyond a certain POST size they start to fail with the following logs:
ERROR [qtp1387511555-53] com.sun.jersey.spi.container.ContainerResponse - The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._parseName(UTF8StreamJsonParser.java:1527) ~[jackson-core-2.4.6.jar:2.4.6]
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:693) ~[jackson-core-2.4.6.jar:2.4.6]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:111) ~[jackso
Is it possible to increase the POST size that Druid can receive?
Why are you POSTing such a big query?
Because we are trying to get some data out of Druid (for data-science purposes). To do that we build a big filter on user ID and then group by one dimension, which is very fast for us, and the more users we add to that filter, the faster the extraction goes. But now we’ve run into a limit on the query POST size.
Is there any way to change that?
That’s a StackOverflowError, meaning too many calls on the Java call stack. How are you constructing the filter? If you’re doing something like OR(user=a, OR(user=b, OR(user=c, user=d))), then after a few hundred or a few thousand users you might run into something like this due to the deep nesting. If that’s what your filter looks like now, try rewriting it as OR(user=a, user=b, user=c, user=d), or even better, IN(user, [a, b, c, d]).
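To make the difference concrete, here is a small sketch in Python of the three filter shapes above as Druid native-query JSON. The dimension name "user" and the ID values are made up for illustration; the point is that the nested form grows one level deeper per user, while the flat "or" and the "in" filter stay at constant depth no matter how many IDs you add.

```python
import json

user_ids = ["a", "b", "c", "d"]  # hypothetical user IDs

def nested_or(ids):
    """Deeply nested OR: each extra user adds a nesting level, which the
    broker's JSON deserializer walks recursively -- the likely cause of
    the StackOverflowError with many thousands of users."""
    if len(ids) == 1:
        return {"type": "selector", "dimension": "user", "value": ids[0]}
    return {
        "type": "or",
        "fields": [
            {"type": "selector", "dimension": "user", "value": ids[0]},
            nested_or(ids[1:]),
        ],
    }

# Flat OR: a single "or" node with one selector per user -- constant depth.
flat_or = {
    "type": "or",
    "fields": [
        {"type": "selector", "dimension": "user", "value": uid}
        for uid in user_ids
    ],
}

# "in" filter: the most compact form, a single object holding all values.
in_filter = {"type": "in", "dimension": "user", "values": user_ids}

print(json.dumps(in_filter))
```

The "in" filter is also the smallest on the wire, so besides avoiding the stack overflow it keeps the POST body itself shorter.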
Yes! That’s exactly the problem. Thank you very much, you’ve always been very helpful!
Great to hear that helped!