Plyql giving error

Hi all,

Querying a set of dimensions via plyql is giving me the following error:

buffer.js:382
throw new Error('toString failed');
^

Error: toString failed
    at Buffer.toString (buffer.js:382:11)
    at Request. (/usr/local/lib/node_modules/plyql/node_modules/plywood-druid-requester/node_modules/request/request.js:1055:39)
    at emitOne (events.js:77:13)

Omitting one or two dimensions makes the query work. Can someone please help me here?

Regards

Rajitha

Hey Rajitha,

What’s the query you’re trying to run, and which 1 or 2 dimensions make the query work when omitted? Could you also try running plyql with -v?

Hi Gian,

I used the -v option to get the actual query plyql fires at Druid, posted it to the Druid broker using curl, and received the correct response.

However, plyql isn't able to deliver the same output and returns the error below. Does plyql have any limitation on the number of rows it can return? This would hinder our efforts to use plyql for querying Druid data. Please help.

buffer.js:528
throw new Error('"toString()" failed');
^

Error: "toString()" failed
    at Buffer.toString (buffer.js:528:11)
    at Request. (/home/rajitha.r/local/node/lib/node_modules/plyql/node_modules/request/request.js:1055:39)
    at emitOne (events.js:96:13)
    at Request.emit (events.js:188:7)
    at IncomingMessage. (/home/rajitha.r/local/node/lib/node_modules/plyql/node_modules/request/request.js:1001:12)
    at IncomingMessage.g (events.js:286:16)
    at emitNone (events.js:91:20)
    at IncomingMessage.emit (events.js:185:7)
    at endReadableNT (_stream_readable.js:926:12)
    at _combinedTickCallback (internal/process/next_tick.js:74:11)

This issue seems to be due to a Node.js limitation, as mentioned here:

Any Druid result JSON larger than roughly 250 MB can't be returned by plyql in that case.

plyql currently buffers up all the results from Druid. I know the team is working on streaming results back from Druid to be able to handle larger result sets, so you may find that the next version of plyql works better in this regard.

Fwiw, we're also working on a built-in SQL layer in Druid, which is also fully streaming and doesn't require any external programs. If you build from master you can try it out now, or you can wait for 0.10.0-rc1, which should be coming soon.

That sounds great, Gian. Will explore both options.

Thanks a lot!