Load UI 'hangs' / takes forever on large files?

We are running the Imply 3.0.13 build, which ships with:

Druid 0.15.0-incubating-iap9

ZooKeeper 3.4.13

Right now we have 1 master node, 1 query node, and 1 data node.

I can load a file with 200 lines with no issues using the Load Data UI.

When I attempt to load a file with 500k lines, the UI hangs until I give up…or sometimes until a 504 appears in the UI.

I have seen the 504 on the parse date step, and I have seen the 'forever hang' on the parse data step.

I will continue to poke around over the weekend…

My question is:

When the UI is running these steps, where do they actually run?

Does it run on the broker, or for these steps does it actually send requests down to the MiddleManager etc. with a sample from the file?

Hi Daniel,

Where are you loading the file from? Could you paste in the body of the payload sent to the sampler route? (You can get it from the developer console of your browser.)
All sampler requests are actually handled by the coordinator process that runs on the master server.
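If it helps with reproducing this outside the browser, here is a rough sketch of what such a sampler request body might look like. Everything here is an assumption for illustration: the `SAMPLER_URL` host and route, the firehose/parser spec shape, and the `samplerConfig` fields all vary by Druid version, so the payload captured from your developer console is the authoritative reference.

```python
import json

# Assumed host and route; check your dev console for the real one.
SAMPLER_URL = "http://master-node:8081/druid/indexer/v1/sampler"

def build_sampler_payload(file_url, num_rows=500):
    """Hypothetical sampler body for a newline-delimited JSON file served over HTTP."""
    return {
        "type": "index",
        "spec": {
            "ioConfig": {
                "type": "index",
                # HTTP firehose pointing at the file served from the query node
                "firehose": {"type": "http", "uris": [file_url]},
            },
            "dataSchema": {
                "dataSource": "sample",
                "parser": {"type": "string", "parseSpec": {"format": "json"}},
            },
        },
        "samplerConfig": {"numRows": num_rows, "timeoutMs": 15000},
    }

print(json.dumps(build_sampler_payload("http://query-node:8000/test.json"), indent=2))
```

Replaying the captured payload with curl against the coordinator (outside the UI) would at least tell you whether the hang is in the sampler itself or in the browser round trip.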

I don’t have access to it right now.

But I have actually spun up a Python HTTP server on the query node (which is where we dropped the test file). The file is a JSON file; each line is a 'record'.
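For anyone reproducing this, the ad-hoc server was essentially `python -m http.server`; the function below is an equivalent stand-in that serves a directory over HTTP so the master node can sample the file. The bind address and `port=0` (let the OS pick a free port) are illustrative choices, not what we ran.

```python
import functools
import http.server
import threading

def serve_directory(directory, port=0):
    """Serve `directory` over HTTP in a background thread; returns (server, port)."""
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory
    )
    server = http.server.ThreadingHTTPServer(("0.0.0.0", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    # port=0 means the OS assigned one; report the actual port back
    return server, server.server_address[1]
```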

We did this with Imply 3.0.3 (and the same test file) on a different cluster without issue.

Good to know the coordinator does all the sampling.

One other thing of note I should have mentioned: usually if I kill the HTTP server, the sampler (i.e. the UI) will immediately update with the content, allowing us to proceed to the next step. I had originally suspected a really slow network, but when I did a manual wget, the file transfer was instant.
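That manual wget check can be scripted so it's repeatable from the master node. This is just a sketch of the same idea: time the raw download of the test file (the URL would be whatever your ad-hoc server exposes) to separate transfer time from sampler time.

```python
import time
import urllib.request

def time_fetch(url):
    """Download `url` fully and return (bytes_received, elapsed_seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        size = len(resp.read())
    return size, time.monotonic() - start
```

If this comes back in well under a second for the 500k-line file, the hang is almost certainly in the sampling step itself rather than in the file transfer.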