Error 500 java.lang.NullPointerException

curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/dakh-index.json localhost:8090/druid/indexer/v1/task

Warning: Couldn't read data from file "quickstart/dakh-index.json", this makes

Warning: an empty POST.

Error 500

HTTP ERROR: 500

Problem accessing /druid/indexer/v1/task. Reason:

    java.lang.NullPointerException: task

Powered by Jetty://

A line in my JSON looks like this:

{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}

Kindly help

dakh-index-task.json (1.25 KB)

Hi Tausif,

The "Couldn't read data from file 'quickstart/dakh-index.json'" message is a curl error. Based on the file you attached, you probably want 'dakh-index-task.json'.
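
In other words, point curl at the file that actually exists. Assuming the attached spec sits under quickstart/, something like this should get past that warning:

curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/dakh-index-task.json localhost:8090/druid/indexer/v1/task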

Hey, thanks for that.
Now I have a new error.

Error 500

HTTP ERROR: 500

Problem accessing /druid/indexer/v1/task. Reason:

    javax.servlet.ServletException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('"' (code 34)): was expecting comma to separate ARRAY entries

at [Source: HttpInputOverHTTP@5c8a7de0; line: 1, column: 768]

Powered by Jetty://

Solved. It was a JSON error.
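
For anyone else hitting "was expecting comma to separate ARRAY entries": it usually means two entries of a JSON array are not separated by a comma. A hypothetical illustration (the exact field in my spec may have been different):

"dimensions": ["HTTP_USER_AGENT" "PORTFOLIO_ID", "NAME"]   <- missing comma, rejected by Jackson
"dimensions": ["HTTP_USER_AGENT", "PORTFOLIO_ID", "NAME"]  <- valid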

curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/dakh-index-task.json localhost:8090/druid/indexer/v1/task

printed {"task":"index_hadoop_dakhevents_2016-07-11T18:12:41.846Z"}

but when I checked http://localhost:8090/console.html

the task status was FAILED.

What could be the reason?
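
(For reference, the task payload and full log below were pulled from the Overlord. Requests along these lines should return them, substituting the actual task id printed by the POST; endpoint paths may differ slightly across Druid versions:)

curl localhost:8090/druid/indexer/v1/task/index_hadoop_dakhevents_2016-07-11T18:12:41.846Z/status
curl localhost:8090/druid/indexer/v1/task/index_hadoop_dakhevents_2016-07-11T18:12:41.846Z/log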


This is my task payload:

{"task":"index_hadoop_dakhevents_2016-07-11T18:23:35.696Z","payload":{"id":"index_hadoop_dakhevents_2016-07-11T18:23:35.696Z","spec":{"dataSchema":{"dataSource":"dakhevents","parser":{"type":"string","parseSpec":{"format":"json","dimensionsSpec":{"dimensions":["HTTP_USER_AGENT","PORTFOLIO_ID","NAME","POPUP_ID","REMOTE_ADDR","COUNTRY","FRAMEWORK_ID","DOMAIN_NAME","TEMPLATE_ID","TOKEN","BUCKET_ID","EMAIL"]},"timestampSpec":{"format":"auto","column":"CREATED_AT"}}},"metricsSpec":[{"type":"count","name":"count"},{"type":"hyperUnique","name":"email_unique","fieldName":"EMAIL"}],"granularitySpec":{"type":"uniform","segmentGranularity":"DAY","queryGranularity":{"type":"none"},"intervals":["2016-01-01T00:00:00.000Z/2016-01-01T00:00:00.000Z"]}},"ioConfig":{"type":"hadoop","inputSpec":{"type":"static","paths":"quickstart/dakh_events.json"},"metadataUpdateSpec":null,"segmentOutputPath":null},"tuningConfig":{"type":"hadoop","workingPath":null,"version":"2016-07-11T18:23:35.695Z","partitionsSpec":{"type":"hashed","targetPartitionSize":5000000,"maxPartitionSize":7500000,"assumeGrouped":false,"numShards":-1,"partitionDimensions":[]},"shardSpecs":{},"indexSpec":{"bitmap":{"type":"concise"},"dimensionCompression":null,"metricCompression":null},"maxRowsInMemory":75000,"leaveIntermediate":false,"cleanupOnFailure":true,"overwriteFiles":false,"ignoreInvalidRows":false,"jobProperties":{},"combineText":false,"useCombiner":false,"buildV9Directly":false,"numBackgroundPersistThreads":0},"uniqueId":"998a1a38917146e880f294b6bec72986"},"hadoopDependencyCoordinates":null,"classpathPrefix":null,"context":null,"groupId":"index_hadoop_dakhevents_2016-07-11T18:23:35.696Z","dataSource":"dakhevents","resource":{"availabilityGroup":"index_hadoop_dakhevents_2016-07-11T18:23:35.696Z","requiredCapacity":1}}}

Full log:

2016-07-11T18:36:50,708 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Starting flush of map output
2016-07-11T18:36:50,719 INFO [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2016-07-11T18:36:50,720 WARN [Thread-21] org.apache.hadoop.mapred.LocalJobRunner - job_local704288055_0001
java.lang.Exception: com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522) [hadoop-mapreduce-client-common-2.3.0.jar:?]
Caused by: com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]
	at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:88) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:283) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_91]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_91]
	at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_91]
Caused by: com.metamx.common.parsers.ParseException: Unparseable timestamp found!
	at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:72) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.data.input.impl.StringInputRowParser.parseMap(StringInputRowParser.java:136) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:98) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:69) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:283) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_91]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_91]
	at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_91]
Caused by: java.lang.IllegalArgumentException: Invalid format: "2016-06-07 08:34:33" is malformed at " 08:34:33"

Solved again, and the index was added. The issue was that the timestamp format was incorrect.
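
For reference, one way to handle a timestamp like "2016-06-07 08:34:33" (no "T", no timezone) is to replace "auto" in the timestampSpec with an explicit Joda-style pattern, roughly like this (a sketch; alternatively the data can be rewritten with ISO-8601 timestamps, which "auto" understands):

"timestampSpec": {
  "column": "CREATED_AT",
  "format": "yyyy-MM-dd HH:mm:ss"
}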

Can you recommend a good book, tutorial, or website to learn and understand JSON queries to Druid?
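
For a flavour of what such a query looks like, here is a minimal (hypothetical) native timeseries query against the datasource from this thread, POSTed to the Broker (typically http://localhost:8082/druid/v2/ in the quickstart setup); the official Druid documentation covers the full set of query types:

{
  "queryType": "timeseries",
  "dataSource": "dakhevents",
  "granularity": "day",
  "intervals": ["2016-06-01/2016-07-01"],
  "aggregations": [
    { "type": "count", "name": "rows" },
    { "type": "hyperUnique", "name": "unique_emails", "fieldName": "email_unique" }
  ]
}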