Ingestion fails: com.metamx.common.parsers.ParseException: Unable to parse row []

I'm trying to ingest JSON-formatted data and I'm running into a parsing error.



The data being ingested looks like this:

{"timestamp":"2016-11-01T12:36:00-04:00","word":"practical","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:00-04:00","word":"devour","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:00-04:00","word":"synonymous","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:00-04:00","word":"cognitive","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"physician","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"arrive","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"culture","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"Latino","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"gag law","is_mwu":"yes","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"Celsius","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"maniacal","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"droned","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"enquire","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"sedulously","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"sue for divorce","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"aquaculture","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:01-04:00","word":"percentage","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:02-04:00","word":"international relations","is_mwu":"no","host":"ws1","ref":"dictionary"}
{"timestamp":"2016-11-01T12:36:02-04:00","word":"retaliate","is_mwu":"no","host":"ws1","ref":"dictionary"}



Exception:

Nov 01, 2016 5:14:05 PM org.hibernate.validator.internal.util.Version <clinit>
INFO: HV000001: Hibernate Validator 5.1.3.Final
2016-11-01T17:14:10,363 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider as a provider class
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.initialization.jetty.CustomExceptionMapper as a provider class
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.StatusResource as a root resource class
Nov 01, 2016 5:14:15 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.query.lookup.LookupListeningResource to GuiceInstantiatedComponentProvider
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.query.lookup.LookupIntrospectionResource to GuiceInstantiatedComponentProvider
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.http.security.StateResourceFilter to GuiceInstantiatedComponentProvider
Nov 01, 2016 5:14:15 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2016-11-01T17:14:17,212 WARN [task-runner-0-priority-0] org.jets3t.service.impl.rest.httpclient.RestStorageService - Retrying request with "AWS4-HMAC-SHA256" signing mechanism: GET https://word-lookup.s3.amazonaws.com:443/ws1/2016/11/12-53-05_l10001.json.gz HTTP/1.1
2016-11-01T17:14:17,213 WARN [task-runner-0-priority-0] org.jets3t.service.impl.rest.httpclient.RestStorageService - Retrying request following error response: GET '/ws1/2016/11/12-53-05_l10001.json.gz' -- ResponseCode: 400, ResponseStatus: Bad Request, Request Headers: [Date: Tue, 01 Nov 2016 17:14:16 GMT, Authorization: AWS AKIAJP7DZIZHGBDNPWLA:bKVn1L6bxP+AOLs0uIPLUGO1u+E=], Response Headers: [x-amz-request-id: B3BFE77BAAFC0F33, x-amz-id-2: pXmiiMbbB5XgWl3nItIgQIYWhDjeCJkd4UwyW/JW99RJfZqjDXzFY2V81HN79053FLzdlMVNg+4=, x-amz-region: us-east-2, Content-Type: application/xml, Transfer-Encoding: chunked, Date: Tue, 01 Nov 2016 17:14:17 GMT, Connection: close, Server: AmazonS3]
2016-11-01T17:14:17,484 WARN [task-runner-0-priority-0] org.jets3t.service.impl.rest.httpclient.RestStorageService - Retrying request after automatic adjustment of Host endpoint from "word-lookup.s3.amazonaws.com" to "word-lookup.s3-us-east-2.amazonaws.com" following request signing error using AWS request signing version 4: GET https://word-lookup.s3-us-east-2.amazonaws.com:443/ws1/2016/11/12-53-05_l10001.json.gz HTTP/1.1
2016-11-01T17:14:17,484 WARN [task-runner-0-priority-0] org.jets3t.service.impl.rest.httpclient.RestStorageService - Retrying request following error response: GET '/ws1/2016/11/12-53-05_l10001.json.gz' -- ResponseCode: 400, ResponseStatus: Bad Request, Request Headers: [Date: Tue, 01 Nov 2016 17:14:17 GMT, x-amz-content-sha256: e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855, Host: word-lookup.s3.amazonaws.com, x-amz-date: 20161101T171417Z, Authorization: AWS4-HMAC-SHA256 Credential=AKIAJP7DZIZHGBDNPWLA/20161101/us-east-1/s3/aws4_request,SignedHeaders=date;host;x-amz-content-sha256;x-amz-date,Signature=1e74c3de4808b4b85e70b0b50def28d235d8577ea8566f3f8ab06175776aba9e], Response Headers: [x-amz-request-id: 2F5504538FD35E0E, x-amz-id-2: WzMOFl0zXS5KHkuqFJocxDFk/RyjWM/bBhmCqgE7a8HdTzo70iJ0O+OZA5EHbWnhycvJD99I6u4=, Content-Type: application/xml, Transfer-Encoding: chunked, Date: Tue, 01 Nov 2016 17:14:16 GMT, Connection: close, Server: AmazonS3]
2016-11-01T17:14:17,888 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[IndexTask{id=index_lookups_2016-11-01T17:14:02.749Z, type=index, dataSource=lookups}]
com.metamx.common.parsers.ParseException: Unable to parse row []
	at com.metamx.common.parsers.JSONPathParser.parse(JSONPathParser.java:127) ~[java-util-0.27.9.jar:?]
	at io.druid.data.input.impl.StringInputRowParser.parseString(StringInputRowParser.java:126) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.data.input.impl.FileIteratingFirehose.nextRow(FileIteratingFirehose.java:72) ~[druid-api-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexing.common.task.IndexTask.getDataIntervals(IndexTask.java:244) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexing.common.task.IndexTask.run(IndexTask.java:200) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.9.1.1.jar:0.9.1.1]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_40]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_40]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_40]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_40]
Caused by: com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input
 at [Source: ; line: 1, column: 1]
	at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148) ~[jackson-databind-2.4.6.jar:2.4.6]
	at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3110) ~[jackson-databind-2.4.6.jar:2.4.6]
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3051) ~[jackson-databind-2.4.6.jar:2.4.6]
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2168) ~[jackson-databind-2.4.6.jar:2.4.6]
	at com.metamx.common.parsers.JSONPathParser.parse(JSONPathParser.java:99) ~[java-util-0.27.9.jar:?]
	... 11 more



Any advice?

Not sure, but it looks like you have an empty row in your input.

@Slim

Ahh darn it - you're right. I was testing and the log output only mentioned one file, but in fact a second file had an empty line. It works now.

Thanks!