org.apache.druid.server.lookup.namespace.cache.CacheScheduler$EntryImpl@65a318e6: CACHE_NOT_INITIALIZED

I tried to look up a value using the global lookup cache.

My query is:

{
  "queryType": "groupBy",
  "dataSource": {
    "type": "table",
    "name": "test_v1.0"
  },
  "granularity": {
    "type": "period",
    "period": "PT1H",
    "timeZone": "Asia/Singapore",
    "origin": "2019-06-04T00:00:00.000+08:00"
  },
  "dimensions": [{
    "type": "extraction",
    "dimension": "a2",
    "outputName": "a2_n1",
    "extractionFn": {
      "type": "registeredLookup",
      "lookup": "keyvalue"
    }
  }, "sa3"],
  "filter": {
    "dimension": "a2",
    "value": "104021087",
    "type": "selector"
  },
  "aggregations": [{
    "type": "longSum",
    "name": "longSum_total_stays",
    "fieldName": "total_stays"
  }, {
    "type": "hyperUnique",
    "name": "hyperUnique_unique_agents",
    "fieldName": "unique_agents"
  }, {
    "type": "longSum",
    "name": "longSum_sum_stay_duration",
    "fieldName": "sum_stay_duration"
  }],
  "intervals": ["2019-06-04T00:00:00.000+08:00/2019-06-04T23:59:59.999+08:00"]
}

I am getting this error:

{
  "error" : "Unknown exception",
  "errorMessage" : "namespace [UriExtractionNamespace{uri=file:/opt/druid/lookupcache/keyvalue1.json, uriPrefix=null, namespaceParseSpec=ObjectMapperFlatDataParser{}, fileRegex='null', pollPeriod=PT30S}] : org.apache.druid.server.lookup.namespace.cache.CacheScheduler$EntryImpl@65a318e6: CACHE_NOT_INITIALIZED, extractorID = namespace-factory-UriExtractionNamespace{uri=file:/opt/druid/lookupcache/keyvalue1.json, uriPrefix=null, namespaceParseSpec=ObjectMapperFlatDataParser{}, fileRegex='null', pollPeriod=PT30S}-fba1ff7f-8e31-441b-a959-2f9cca086cf2",
  "errorClass" : "org.apache.druid.java.util.common.ISE",
  "host" : null
}

Even after loading the keyvalue1.json file onto the historical and broker nodes, the log is showing this error:

ERROR [NamespaceExtractionCacheManager-1] org.apache.druid.server.lookup.namespace.cache.CacheScheduler - Failed to update namespace [UriExtractionNamespace{uri=file:/opt/druid/cache/keyvalue1.json, uriPrefix=null, namespaceParseSpec=ObjectMapperFlatDataParser{}, fileRegex='null', pollPeriod=PT30S}] : org.apache.druid.server.lookup.namespace.cache.CacheScheduler$EntryImpl@13463c35

java.io.FileNotFoundException: /opt/druid/lookupcache/keyvalue1.json (No such file or directory)

at java.io.FileInputStream.open0(Native Method) ~[?:1.8.0_121]

at java.io.FileInputStream.open(FileInputStream.java:195) ~[?:1.8.0_121]

at java.io.FileInputStream.&lt;init&gt;(FileInputStream.java:138) ~[?:1.8.0_121]

at org.apache.druid.segment.loading.LocalDataSegmentPuller$1.openInputStream(LocalDataSegmentPuller.java:76) ~[druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.apache.druid.segment.loading.LocalDataSegmentPuller.getInputStream(LocalDataSegmentPuller.java:203) ~[druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.apache.druid.server.lookup.namespace.UriCacheGenerator$1.openStream(UriCacheGenerator.java:140) ~[?:?]

curl -X GET http://localhost:8081/druid/coordinator/v1/lookups/config/__default/keyvalue

{"version":"v1","lookupExtractorFactory":{"type":"cachedNamespace","extractionNamespace":{"type":"uri","uri":"file:/opt/druid/lookupcache/keyvalue1.json","namespaceParseSpec":{"format":"simpleJson"},"pollPeriod":"PT30S"},"firstCacheTimeout":0}}

Hi
Is this file content format correct? I am uploading it to the cache from the coordinator:

{
  "__default": {
    "keyvalue": {
      "version": "v1",
      "lookupExtractorFactory": {
        "type": "cachedNamespace",
        "extractionNamespace": {
          "type": "uri",
          "uri": "file:/opt/druid/lookupcache/keyvalue1.json",
          "namespaceParseSpec": {
            "format": "simpleJson"
          },
          "pollPeriod": "PT30S"
        },
        "firstCacheTimeout": 0
      }
    }
  }
}

I loaded it into the cache with the command below:

curl -H "Content-Type: application/json" --data @lookup_script.json http://localhost:8081/druid/coordinator/v1/lookups/config
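The payload POSTed above is a tiered map (tier → lookup name → versioned spec). Before uploading, that shape can be sanity-checked locally; here is a minimal sketch (the tier `__default` and lookup name `keyvalue` are from the config above, and `check_lookup_config` is a hypothetical helper, not a Druid API):

```python
import json

def check_lookup_config(cfg):
    """Return a list of problems found in a tiered lookup-config payload.

    Expected shape: {tier: {lookupName: {"version": ..., "lookupExtractorFactory": {...}}}}
    """
    problems = []
    if not isinstance(cfg, dict) or not cfg:
        return ["top level must be a non-empty object of tiers"]
    for tier, lookups in cfg.items():
        if not isinstance(lookups, dict):
            problems.append(f"tier {tier!r} must map lookup names to specs")
            continue
        for name, spec in lookups.items():
            for key in ("version", "lookupExtractorFactory"):
                if not isinstance(spec, dict) or key not in spec:
                    problems.append(f"lookup {tier!r}/{name!r} is missing {key!r}")
    return problems

config = json.loads("""
{
  "__default": {
    "keyvalue": {
      "version": "v1",
      "lookupExtractorFactory": {
        "type": "cachedNamespace",
        "extractionNamespace": {
          "type": "uri",
          "uri": "file:/opt/druid/lookupcache/keyvalue1.json",
          "namespaceParseSpec": {"format": "simpleJson"},
          "pollPeriod": "PT30S"
        },
        "firstCacheTimeout": 0
      }
    }
  }
}
""")
print(check_lookup_config(config))  # an empty list means the shape looks right
```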

It seems like it's not able to find the file, based on the exception below:

java.io.FileNotFoundException: /opt/druid/lookupcache/keyvalue1.json (No such file or directory)

Can you check whether that file is present in the mentioned folder? If it is there, can you change your uri in the config to the below:

"uri": "file:///opt/druid/lookupcache/keyvalue1.json"

And please make sure the file is present on the machine from which you are uploading the cache.

Also, all the query-serving components (Peon, Router, Broker, and Historical) have the ability to consume lookup configuration. So you have to make sure the json file containing the lookups (your keyvalue1.json) is present on the nodes where these processes are running.
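Since a `file:` URI is read from each node's local disk, a quick check on any given node is to resolve the configured URI and test whether the file exists there. A small sketch (`local_uri_exists` is my own helper name, not a Druid API; note that both the `file:/...` and `file:///...` spellings resolve to the same local path):

```python
import os
from urllib.parse import urlparse

def local_uri_exists(uri):
    """Return True if a file: URI points at an existing local file."""
    parsed = urlparse(uri)
    if parsed.scheme != "file":
        raise ValueError(f"not a file: URI: {uri}")
    return os.path.isfile(parsed.path)

# Run this on every node that serves queries; both spellings parse to the
# same path, so the lookup works once the file is actually on that machine.
print(local_uri_exists("file:/opt/druid/lookupcache/keyvalue1.json"))
print(local_uri_exists("file:///opt/druid/lookupcache/keyvalue1.json"))
```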

Hi
Thanks for the reply. You mean I should keep keyvalue1.json on all the service nodes?

Thanks

Prabakaran krishnan

Yes (not all of them, only the broker and historical nodes). This is because you are using a file uri, i.e. the local file system; if you were using a jdbc, s3 or http uri it would not be needed, obviously.

Hi Muthu
I placed the keyvalue1.json file on the historical node and the broker node as well.

In my setup, the coordinator, middlemanager and overlord are running on the same node. The keyvalue1.json file already exists there.

Now I executed the below on the broker:

curl -X POST -H "Content-Type: application/json" -d @query-testspec2-lookup.json http://localhost:8082/druid/v2/?pretty

It gives output, but the extraction-function dimension is not displayed at all.

Sample output:

{
  "version" : "v1",
  "timestamp" : "2019-06-04T20:00:00.000+10:00",
  "event" : {
    "longSum_sum_stay_duration" : 960,
    "hyperUnique_unique_agents" : 46.524469830108174,
    "longSum_total_stays" : 48
  }
}, {
  "version" : "v1",
  "timestamp" : "2019-06-04T21:00:00.000+10:00",
  "event" : {
    "longSum" : 57,
    "hyperUnique" : 3.0021994137521975,
    "longSum" : 3
  }

query-testspec2-lookup.json:

{
  "queryType": "groupBy",
  "dataSource": {
    "type": "table",
    "name": "staypoint_v1.0"
  },
  "context": {
    "timeout": 150001,
    "queryId": "daas-88952e089b638af00eabf7a186e21a04-58-3fpgB-1559815238454"
  },
  "granularity": {
    "type": "period",
    "period": "PT1H",
    "timeZone": "Australia/Sydney",
    "origin": "2019-06-04T00:00:00.000+08:00"
  },
  "dimension": {"type":"extraction","dimension":"sa2","outputname":"saname","extractionFn":{"type":"lookup","lookup":"keyvalue"}},
  "filter": {
    "dimension": "sa2",
    "value": "104021087",
    "type": "selector"
  },
  "aggregations": [{
    "type": "longSum",
    "name": "longSum_total_stays",
    "fieldName": "total_stays"
  }, {
    "type": "hyperUnique",
    "name": "hyperUnique_unique_agents",
    "fieldName": "unique_agents"
  }, {
    "type": "longSum",
    "name": "longSum_sum_stay_duration",
    "fieldName": "sum_stay_duration"
  }],
  "intervals": ["2019-06-04T00:00:00.000+08:00/2019-06-04T23:59:59.999+08:00"]
}

I am getting the error below:

ERROR [qtp789885174-142] org.apache.druid.server.QueryResource - Exception handling request: {class=org.apache.druid.server.QueryResource, exceptionType=class com.fasterxml.jackson.databind.JsonMappingException, exceptionMessage=No content to map due to end-of-input

at [Source: HttpInputOverHTTP@4b981ad5[c=0,q=0,[0]=null,s=EOF]; line: 1, column: 1], exception=com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input

at [Source: HttpInputOverHTTP@4b981ad5[c=0,q=0,[0]=null,s=EOF]; line: 1, column: 1], query=unparseable query, peer=127.0.0.1}

com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input

at [Source: HttpInputOverHTTP@4b981ad5[c=0,q=0,[0]=null,s=EOF]; line: 1, column: 1]

at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148) ~[jackson-databind-2.6.7.jar:2.6.7]

at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3781) ~[jackson-databind-2.6.7.jar:2.6.7]

at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3721) ~[jackson-databind-2.6.7.jar:2.6.7]

at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2796) ~[jackson-databind-2.6.7.jar:2.6.7]

at org.apache.druid.server.QueryResource.readQuery(QueryResource.java:312) ~[druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.apache.druid.server.QueryResource.doPost(QueryResource.java:176) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source) ~[?:?]

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_121]

at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_121]

at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409) [jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409) [jersey-servlet-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) [jersey-servlet-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) [jersey-servlet-1.19.3.jar:1.19.3]

at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) [javax.servlet-api-3.1.0.jar:3.1.0]

at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:286) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:276) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:181) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:120) [guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:135) [guice-servlet-4.1.0.jar:?]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.apache.druid.server.security.PreResponseAuthorizationCheckFilter.doFilter(PreResponseAuthorizationCheckFilter.java:82) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.apache.druid.server.security.AllowOptionsResourceFilter.doFilter(AllowOptionsResourceFilter.java:75) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.apache.druid.server.security.AllowAllAuthenticator$1.doFilter(AllowAllAuthenticator.java:84) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.apache.druid.server.security.AuthenticationWrappingFilter.doFilter(AuthenticationWrappingFilter.java:59) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.apache.druid.server.security.SecuritySanityCheckFilter.doFilter(SecuritySanityCheckFilter.java:86) [druid-server-0.14.0-incubating.jar:0.14.0-incubating]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) [jetty-servlet-9.4.10.v20180503.jar:9.4.10.v20180503]

at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) [jetty-server-9.4.10.v20180503.jar:9.4.10.v20180503]

The error says JsonMappingException, so it is most likely an issue with the JSON content. Can you check whether the JSON in the query-testspec2-lookup.json file is in a proper format?

Thanks,

Sashi
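"No content to map due to end-of-input" means the broker received an empty request body, so the first thing to rule out is whether the query file parses at all. A minimal local check before POSTing (the file name is from the thread; `validate_json_file` is my own helper, not a Druid tool):

```python
import json

def validate_json_file(path):
    """Try to parse a JSON file; return (True, None) or (False, error message)."""
    try:
        with open(path) as f:
            json.load(f)
        return True, None
    except (OSError, json.JSONDecodeError) as e:
        return False, str(e)

ok, err = validate_json_file("query-testspec2-lookup.json")
print("valid" if ok else f"invalid: {err}")
```

Note also that if curl cannot read the `@file` argument (wrong file name, or a missing space before the URL) it posts an empty body, which produces exactly this end-of-input error on the server side.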

Sure Sashi. I will check. Thanks for the reply.

Thanks

Prabakaran Krishnan