Druid Query failure - java.lang.UnsupportedOperationException

We are trying the following join queries with an aggregate, and some of them are failing with java.lang.UnsupportedOperationException.

Query 1:

WITH dim AS (
SELECT column1, column2, column3 FROM table1 WHERE column1 IN (1748968765, 1749783113)
)
SELECT column1, column2, column3, sum(column4) FROM table2 INNER JOIN dim ON dim.column1 = table2.column5 WHERE table2.__time >= '2021-07-10 00:00:00' and table2.__time <= '2021-08-02 23:59:59'
GROUP BY 1,2,3

Query 2:

WITH dim AS (
SELECT column1, column2, column3 FROM table1 WHERE column1 IN (1748968765)
)
SELECT column1, column2, column3, sum(column4) FROM table2 INNER JOIN dim ON dim.column1 = table2.column5 WHERE table2.__time >= '2021-07-10 00:00:00' and table2.__time <= '2021-08-02 23:59:59'
GROUP BY 1,2,3

Query 3:

WITH dim AS (
SELECT column1, column2, column3 FROM table1 WHERE column1 = 1748968765
)
SELECT column1, column2, column3, sum(column4) FROM table2 INNER JOIN dim ON dim.column1 = table2.column5 WHERE table2.__time >= '2021-07-10 00:00:00' and table2.__time <= '2021-08-02 23:59:59'
GROUP BY 1,2,3

Query 1 runs without any issue, and I see data for both column1 values. But if I remove either one of the values from the column1 filter (as in queries 2 and 3), we get this error response:

Error: Unknown exception. java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.UnsupportedOperationException java.lang.RuntimeException

Even the broker logs don't have much info. Any idea what the issue is with this query, or what I can do to get more information about it?
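
One way to get more detail (assuming planning succeeds and the failure happens at execution time, which the stack trace below suggests) is to prefix the failing query with EXPLAIN PLAN FOR, so the broker returns the native query it would send to the data servers instead of running it:

-- same as Query 3 above, just wrapped in EXPLAIN PLAN FOR
EXPLAIN PLAN FOR
WITH dim AS (
SELECT column1, column2, column3 FROM table1 WHERE column1 = 1748968765
)
SELECT column1, column2, column3, sum(column4) FROM table2 INNER JOIN dim ON dim.column1 = table2.column5 WHERE table2.__time >= '2021-07-10 00:00:00' and table2.__time <= '2021-08-02 23:59:59'
GROUP BY 1,2,3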

Logs:

```
org.apache.druid.query.QueryInterruptedException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.UnsupportedOperationException
        at org.apache.druid.client.JsonParserIterator.convertException(JsonParserIterator.java:268) ~[druid-server-0.21.1.jar:0.21.1]
        at org.apache.druid.client.JsonParserIterator.init(JsonParserIterator.java:183) ~[druid-server-0.21.1.jar:0.21.1]
        at org.apache.druid.client.JsonParserIterator.hasNext(JsonParserIterator.java:93) ~[druid-server-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.BaseSequence.makeYielder(BaseSequence.java:89) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.BaseSequence.toYielder(BaseSequence.java:69) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.MappedSequence.toYielder(MappedSequence.java:49) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.ParallelMergeCombiningSequence$ResultBatch.fromSequence(ParallelMergeCombiningSequence.java:869) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.ParallelMergeCombiningSequence$SequenceBatcher.block(ParallelMergeCombiningSequence.java:920) ~[druid-core-0.21.1.jar:0.21.1]
        at java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3313) ~[?:1.8.0_121]
        at org.apache.druid.java.util.common.guava.ParallelMergeCombiningSequence$SequenceBatcher.getBatchYielder(ParallelMergeCombiningSequence.java:909) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.ParallelMergeCombiningSequence$YielderBatchedResultsCursor.initialize(ParallelMergeCombiningSequence.java:1017) ~[druid-core-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.ParallelMergeCombiningSequence$PrepareMergeCombineInputsAction.compute(ParallelMergeCombiningSequence.java:721) ~[druid-core-0.21.1.jar:0.21.1]
        at java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189) ~[?:1.8.0_121]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) ~[?:1.8.0_121]
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) ~[?:1.8.0_121]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) ~[?:1.8.0_121]
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) ~[?:1.8.0_121]
2021-09-09T18:26:02,209 DEBUG [qtp1241385933-332] org.eclipse.jetty.server.HttpOutput - write(array HeapByteBuffer@6535107b[p=0,l=233,c=233,r=233]={<>>})
2021-09-09T18:26:02,209 DEBUG [qtp1241385933-332] org.eclipse.jetty.server.HttpOutput - write(array) s=CLOSING,api=BLOCKED,sc=false,e=null last=true agg=false flush=true async=false, len=233 null
2021-09-09T18:26:02,209 DEBUG [qtp1241385933-332] org.eclipse.jetty.server.handler.gzip.GzipHttpOutputInterceptor - org.eclipse.jetty.server.handler.gzip.GzipHttpOutputInterceptor@4772cb13 exclude by status 500
2021-09-09T18:26:02,209 DEBUG [qtp1241385933-332] org.eclipse.jetty.server.HttpChannel - sendResponse info=null content=HeapByteBuffer@2f704bb5[p=0,l=233,c=233,r=233]={<>>} complete=true committing=true callback=Blocker@1e08e448{null}
2021-09-09T18:26:02,209 DEBUG [qtp1241385933-332] org.eclipse.jetty.server.HttpChannel - COMMIT for /druid/v2/sql/ on HttpChannelOverHttp@237fba53{s=HttpChannelState@27402ae4{s=HANDLING rs=BLOCKING os=COMMITTED is=IDLE awp=false se=false i=true al=0},r=3,c=false/false,a=HANDLING,uri=//dod0027.atl1.turn.com:8082/druid/v2/sql/,age=359}
```

Relates to Apache Druid 0.21.1

Does your IN list contain strings or integers? If they are strings, have you tried putting single quotes around them?
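
For example, something like this (a hypothetical variant of the dim filter that only applies if column1 is actually a string dimension):

-- hypothetical: only meaningful if column1 is a string dimension
SELECT column1, column2, column3 FROM table1
WHERE column1 IN ('1748968765', '1749783113')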

They are integers only.

I couldn’t reproduce the issue. This is my test case:

WITH dim AS (
SELECT sum_delta, __time FROM "wikipedia-demo" WHERE sum_added IN (5, 6, 9)
)
SELECT "wikipedia-demo".sum_delta, "wikipedia-demo".__time, sum("wikipedia-demo".sum_deleted)
FROM "wikipedia-demo" INNER JOIN dim ON dim.__time = "wikipedia-demo".__time
GROUP BY 1,2

Hi @Rachel_Pedreschi, thanks for checking.

I found one way to reproduce this error in 0.21.1.

I ingested the data file provided in the Quickstart (wikiticker-2015-09-12-sampled.json.gz), and when I ran the following query, I got the UnsupportedOperationException error:

WITH dim AS (
SELECT added, __time, delta FROM wikipedia WHERE added IN (8745)
)
SELECT i1.added, i1.__time, sum(i1.delta)
FROM wikipedia i1 INNER JOIN dim ON dim.delta = i1.delta
GROUP BY 1,2

8745 is a value that does not exist in the added column.
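
For reference, that can be confirmed with a quick check; since no row matches, the dim CTE returns zero rows and the right-hand side of the join is an empty table:

-- returns 0, confirming no row in wikipedia has added = 8745
SELECT COUNT(*) FROM wikipedia WHERE added = 8745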

When I ran the same ingestion and query on 0.20.2 and 0.19.0, the query didn't fail; it just returned no data.

From the Historical logs:

2021-09-18T03:48:04,068 ERROR [processing-0] org.apache.druid.query.groupby.epinephelinae.GroupByMergingQueryRunnerV2 - Exception with one of the sequences!
java.lang.UnsupportedOperationException: null
        at org.apache.druid.segment.join.table.MapIndex.findUniqueLong(MapIndex.java:101) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.table.IndexedTableJoinMatcher$ConditionMatcherFactory$1.matchSingleRow(IndexedTableJoinMatcher.java:465) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.table.IndexedTableJoinMatcher.matchCondition(IndexedTableJoinMatcher.java:193) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.HashJoinEngine$1JoinCursor.matchCurrentPosition(HashJoinEngine.java:167) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.HashJoinEngine$1JoinCursor.initialize(HashJoinEngine.java:127) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.HashJoinEngine.makeJoinCursor(HashJoinEngine.java:219) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.segment.join.HashJoinSegmentStorageAdapter.lambda$makeCursors$0(HashJoinSegmentStorageAdapter.java:267) ~[druid-processing-0.21.1.jar:0.21.1]
        at org.apache.druid.java.util.common.guava.MappingAccumulator.accumulate(MappingAccumulator.java:40) ~[druid-core-0.21.1.jar:0.21.1]

From the Git repo, the MapIndex class was only added in 0.21.
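
As an untested idea (my assumption, based on the trace failing inside MapIndex.findUniqueLong on the long-typed join key): comparing the join keys as strings instead of longs should route matching through the string code path, which might sidestep the exception. Here delta_str is just an alias I introduced for illustration:

WITH dim AS (
SELECT added, __time, CAST(delta AS VARCHAR) AS delta_str FROM wikipedia WHERE added IN (8745)
)
-- hypothetical workaround: match on string-typed keys rather than longs
SELECT i1.added, i1.__time, sum(i1.delta)
FROM wikipedia i1 INNER JOIN dim ON dim.delta_str = CAST(i1.delta AS VARCHAR)
GROUP BY 1,2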

Hey Krishna, did you get any more insights? I wonder if you need to log this as an issue in GitHub?

I tested the recently released 0.22 locally and it worked fine. I still have to test it on the cluster. Since it worked locally, I didn't file an issue in GitHub. The issue does exist in 0.21.