Realtime java.util.common.ISE: Can not combine streams for version 2

Hi Team,

We are using Druid 0.11.0 and have been facing issues with the indexing job.

We tried looking at the code but haven't been able to figure out anything yet.

The Druid datasource has one big string column containing the log message.

When we lowered the segmentGranularity from thirty_minute to five_minute, Druid mostly ran fine but sometimes threw this exception.

It would be great if you could take a look at this and advise; it's fairly urgent. Thanks in advance.



Task launch parameters

druid.indexer.runner.javaOpts=-server -Xmx3g -Xss1m -XX:MaxDirectMemorySize=5g -XX:+UseG1GC -XX:MaxGCPauseMillis=100 -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Duser.timezone=GMT+8 -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager





"dataSources" : {
  "app_log": {
    "spec": {
      "dataSchema": {
        "dataSource": "app_log",
        "parser": {
          "type": "string",
          "parseSpec": {
            "timestampSpec": {
              "column": "timestamp",
              "format": "auto"
            },
            "dimensionsSpec": {
              "dimensions": [],
              "dimensionExclusions": []
            },
            "format": "json"
          }
        },
        "metricsSpec": [
          {
            "type": "longSum",
            "name": "response_time",
            "fieldName": "response_time"
          }
        ],
        "granularitySpec": {
          "type": "uniform",
          "segmentGranularity": "five_minute",
          "queryGranularity": "none"
        }
      },
      "ioConfig": {
        "type": "realtime"
      },
      "tuningConfig": {
        "type": "realtime",
        "maxRowsInMemory": "100000",
        "intermediatePersistPeriod": "PT10M",
        "windowPeriod": "PT10M"
      }
    },
    "properties": {
      "task.partitions": "1",
      "task.replicants": "1"
    }
  }
}

Hey Mylinushr,

This error can happen when you have a single string column that is too big for one segment (>2GB for a column). You should be able to work around it by increasing your "task.partitions". The idea there is that having more partitions means that each individual partition is smaller.
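For example, a minimal sketch of that change against the "properties" block of the config above (the value 2 is illustrative; pick a partition count that keeps each partition's columns well under the 2GB limit):

```json
"properties": {
  "task.partitions": "2",
  "task.replicants": "1"
}
```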


Hi Gian Merlino,

Thank you very much, you saved me, Amen! I've been working on this problem for a long time.

On Tuesday, October 2, 2018 at 12:06:58 AM UTC+8, Gian Merlino wrote: