Row Parser Question in Batch Ingestion

Hi,

I want to load some JSON files into Druid, but before loading I need to transform the rows. For example, a row with the column value “a,b,c” should become three rows with the column values “a”, “b”, and “c”.
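To make the transformation concrete, here is a minimal sketch of the "explode" logic I have in mind (the column name `tags` and the helper `explode` are just examples for illustration, not actual Druid APIs):

```java
import java.util.*;
import java.util.stream.*;

public class ExplodeRow {
    // Split one row on a comma-separated column into one row per value,
    // copying all other fields unchanged.
    static List<Map<String, Object>> explode(Map<String, Object> row, String column) {
        String value = (String) row.get(column);
        return Arrays.stream(value.split(","))
            .map(v -> {
                Map<String, Object> copy = new HashMap<>(row);
                copy.put(column, v);
                return copy;
            })
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Object> row = new HashMap<>();
        row.put("id", 1);
        row.put("tags", "a,b,c");
        // Prints the tags value of each exploded row: a, b, c
        explode(row, "tags").forEach(r -> System.out.println(r.get("tags")));
    }
}
```

This is the behavior I implemented in my custom row parser for the Kafka (realtime) path.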

I know we can build a custom row parser for realtime ingestion from Kafka. How can I build a row parser for batch ingestion? I tried reusing the same row parser for batch ingestion, but I get this error:

```
java.lang.ClassCastException: org.apache.druid.segment.transform.TransformingInputRowParser cannot be cast to org.apache.druid.data.input.impl.StringInputRowParser
```

Any ideas?

Best,

Jiyang