Exception when reading a Parquet file with decimal columns

Our Hive table is stored as Parquet, and it contains columns of decimal type.

Druid reads the Parquet file through Avro, and Avro represents a decimal as a fixed type.

MapBasedRow cannot read the fixed type.
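For context, Avro's decimal logical type stores the unscaled value as a big-endian two's-complement integer inside a fixed (or bytes) field, with the scale taken from the schema; that is why the metric arrives as `GenericData$Fixed` instead of a number. As a rough sketch of what a reader would have to do (this is not Druid's code; the class and method names here are hypothetical), decoding such bytes into a double that a doubleSum aggregator could consume looks like:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class DecimalFixed {
    // Avro's decimal logical type encodes the unscaled value as a
    // big-endian two's-complement integer; the scale is fixed by the
    // schema. Applying the scale recovers the numeric value.
    static double fromFixedBytes(byte[] bytes, int scale) {
        BigInteger unscaled = new BigInteger(bytes);          // big-endian two's complement
        return new BigDecimal(unscaled, scale).doubleValue(); // unscaled * 10^-scale
    }

    public static void main(String[] args) {
        // Unscaled 12345 with scale 2 represents 123.45
        byte[] bytes = BigInteger.valueOf(12345).toByteArray();
        System.out.println(fromFixedBytes(bytes, 2));
    }
}
```

Without a conversion like this applied during ingestion, getFloatMetric sees only the raw fixed wrapper and fails with the "Unknown type" error below.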

Exception info:

    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:91)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1714)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: io.druid.java.util.common.parsers.ParseException: Encountered parse error for aggregator[tax_value]
    at io.druid.indexer.InputRowSerde.toBytes(InputRowSerde.java:103)
    at io.druid.indexer.IndexGeneratorJob$IndexGeneratorMapper.innerMap(IndexGeneratorJob.java:300)
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:87)
    ... 8 more
Caused by: io.druid.java.util.common.parsers.ParseException: Unknown type[class org.apache.avro.generic.GenericData$Fixed]
    at io.druid.data.input.MapBasedRow.getFloatMetric(MapBasedRow.java:133)
    at io.druid.query.groupby.RowBasedColumnSelectorFactory$5.get(RowBasedColumnSelectorFactory.java:314)
    at io.druid.query.aggregation.DoubleSumAggregator.aggregate(DoubleSumAggregator.java:60)
    at io.druid.indexer.InputRowSerde.toBytes(InputRowSerde.java:98)
    ... 10 more

"metricsSpec" : [
    {
      "type" : "count",
      "name" : "count"
    },
    {
      "type" : "doubleSum",
      "name" : "tax_value",
      "fieldName" : "tax_value"
    },
    {
      "type" : "doubleSum",
      "name" : "cost_val",
      "fieldName" : "cost_val"
    }
]