Problem with data type conversion from AWS S3 (Parquet) to Druid

Greetings community,

I am working on ingesting data from AWS S3 into Druid. The data in S3 is in Parquet format. All columns are ingested as intended except two: date1 (type: date) and date2 (type: timestamp), which cause problems.

  1. When ingesting date1 as a dimension, Druid converts it to a string (expected), but the value is not the actual date (e.g. 2018-02-01); instead it is another string in an incomprehensible format (e.g. 17479). No errors are raised and the ingestion succeeds.

  2. When ingesting date1 as the timestamp column (with the format specified as yyyy-MM-dd, and also with no format at all), it throws an error:

Caused by: java.lang.RuntimeException: No buckets?? seems there is no data to index.
A possible reason is that date1 arrives as 17479, which does not match any standard timestampSpec format.
  3. When ingesting date2 as a timestamp or as a dimension, it throws an error:
Caused by: java.lang.IllegalArgumentException: INT96 not yet implemented.
A possible reason is that the Parquet extension depends on Avro, and Avro throws this error because S3 stores the timestamp as INT96, which Avro does not support.
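To sanity-check my guess in point 2: if Parquet DATE columns are encoded as the number of days since the Unix epoch (1970-01-01), then 17479 should decode to a plausible calendar date. A quick Python sketch of that assumption:

```python
from datetime import date, timedelta

# Assumption: Parquet DATE values are stored as days since the Unix epoch,
# so 17479 should decode to a sensible date rather than garbage.
epoch = date(1970, 1, 1)
print(epoch + timedelta(days=17479))  # prints 2017-11-09
```

If that holds, the "string" Druid produced is really the raw epoch-day integer passed through untranslated.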
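For point 3, my understanding (an assumption, not something from the Druid docs) is that a Parquet INT96 timestamp packs 8 little-endian bytes of nanoseconds-within-day followed by 4 bytes of Julian day number, so the raw value is decodable even though Avro rejects it. A sketch with a hypothetical helper `decode_int96`:

```python
import struct
from datetime import datetime, timedelta

JULIAN_EPOCH = 2440588  # Julian day number of 1970-01-01

def decode_int96(raw: bytes) -> datetime:
    # INT96 layout (assumed): 8 bytes nanoseconds-in-day, then 4 bytes Julian day,
    # both little-endian.
    nanos, julian_day = struct.unpack('<qi', raw)
    days = julian_day - JULIAN_EPOCH
    return datetime(1970, 1, 1) + timedelta(days=days, microseconds=nanos // 1000)

# Example: Julian day 2440588 with zero nanoseconds should be the epoch itself.
raw = struct.pack('<qi', 0, JULIAN_EPOCH)
print(decode_int96(raw))  # prints 1970-01-01 00:00:00
```

This is only to illustrate why the value is recoverable in principle; it does not change what the Avro-based ingestion path supports.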

I need to include both columns in Druid and have not found a way yet.

Thanks in advance for any leads.