A table in Hive has a partition column called dayno, which describes time (e.g. 2019-08-01). Being a partition column, it isn't stored in the data files but is encoded in the data directory path. If this table's data needs to be imported into Druid, what should I do with timestampSpec?
Druid doesn't support importing directly from Hive, but you can ingest from HDFS instead. There are two methods:
use the static type of inputSpec with Hadoop-based batch ingestion, ingesting the data day by day;
or use the granularity type of inputSpec.
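For the static approach, since dayno never appears inside the data files, the timestamp has to come from the spec itself. A minimal sketch of one per-day task, assuming a Druid version whose timestampSpec supports "missingValue" (the datasource name, HDFS path, and column names here are placeholders, and the exact placement of timestampSpec under parser.parseSpec depends on your Druid version):

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "my_table",
      "parser": {
        "type": "hadoopyString",
        "parseSpec": {
          "format": "tsv",
          "columns": ["col1", "col2"],
          "timestampSpec": {
            "column": "dayno",
            "format": "yyyy-MM-dd",
            "missingValue": "2019-08-01"
          },
          "dimensionsSpec": { "dimensions": ["col1", "col2"] }
        }
      },
      "granularitySpec": {
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE",
        "intervals": ["2019-08-01/2019-08-02"]
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "hdfs://namenode/warehouse/my_table/dayno=2019-08-01/*"
      }
    }
  }
}
```

Because the dayno column is absent from every row, "missingValue" supplies the timestamp for the whole run; you would repeat the task for each partition, changing "missingValue", "intervals", and "paths" together. The granularity inputSpec can instead derive the date from the directory layout, but it expects paths in its own y=/m=/d= pattern, which a Hive dayno=YYYY-MM-DD layout does not match without restructuring.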
On Tuesday, August 27, 2019 at 11:40:57 AM UTC+8, Bin Li wrote: