Removing obsolete versions of segments in deep storage

Hi all,

I googled a lot before posting this and found some relevant information, but I want to make sure that my understanding is correct.

My workflow involves reindexing a segment about 30 times over the course of a week, after which it is never changed again. Each of these reindex operations produces a new version of the segment data in deep storage and makes the old one obsolete. I would like to know how to automatically delete these old versions of the segment data from deep storage.

For now, the only way I see to do this is by hand: removing the row from the segments table in the metadata database and deleting the files in deep storage.

By the way, is this frequent reindexing an anti-pattern for Druid usage?

Hi Kirill, you can use the kill task:
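A minimal kill task spec, submitted to the Overlord at `/druid/indexer/v1/task`, might look like the sketch below. The dataSource name and interval here are placeholders; the kill task deletes unused segments within the given interval from both the metadata store and deep storage, so depending on your Druid version the obsolete segment versions may need to be marked unused first (overshadowed versions typically are).

```json
{
  "type": "kill",
  "dataSource": "my_datasource",
  "interval": "2017-01-01/2017-02-01"
}
```

You can schedule something like this externally (e.g. with cron) to clean up obsolete versions automatically after each reindex cycle.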

Frequent reindexing is not an anti-pattern.