Post-index-task error when ingesting a huge number of columns (9,000)

I want to ingest 9,000 columns of data into Druid, but when I start a post-index-task, it fails after a second with this error message:

Task index_data_ehbomfhh_2020-04-21T08:21:27.412Z still running...
Traceback (most recent call last):
  File "/home/user/apache-druid-0.18.0/bin/post-index-task-main", line 174, in <module>
    main()
  File "/home/user/apache-druid-0.18.0/bin/post-index-task-main", line 164, in main
    task_status = await_task_completion(args, task_id, complete_timeout_at)
  File "/home/user/apache-druid-0.18.0/bin/post-index-task-main", line 89, in await_task_completion
    response = urllib2.urlopen(req, None, response_timeout)
  File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 644, in http_error_302
    self.inf_msg + msg, headers, fp)
urllib2.HTTPError: HTTP Error 307: The HTTP server returned a redirect error that would lead to an infinite loop.
The last 30x error message was:
Temporary Redirect
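
In case it helps with debugging, my understanding is that the 307 comes from the Overlord redirecting the status poll (an Overlord that is not the current leader redirects requests to the leader). A rough way to see where the redirect points, without following it, is something like the sketch below; localhost:8090 is just the default Overlord address, not necessarily my setup:

import requests

task_id = "index_data_ehbomfhh_2020-04-21T08:21:27.412Z"
status_url = "http://localhost:8090/druid/indexer/v1/task/%s/status" % task_id

# Don't follow the redirect; just print the first hop so the
# Location header shows which host the Overlord points at.
resp = requests.get(status_url, allow_redirects=False, timeout=30)
print(resp.status_code, resp.headers.get("Location"))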

I did try ingesting 8,500 columns of data and that succeeded, but as soon as I try 9,000 columns I immediately get this error. Can anyone guide or advise me on how to ingest around 10K+ columns of data into Druid?
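
One workaround I might try is to skip the Python 2 helper script and POST the spec straight to the Overlord's task endpoint with a client that handles redirects properly; a minimal sketch, assuming the default Overlord at localhost:8090 and a placeholder spec file name:

import json
import requests

with open("my-9000-column-spec.json") as f:  # placeholder file name
    spec = json.load(f)

# Submit the ingestion task spec to the Overlord's task endpoint.
resp = requests.post(
    "http://localhost:8090/druid/indexer/v1/task",
    json=spec,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # should contain the new task id, e.g. {"task": "..."}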

Thank you very much!