Druid cluster using S3-compatible storage (IBM Cloud Object Storage) as deep storage: tutorial batch load hangs

Hi, I am using the current Druid version (0.12.3) and followed the cluster setup instructions: coordinator, overlord, and ZooKeeper on one node, historical on a second node, and MiddleManager on a third. I configured S3-compatible IBM Cloud Object Storage as deep storage, but I can't get the tutorial batch load to work. Here is my configuration:

in conf/druid/_common/common.runtime.properties:

druid.extensions.loadList=["druid-s3-extensions", "mysql-metadata-storage", …]

For S3:

druid.storage.type=s3

druid.storage.bucket=myBucket

druid.storage.baseKey=druid/segments

druid.s3.accessKey=###

druid.s3.secretKey=###
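
In case it helps anyone answering: since Druid 0.12 uses jets3t for S3, my understanding is that the custom endpoint has to go into a jets3t.properties file on the classpath rather than into common.runtime.properties. A minimal sketch of what I believe that file should look like (the endpoint hostname below is a placeholder, not my real one):

```properties
# conf/druid/_common/jets3t.properties
# Placeholder endpoint -- substitute your actual IBM COS regional endpoint
s3service.s3-endpoint=s3.example.cloud-object-storage.appdomain.cloud
s3service.https-only=true
# Use path-style requests in case the endpoint does not support
# virtual-hosted-style (DNS) bucket addressing
s3service.disable-dns-buckets=true
```

I'm not certain these are the only jets3t settings that matter for a non-AWS endpoint, so corrections are welcome.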

Indexing service logs

Looked at the log and found:

2018-12-10T17:00:16,851 DEBUG [appenderator_merge_0] org.jets3t.service.impl.rest.httpclient.RestStorageService - Rethrowing as a ServiceException error in performRequest: org.jets3t.service.ServiceException: Service Error Message. – ResponseCode: 400, ResponseStatus: null, XML Error Message:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Error>
  <Code>InvalidRequest</Code>
  <Message>Invalid canned ACL</Message>
  <Resource>/ul30cadevmdcp01/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2018-12-10T17:00:02.452Z/0/index.zip</Resource>
  <RequestId>2ba82552-dae1-4716-8aeb-b4cc181ca1bb</RequestId>
  <httpStatusCode>400</httpStatusCode>
</Error>

Does this mean that jets3t doesn't support IBM Cloud Object Storage (which claims to be S3 compatible)?
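
One thing I noticed: the error complains about an invalid canned ACL rather than about the endpoint, so maybe the upload does reach IBM COS and COS is rejecting the ACL header jets3t sends with the segment. If I'm reading the Druid S3 deep storage docs correctly, there is a druid.storage.disableAcl property that might skip setting the ACL. A sketch of what I plan to try next (untested, and I'm not sure this property takes effect in the jets3t-based pusher in 0.12.3):

```properties
# common.runtime.properties -- untested guess based on the Druid S3 docs
druid.storage.disableAcl=true
```

Has anyone gotten this combination working?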