Problems loading historical data to S3

Hello!

Currently I am trying to use an AWS role to access an S3 bucket.
I have set the configuration "druid.s3.fileSessionCredentials=ARN_ROLE".
It works for logs; I can see all of the logs in the S3 bucket.
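
For context, the relevant part of my common.runtime.properties looks roughly like this (the druid.storage.* and druid.indexer.logs.* lines reflect my assumption of a typical S3 deep-storage setup; ARN_ROLE stands in for the real role ARN):

# S3 deep storage and indexing logs (assumed typical setup)
druid.storage.type=s3
druid.indexer.logs.type=s3
# role-based session credentials instead of static access keys
druid.s3.fileSessionCredentials=ARN_ROLE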

But when I try to load the test data "quickstart/wikiticker-2015-09-12-sampled.json.gz", I get an error:

"AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively)"

So the main question: is it possible to use an AWS role to work with S3? Or should I configure something additional for Hadoop (see the jobProperties sketch after the spec below)?

Here is my indexing task spec:

{
  "type" : "index_hadoop",
  "spec" : {
    "ioConfig" : {
      "type" : "hadoop",
      "inputSpec" : {
        "type" : "static",
        "paths" : "quickstart/wikiticker-2015-09-12-sampled.json.gz"
      }
    },
    "dataSchema" : {
      "dataSource" : "wikiticker",
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "day",
        "queryGranularity" : "none",
        "intervals" : ["2015-09-12/2015-09-13"]
      },
      "parser" : {
        "type" : "hadoopyString",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [
              "channel",
              "cityName",
              "comment",
              "countryIsoCode",
              "countryName",
              "isAnonymous",
              "isMinor",
              "isNew",
              "isRobot",
              "isUnpatrolled",
              "metroCode",
              "namespace",
              "page",
              "regionIsoCode",
              "regionName",
              "user"
            ]
          },
          "timestampSpec" : {
            "format" : "auto",
            "column" : "time"
          }
        }
      },
      "metricsSpec" : [
        {
          "name" : "count",
          "type" : "count"
        },
        {
          "name" : "added",
          "type" : "longSum",
          "fieldName" : "added"
        },
        {
          "name" : "deleted",
          "type" : "longSum",
          "fieldName" : "deleted"
        },
        {
          "name" : "delta",
          "type" : "longSum",
          "fieldName" : "delta"
        },
        {
          "name" : "user_unique",
          "type" : "hyperUnique",
          "fieldName" : "user"
        }
      ]
    },
    "tuningConfig" : {
      "type" : "hadoop",
      "partitionsSpec" : {
        "type" : "hashed",
        "targetPartitionSize" : 5000000
      },
      "jobProperties" : {}
    }
  }
}
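
For reference, the error message points at Hadoop's s3n properties, so I suspect a workaround would be to put static keys into jobProperties, something like the sketch below (YOUR_ACCESS_KEY_ID and YOUR_SECRET_ACCESS_KEY are placeholders, not values I actually use):

"jobProperties" : {
  "fs.s3n.awsAccessKeyId" : "YOUR_ACCESS_KEY_ID",
  "fs.s3n.awsSecretAccessKey" : "YOUR_SECRET_ACCESS_KEY"
}

But that defeats the purpose of using the role, so I would prefer a way to make the Hadoop indexing job pick up the role credentials instead.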