How to upload data from S3 to the Overlord

Hi,
I am trying to upload my data from S3 to the Overlord. I have copied the data into S3 from a Snowflake DB with the following command:

```sql
copy into 's3://druid-poc/druid/test_table.json' from (select * from test_table)
CREDENTIALS = ( AWS_KEY_ID = '' AWS_SECRET_KEY = '' )
ENCRYPTION = ( TYPE = 'AWS_SSE_S3' )
SINGLE = TRUE
MAX_FILE_SIZE = 160000000
```

The data is getting copied to S3 in encrypted format.
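For reference, my understanding is that Druid's "json" parseSpec expects newline-delimited JSON, i.e. one object per line. A rough sketch of what I expect each line of test_table.json to look like (field names match the dimensions in my spec; the values are just placeholders):

```json
{"dimension1": "a1", "dimension2": "b1", "dimension3": "c1", "dimension4": "d1"}
{"dimension1": "a2", "dimension2": "b2", "dimension3": "c2", "dimension4": "d2"}
```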

I am using the following ingestion spec to load the data via the Overlord node:

```json
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "test_table_source",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [
              "dimension1",
              "dimension2",
              "dimension3",
              "dimension4"
            ]
          }
        }
      },
      "metricsSpec" : [],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "day",
        "queryGranularity" : "none",
        "intervals" : ["2006-06-01/2017-05-01"],
        "rollup" : false
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "static-s3",
        "uris" : ["s3://druid-poc/druid/test_table.json"]
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 5000000,
      "maxRowsInMemory" : 25000,
      "forceExtendableShardSpecs" : true
    }
  }
}
```

The task is getting created and completes in the Overlord console, but I don't see any datasource named 'test_table_source' getting created in the Coordinator console, and I don't get any error either.

Is there anything I am doing wrong here?

Hi Joshi,
Are there any error messages in the Coordinator console? What are you using for deep storage? If the indexing went properly, your deep storage should have a copy of the indexed files; you can verify whether the files have been written there.

-Robert

My guess is that Druid is possibly not able to parse the rows properly and is therefore not ingesting them.
Try setting "logParseExceptions" to true in the tuningConfig to log more info about parsing errors, if any.
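For example, a minimal sketch of your tuningConfig with that flag added (assuming a Druid version where the native index task supports "logParseExceptions"):

```json
"tuningConfig" : {
  "type" : "index",
  "targetPartitionSize" : 5000000,
  "maxRowsInMemory" : 25000,
  "forceExtendableShardSpecs" : true,
  "logParseExceptions" : true
}
```

If parse exceptions do show up in the task logs, they should point to the rows or fields Druid could not read.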