Batch ingest json file, log shows success, but no segment is created

Hi, I am new to Druid. After successfully completing the tutorial, I tried to ingest my own test data. The log shows success, but no segment is created. I wonder whether it is related to my timestamp field format: my data has "TXN_TIMESTAMP": "2017-09-20T09:13:46.000-04:00", and I have the format set to "yyyy-MM-dd'T'HH:mm:ssZ".
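As a quick sanity check outside Druid (using Python's parser rather than Druid's Joda-based one), the sample value only parses when the pattern includes the milliseconds; the Joda equivalent of the matching pattern would be `yyyy-MM-dd'T'HH:mm:ss.SSSZ`:

```python
from datetime import datetime

# Sample value copied from the data; %z accepts the "-04:00" offset
# (Python 3.7+), and %f consumes the ".000" milliseconds.
ts = "2017-09-20T09:13:46.000-04:00"
dt = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")
print(dt.isoformat())  # 2017-09-20T09:13:46-04:00
```

This only confirms the string itself is well-formed ISO 8601; whether the `timestampSpec` format above matches it is a separate question.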

Could anyone help? Many thanks,

my index.json has:

{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "timhortons",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [ "CLNT_NO", "place_id", "city", "province", "credit_or_debit", "ACCOUNT_NUMBER", "transaction_currency", "online", "txn_action", "merchant_types", "neighbourhood", "price_level", "rating", "sic_category", "clnt_income", "clnt_fsa_name", "sex", "personal_or_business_client", "age_group" ]
          },
          "timestampSpec" : {
            "column" : "TXN_TIMESTAMP",
            "format" : "yyyy-MM-dd'T'HH:mm:ssZ"
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "count",
        "name" : "count"
      }, {
        "type" : "doubleSum",
        "name" : "Total_Sales",
        "fieldName" : "TRAN_AMOUNT"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : "none",
        "rollup" : false,
        "intervals" : [ "2017-01-12/2018-10-30" ]
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "quickstart",
        "filter" : "tims.json"
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 5000000,
      "maxRowsInMemory" : 25000,
      "maxTotalRows" : 20000000,
      "forceExtendableShardSpecs" : true
    }
  }
}

And my data looks like:
{
    "CLNT_NO": 493986743,
    "place_id": "ChIJfWkU9tM0K4gRZhietAbH3FI",
    "city": "Toronto",
    "province": "ON",
    "credit_or_debit": "c",
    "ACCOUNT_NUMBER": "677993ab5088d650d97d20bfef7955f4f9a352d67469fe8f38a975cb006d0ed9",
    "transaction_currency": "CAD",
    "TXN_TIMESTAMP": "2017-09-20T09:13:46.000-04:00",
    "online": 0,
    "txn_action": "Purchase",
    "merchant_types": "[cafe, food, point_of_interest, establishment]",
    "neighbourhood": "Downtown",
    "price_level": "INEXPENSIVE",
    "rating": 4.2,
    "sic_category": "",
    "clnt_income": 56024.73,
    "clnt_fsa_name": "Scarborough (Malvern / Rouge River)",
    "sex": "M",
    "personal_or_business_client": "P",
    "age_group": "41-45",
    "TRAN_AMOUNT": 1.77
  }

Let me answer myself:

1. I had misunderstood the logging configuration. I added the following to config/druid/_common/log4j2.xml and restarted all services, but still didn't see any additional log output:

    <Logger name="io.druid.jetty.RequestLog" additivity="false" level="DEBUG">
        <AppenderRef ref="Console"/>
    </Logger>

2. Adding "reportParseExceptions" : true to the "tuningConfig" section of index.json surfaces parse errors in the log.
3. I made an obvious mistake by pretty-printing the JSON file: each record must be on a single line (newline-delimited JSON). After these changes, segments are created.
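For anyone hitting the same issue, step 3 can be sketched as a small conversion script. This is a minimal sketch, assuming the pretty-printed source file holds a JSON array of records; the file names in the demo are hypothetical:

```python
import json
import os
import tempfile

def to_ndjson(src_path, dst_path):
    """Rewrite a JSON array file as newline-delimited JSON, one record per line."""
    with open(src_path) as src:
        records = json.load(src)  # assumes the file is a single JSON array
    with open(dst_path, "w") as dst:
        for record in records:
            dst.write(json.dumps(record) + "\n")

# Tiny demo with a two-record pretty-printed array:
with tempfile.TemporaryDirectory() as d:
    pretty = os.path.join(d, "tims_pretty.json")
    flat = os.path.join(d, "tims.json")
    with open(pretty, "w") as f:
        json.dump([{"CLNT_NO": 1, "TRAN_AMOUNT": 1.77},
                   {"CLNT_NO": 2, "TRAN_AMOUNT": 2.50}], f, indent=4)
    to_ndjson(pretty, flat)
    print(sum(1 for _ in open(flat)))  # 2 lines, one per record
```

The output file is what the `"firehose"` in the ioConfig above should point at.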