I ran into a problem with transformSpec while ingesting data from Kafka

Dear everyone,

**I used the Kafka indexing service supervisor spec to ingest data from Kafka, and I added a transformSpec to it. I wanted to transform the column `visitTime`, so I used the expression `abs(visitTime)`. But the ingested value turned out to be null.**


{
    "type": "kafka",
    "dataSchema": {
        "dataSource": "klicenLogTest",
        "parser": {
            "type": "string",
            "parseSpec": {
                "format": "tsv",
                "timestampSpec": {
                    "column": "optionTime",
                    "format": "auto"
                },
                "columns": ["moduleCode","moduleName","modulePage","moduleFeature","moduleRemarks","triggerType","moduleStatus","position","targetType","targetValue","optionTime","buryingType","visitTime","userId","vehicleId","clientType","osType","vip","osVersion","appVersion","product","mobileBrand","mobileSerial","mobileScreenSize","deviceInfo","channel","channelName"],
                "dimensionsSpec": {
                    "dimensions": ["moduleCode","moduleName","modulePage","moduleFeature","triggerType","moduleStatus","position","targetType","targetValue","buryingType","userId","vehicleId","clientType","osType","vip","osVersion","appVersion","product","mobileBrand","mobileSerial","mobileScreenSize","deviceInfo","channel","channelName"],
                    "dimensionExclusions": ["moduleRemarks"]
                }
            }
        },
        "metricsSpec": [
            {"type": "longSum", "name": "visitTime", "fieldName": "visitTime"},
            {"type": "count", "name": "count"}
        ],
        "granularitySpec": {
            "type": "uniform",
            "segmentGranularity": "day",
            "queryGranularity": "none"
        },
        "transformSpec": {
            "transforms": [
                {"type": "expression", "name": "visitTime", "expression": "abs(visitTime)"}
            ],
            "filter": {
                "lowerStrict": false,
                "upperStrict": true,
                "ordering": "numeric"
            }
        }
    },
    "tuningConfig": {
        "type": "kafka",
        "maxRowsPerSegment": 5000000
    },
    "ioConfig": {
        "topic": "KlicenLog",
        "consumerProperties": {
            "bootstrap.servers": "master:6667,slave1:6667,slave2:6667"
        },
        "taskCount": 1,
        "replicas": 2,
        "taskDuration": "PT1H"
    }
}


The result is as follows.



{
    "__time": "2019-07-31T02:25:03.000Z",
    "appVersion": "6.3.1",
    "buryingType": "click",
    "channel": "100",
    "channelName": "官网",
    "clientType": "MOBILE",
    "count": 2,
    "deviceInfo": "ceffd666a13fbc2c",
    "mobileBrand": "Xiaomi",
    "mobileScreenSize": "1080x1920",
    "mobileSerial": "MI 5s",
    "moduleCode": "1",
    "moduleFeature": "tabbar",
    "moduleName": "点击底部tabbar",
    "modulePage": "app",
    "moduleStatus": "pv",
    "osType": "Android",
    "osVersion": "26",
    "position": "null",
    "product": "KLICEN_APP",
    "targetType": "null",
    "targetValue": "profile",
    "triggerType": "click",
    "userId": "10540113",
    "vehicleId": "10144507",
    "vip": "true",
    "visitTime": null
}



Why is the value of visitTime null? I produced the message to Kafka manually.

The message is:

1 点击底部tabbar app tabbar 点击时采集 click pv null null profile 1564539903000 click 555 10540113 10144507 MOBILE Android true 26 6.3.1 KLICEN_APP Xiaomi MI 5s 1080x1920 ceffd666a13fbc2c 100 官网

The value of visitTime in this message is 555.
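As a sanity check, the raw message can be lined up against the `columns` list from the parseSpec. This is a sketch that assumes the fields are tab-separated (as `"format": "tsv"` requires; the spaces shown above are most likely tabs flattened by the forum). It also shows that every field parsed from delimited text arrives as a *string*, which is consistent with the fix suggested later in this thread (wrapping the column in `parse_long` before `abs`):

```python
# Line up the raw TSV message with the "columns" list from the parseSpec.
columns = [
    "moduleCode", "moduleName", "modulePage", "moduleFeature", "moduleRemarks",
    "triggerType", "moduleStatus", "position", "targetType", "targetValue",
    "optionTime", "buryingType", "visitTime", "userId", "vehicleId",
    "clientType", "osType", "vip", "osVersion", "appVersion", "product",
    "mobileBrand", "mobileSerial", "mobileScreenSize", "deviceInfo",
    "channel", "channelName",
]

# The message from the post, with tab separators made explicit.
message = ("1\t点击底部tabbar\tapp\ttabbar\t点击时采集\tclick\tpv\tnull\tnull\t"
           "profile\t1564539903000\tclick\t555\t10540113\t10144507\tMOBILE\t"
           "Android\ttrue\t26\t6.3.1\tKLICEN_APP\tXiaomi\tMI 5s\t1080x1920\t"
           "ceffd666a13fbc2c\t100\t官网")

fields = message.split("\t")
row = dict(zip(columns, fields))

print(len(columns), len(fields))   # both 27 -- the row lines up with the schema
print(repr(row["visitTime"]))      # '555' -- a string, not a long
```

So the row itself is well-formed; the value simply reaches the transform as the string `"555"` rather than the long `555`.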

Can anyone help me?

Can you try `"expression": "abs(\"visitTime\")"`?

I'll give it a try right away.

Ming F ming.fang@imply.io wrote on Wed, Jul 31, 2019 at 1:15 PM:

Sorry, your suggestion did not work.

Ming F ming.fang@imply.io wrote on Wed, Jul 31, 2019 at 1:15 PM:

Please try again with `"expression": "abs(parse_long(visitTime))"`.

I tested it and it should work for you as well.
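For completeness, folding the working expression back into the spec, the `transformSpec` inside `dataSchema` would look roughly like this (a sketch based on this thread; the filter from the original post is omitted):

```json
"transformSpec": {
    "transforms": [
        {
            "type": "expression",
            "name": "visitTime",
            "expression": "abs(parse_long(visitTime))"
        }
    ]
}
```

With `"format": "tsv"`, every parsed field arrives as a string, so `parse_long` converts `"555"` to the long `555` before `abs` is applied; the `longSum` metric then sees a real number instead of null.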