# troubleshooting
r
Hi all, I have three date columns, so I've written this:
```
"dateTimeFieldSpecs": [
  {
    "name": "_source.startDate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
    "granularity": "1:DAYS"
  },
  {
    "name": "_source.lastUpdate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
    "granularity": "1:DAYS"
  },
  {
    "name": "_source.sDate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
    "granularity": "1:DAYS"
  }
]
```
Can you please correct this? I am getting this error:
```
{
  "code": 400,
  "error": "Cannot find valid fieldSpec for timeColumn: timestamp from the table config: eventflow_REALTIME, in the schema: eventflowstats"
}
```
Need your help 🙂
n
Can you share the table config?
r
Table Config:
```
{
  "tableName": "eventflow",
  "tableType": "REALTIME",
  "segmentsConfig": {
    "timeColumnName": "timestamp",
    "timeType": "MILLISECONDS",
    "schemaName": "eventflowstats",
    "replicasPerPartition": "1"
  },
  "tenants": {},
  "tableIndexConfig": {
    "loadMode": "MMAP",
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.consumer.type": "lowlevel",
      "stream.kafka.topic.name": "event_count-topic",
      "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "stream.kafka.broker.list": "localhost:9876",
      "realtime.segment.flush.threshold.time": "3600000",
      "realtime.segment.flush.threshold.size": "50000",
      "stream.kafka.consumer.prop.auto.offset.reset": "smallest"
    }
  },
  "metadata": {
    "customConfigs": {}
  }
}
```
And Schema File:
```
{
  "schemaName": "eventflowstats",
  "eventflow": [
    {
      "name": "_index",
      "dataType": "INT"
    },
    {
      "name": "_type",
      "dataType": "STRING"
    },
    {
      "name": "id",
      "dataType": "INT"
    }
  ],
  "dateTimeFieldSpecs": [
    {
      "name": "_source.startDate",
      "dataType": "STRING",
      "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
      "granularity": "1:DAYS"
    },
    {
      "name": "_source.lastUpdate",
      "dataType": "STRING",
      "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
      "granularity": "1:DAYS"
    },
    {
      "name": "_source.sDate",
      "dataType": "STRING",
      "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
      "granularity": "1:DAYS"
    }
  ]
}
```
n
In your table config, you've configured "timeColumnName" : "timestamp"
You need to change that to one of the dateTime columns from your schema
Also, in your schema, you have the dimensions under "eventflow" instead of "dimensionFieldSpecs"
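For illustration, a minimal sketch of the two fixes described above, assuming _source.sDate is picked as the primary time column (the choice suggested further down); these fragments are hypothetical, not the poster's final files. In the schema:
```
"dimensionFieldSpecs": [
  { "name": "_index", "dataType": "INT" },
  { "name": "_type", "dataType": "STRING" },
  { "name": "id", "dataType": "INT" }
]
```
and in the table config:
```
"segmentsConfig": {
  "timeColumnName": "_source.sDate",
  "schemaName": "eventflowstats",
  "replicasPerPartition": "1"
}
```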
r
Ok, got it. So I have to remove the remaining two, and add them as normal fields, am I right?
n
you can keep all 3 as dateTimeFieldSpecs
but select one of them as the primary time column, and enter that in the tableConfig
r
Is this right:
```
"timeColumnName": "_source.startDate, _source.lastUpdate, _source.sDate",
And is this correct:
```
"dateTimeFieldSpecs": [
  {
    "name": "_source.startDate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
    "granularity": "1:DAYS"
  },
  {
    "name": "_source.lastUpdate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
    "granularity": "1:DAYS"
  },
  {
    "name": "_source.sDate",
    "dataType": "STRING",
    "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
    "granularity": "1:DAYS"
  }
]
```
n
No, you have to put just one column in the tableConfig.
```
"timeColumnName": "_source.sDate"
The schema is correct.
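As a sketch of how the corrected files could be re-applied, Pinot's admin tool can upload the schema and table config to the controller; the file names here are hypothetical, and the host/port depend on the deployment:
```
# Upload the corrected schema to the controller
bin/pinot-admin.sh AddSchema \
  -schemaFile eventflowstats-schema.json \
  -controllerHost localhost -controllerPort 9000 \
  -exec

# Create (or re-create) the realtime table from the corrected config
bin/pinot-admin.sh AddTable \
  -tableConfigFile eventflow-table.json \
  -controllerHost localhost -controllerPort 9000 \
  -exec
```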
r
Got it.
After these changes, I am getting this error: "Sending request: http://172.31.10.219:9001/schemas to controller: localhost, version: Unknown Got Exception to upload Pinot Schema: aschema". I think the Pinot server went down. Any idea?
@Neha Pawar -- can you check once?
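If the controller's availability is in doubt, a quick hedged check is to hit its health endpoint (host and port taken from the error message above; adjust for your deployment):
```
# The controller answers "OK" on /health when it is up
curl http://172.31.10.219:9001/health
```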