# troubleshooting
Hi, I just started playing with ingesting data, so I used the query console and modified the example query given there:
```sql
INSERT INTO "baseballStats"
FROM FILE '<s3://my-bucket/public_data_set/baseballStats/rawdata/>'
OPTION(taskName=myTask-s3)
OPTION(input.fs.className=org.apache.pinot.plugin.filesystem.S3PinotFS)
OPTION(input.fs.prop.accessKey=my-key)
OPTION(input.fs.prop.secretKey=my-secret)
OPTION(input.fs.prop.region=us-west-2)
```
I just changed the table name, location, and AWS credentials, but when I run it I get:
```json
[
  {
    "message": "QueryExecutionError:\nshaded.org.apache.commons.httpclient.HttpException: Unable to get tasks states map. Error code 400, Error message: {\"code\":400,\"error\":\"No task is generated for table: segments_aggregated, with task type: SegmentGenerationAndPushTask\"}\n\tat org.apache.pinot.common.minion.MinionClient.executeTask(MinionClient.java:123)\n\tat org.apache.pinot.core.query.executor.sql.SqlQueryExecutor.executeDMLStatement(SqlQueryExecutor.java:95)\n\tat org.apache.pinot.controller.api.resources.PinotQueryResource.executeSqlQuery(PinotQueryResource.java:120)\n\tat org.apache.pinot.controller.api.resources.PinotQueryResource.handlePostSql(PinotQueryResource.java:100)",
    "errorCode": 200
  }
]
```
I am also seeing this in the terminal:

```
2022/06/15 02:02:05.604 ERROR [JobDispatcher] [HelixController-pipeline-task-QuickStartCluster-(e322dd58_TASK)] Job configuration is NULL for TaskQueue_SegmentGenerationAndPushTask_Task_SegmentGenerationAndPushTask_cafd03d1-f383-48ba-aea6-bc0a934522db_1655153184751
```
Any ideas what I am doing wrong, or where I can dig in further? I am running this off the latest Docker image for Pinot.
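One thing worth double-checking: the `<` and `>` around the S3 URI in the query look like chat-client link markup rather than part of the path. If they made it into the statement as typed, the S3 filesystem would list a prefix that literally begins with `<`, match no input files, and the controller would then have no task to generate, which could explain the 400 error above. A minimal sketch of stripping such wrappers before pasting (`clean_pasted_uri` is a hypothetical helper for illustration, not a Pinot API):

```python
def clean_pasted_uri(uri: str) -> str:
    """Strip a single pair of surrounding angle brackets, as added by
    chat clients that auto-link URLs (e.g. <s3://bucket/path/>)."""
    uri = uri.strip()
    if uri.startswith("<") and uri.endswith(">"):
        uri = uri[1:-1]
    return uri

print(clean_pasted_uri("<s3://my-bucket/public_data_set/baseballStats/rawdata/>"))
# → s3://my-bucket/public_data_set/baseballStats/rawdata/
```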