Alexander Vivas
03/04/2021, 3:21 PM
prabhu om
03/09/2021, 6:20 AM
Girish bhat m
03/09/2021, 11:37 AM
java.lang.IllegalStateException: PinotFS for scheme: gs has not been initialized
at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:518) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a7193e941c3]
at org.apache.pinot.spi.filesystem.PinotFSFactory.create(PinotFSFactory.java:80) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a7193e941c3]
at org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner.init(SegmentGenerationJobRunner.java:94) ~[pinot-batch-ingestion-standalone-0.7.0-SNAPSHOT-shaded.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a71
I took the master branch and built the source code. Is there any configuration required in the classpath?
Kishore G
Kishore G
Girish bhat m
03/10/2021, 6:09 AM
_inputDirFS = PinotFSFactory.create(_inputDirURI.getScheme());
The scheme is not loaded here, and schemes are registered in the run method. Wouldn't it be good to add a method like the one below to the init method of the JobRunner?
void registerSchemesIfNotRegistered(_spec)
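The guard proposed above can be sketched generically. This is not Pinot's actual PinotFSFactory API; the class, method, and Supplier-based factory below are hypothetical stand-ins for the idea of registering a scheme idempotently so that init() never hits an unregistered scheme:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Hedged sketch, not Pinot code: register a filesystem factory per URI
// scheme only if one is not already registered, so callers can safely
// invoke this before create(scheme).
public class SchemeRegistry {
    private static final Map<String, Supplier<String>> REGISTRY = new ConcurrentHashMap<>();

    // putIfAbsent keeps the first registration, making repeated calls safe.
    public static void registerSchemeIfNotRegistered(String scheme, Supplier<String> factory) {
        REGISTRY.putIfAbsent(scheme, factory);
    }

    public static String create(String scheme) {
        Supplier<String> factory = REGISTRY.get(scheme);
        if (factory == null) {
            // Mirrors the IllegalStateException reported at the top of the thread.
            throw new IllegalStateException("PinotFS for scheme: " + scheme + " has not been initialized");
        }
        return factory.get();
    }
}
```

Calling registerSchemeIfNotRegistered from init(), before create(), would avoid the ordering dependency on the run method.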
Manish Bhoge
03/10/2021, 3:07 PM
# Build Pinot
$ mvn clean install -DskipTests -Pbin-dist
But it is failing with an error. Any idea about the error below?
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.2.1:shade (default) on project pinot-yammer: Execution default of goal org.apache.maven.plugins:maven-shade-plugin:3.2.1:shade failed: Plugin org.apache.maven.plugins:maven-shade-plugin:3.2.1 or one of its dependencies could not be resolved: The following artifacts could not be resolved: org.apache.maven.shared:maven-artifact-transfer:jar:0.10.0, org.ow2.asm:asm:jar:7.0: Could not transfer artifact org.apache.maven.shared:maven-artifact-transfer:jar:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Connect to repo.maven.apache.org:443 [repo.maven.apache.org/151.101.12.215] failed: Connection timed out (Connection timed out) -> [Help 1]
Josh Highley
03/10/2021, 3:46 PM
troywinter
03/11/2021, 3:57 AM
Alexander Vivas
03/11/2021, 4:15 PM
troywinter
03/12/2021, 3:16 AM
Matteo Satnero
03/12/2021, 11:42 AM
upsertConfig mode true in REALTIME
pinot 0.6.0
presto 0.247
dbeaver 21.0.0.202103021012
Thank you very much in advance
Josh Highley
03/12/2021, 6:38 PM
Elon
03/13/2021, 3:30 AM
minwoo jung
03/16/2021, 4:09 AM
./run-frontend.sh
Error: Could not find or load main class org.apache.pinot.thirdeye.dashboard.ThirdEyeDashboardApplication
I tested this in several environments, and the same problem occurred.
### 2
The same problem occurs when executing the "./run-backend.sh" script. However, the cause seems to be different: there is no org.apache.pinot.thirdeye.anomaly.ThirdEyeAnomalyApplication class in the jar file.
Tell me how to fix it, and I'll send you a PR.
Ravikumar Maddi
03/16/2021, 3:03 PM
"dateTimeFieldSpecs": [
{
"name": "_source.startDate",
"dataType": "STRING",
"format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
"granularity": "1:DAYS"
},
{
"name": "_source.lastUpdate",
"dataType": "STRING",
"format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
"granularity": "1:DAYS"
},
{
"name": "_source.sDate",
"dataType": "STRING",
"format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
"granularity": "1:DAYS"
}
]
Can you please correct this?
I am getting this error:
{
"code": 400,
"error": "Cannot find valid fieldSpec for timeColumn: timestamp from the table config: eventflow_REALTIME, in the schema: eventflowstats"
}
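A hedged note on the error above (not from the thread itself): this 400 usually means the table config's segmentsConfig.timeColumnName, "timestamp" here, does not match any field in the schema; it must equal one of the dateTimeFieldSpecs names, such as "_source.startDate". The trailing part of each "1:SECONDS:SIMPLE_DATE_FORMAT:<pattern>" format string is a plain Java SimpleDateFormat pattern, so the patterns themselves can be sanity-checked locally:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

// Check that a SimpleDateFormat pattern (the tail of a Pinot
// "1:SECONDS:SIMPLE_DATE_FORMAT:<pattern>" format string) parses a
// sample value from the data.
public class TimeFormatCheck {
    public static boolean parses(String pattern, String value) {
        SimpleDateFormat fmt = new SimpleDateFormat(pattern);
        fmt.setLenient(false); // reject sloppy matches
        try {
            fmt.parse(value);
            return true;
        } catch (ParseException e) {
            return false;
        }
    }
}
```

If the patterns parse, the remaining suspect is the timeColumn name mismatch between the table config and the schema.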
Need your help 🙂
jiatao
03/16/2021, 8:52 PM
Pinot Quickstart on JDK 15-ea
is failing for my PR (which only changes log messages). It seems the test is running Java 16 instead of 15: JAVA_HOME_16.0.0_x64=/opt/hostedtoolcache/jdk/16.0.0/x64.
Any idea how to fix this?
The PR test link for reference: https://github.com/apache/incubator-pinot/pull/6684/checks?check_run_id=2124725931
Ravikumar Maddi
03/17/2021, 7:20 AM
Ravikumar Maddi
03/17/2021, 9:29 AM
{
"tableName": "mytable",
"tableType": "REALTIME",
"segmentsConfig": {
"timeColumnName": "_source.sDate",
"timeType": "MILLISECONDS",
"schemaName": "myschema",
"replicasPerPartition": "1"
},
"tenants": {},
"tableIndexConfig": {
"loadMode": "MMAP",
"streamConfigs": {
"streamType": "kafka",
"stream.kafka.consumer.type": "lowlevel",
"stream.kafka.topic.name": "mytopic",
"stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
"stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
"stream.kafka.broker.list": "localhost:9876",
"realtime.segment.flush.threshold.time": "3600000",
"realtime.segment.flush.threshold.size": "50000",
"stream.kafka.consumer.prop.auto.offset.reset": "smallest"
}
},
"metadata": {
"customConfigs": {}
}
}
And
Schema Config:
{
"schemaName": "myschema",
"eventflow": [
{
"name": "_index",
"dataType": "INT"
},
{
"name": "_type",
"dataType": "STRING"
},
{
"name": "id",
"dataType": "INT"
},
{
"name": "_source.madids",
"dataType": "INT",
"singleValueField": false
}
],
"dateTimeFieldSpecs": [
{
"name": "_source.sDate",
"dataType": "STRING",
"format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
"granularity": "1:DAYS"
}
]
}
Matteo Satnero
03/17/2021, 12:00 PM
Yash Agarwal
03/17/2021, 4:29 PM
select distinct city from transactions limit 100000
select city from transactions group by city limit 100000
Qiaochu Liu
03/17/2021, 5:48 PM
mvn test
I think it’s related to the dependency: org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test
mvn clean install works perfectly. Is there a solution to fix this error?
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project pinot-spi: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /Users/qiaochu/Fork/incubator-pinot/pinot-spi && /Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -javaagent:/Users/qiaochu/.m2/repository/org/jacoco/org.jacoco.agent/0.7.7.201606060606/org.jacoco.agent-0.7.7.201606060606-runtime.jar=destfile=/Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/jacoco.exec -Xms4g -Xmx4g -jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire/surefirebooter12276158737966275331.jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire 2021-03-17T10-07-59_551-jvmRun1 surefire6422534819305887743tmp surefire_05875304854941226633tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 134
[ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /Users/qiaochu/Fork/incubator-pinot/pinot-spi && /Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -javaagent:/Users/qiaochu/.m2/repository/org/jacoco/org.jacoco.agent/0.7.7.201606060606/org.jacoco.agent-0.7.7.201606060606-runtime.jar=destfile=/Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/jacoco.exec -Xms4g -Xmx4g -jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire/surefirebooter12276158737966275331.jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire 2021-03-17T10-07-59_551-jvmRun1 surefire6422534819305887743tmp surefire_05875304854941226633tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 134
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:748)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:305)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:265)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1314)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1159)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:932)
[ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
[ERROR] at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
[ERROR] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
[ERROR] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
[ERROR] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
[ERROR] at org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] <http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException>
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :pinot-spi
Phúc Huỳnh
03/18/2021, 3:00 AM
How can I check the RealtimeToOfflineSegmentsTask status or error message?
I found a WARN log on the broker server, but cannot find any log information on the minion server.
Is this log related to this task?
2021/03/17 09:26:49.639 WARN [TimeBoundaryManager] [HelixTaskExecutor-message_handle_thread] Failed to find segment with valid end time for table: RuleLogsUAT_OFFLINE, no time boundary generated
2021/03/17 09:27:06.989 WARN [BaseBrokerRequestHandler] [jersey-server-managed-async-executor-0] Failed to find time boundary info for hybrid table: RuleLogsUAT
Ravikumar Maddi
03/18/2021, 6:38 PM
bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic opt_flatten_json.json
I am getting only this output:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Matt
03/19/2021, 1:55 AM
I have a column log and want to extract values based on keys (urlpath). So I tried to use a JSON index, but it fails during parsing. I then ingested it as a normal string and tried JSONEXTRACTSCALAR/json_extract_scalar, but this also fails during parsing. Finally I ended up using a Groovy function like GROOVY('{"returnType": "STRING", "isSingleValue": true}', 'java.util.regex.Pattern p = java.util.regex.Pattern.compile("(\"urlpath\":\")((?:\\\"|[^\\\"]*))"); java.util.regex.Matcher m = p.matcher(arg0); if(m.find()){ return m.group(2); } else { return "";}', log), and this works in SQL. Now I want to add this Groovy function inside the table config as an ingestion transform to define a new columnName. Is this possible? For an ingestion transform, can we use a multi-line, semicolon-separated script?
Ravikumar Maddi
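Regarding the Groovy transform above: it is doing plain java.util.regex extraction, which can be reproduced in Java (simplified here, without the escaped-quote handling of the original pattern) as:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hedged sketch of what the Groovy snippet does: pull the value of the
// "urlpath" key out of a raw log string with a regex, returning "" when
// the key is absent.
public class UrlPathExtractor {
    private static final Pattern URLPATH = Pattern.compile("\"urlpath\":\"([^\"]*)\"");

    public static String extract(String log) {
        Matcher m = URLPATH.matcher(log);
        return m.find() ? m.group(1) : "";
    }
}
```

On the config side, Pinot table configs do accept Groovy expressions in ingestion transforms (ingestionConfig.transformConfigs), but whether a multi-line, semicolon-separated script is accepted there should be verified against the docs for your Pinot version.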
03/19/2021, 5:15 AM
bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic < $PDATA_HOME/opt_flatten_json.json
But I am getting an error:
Exception while executing a state transition task mystats__0__0__20210319T0430Z
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_282]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_282]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_282]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_282]
at org.apache.helix.messaging.handling.HelixStateTransitionHandler.invoke(HelixStateTransitionHandler.java:404) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.helix.messaging.handling.HelixStateTransitionHandler.handleMessage(HelixStateTransitionHandler.java:331) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.helix.messaging.handling.HelixTask.call(HelixTask.java:97) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.helix.messaging.handling.HelixTask.call(HelixTask.java:49) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_282]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_282]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_282]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_282]
Caused by: java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:695) ~[?:1.8.0_282]
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123) ~[?:1.8.0_282]
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311) ~[?:1.8.0_282]
at org.apache.pinot.core.segment.memory.PinotByteBuffer.allocateDirect(PinotByteBuffer.java:39) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.segment.memory.PinotDataBuffer.allocateDirect(PinotDataBuffer.java:116) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.io.writer.impl.DirectMemoryManager.allocateInternal(DirectMemoryManager.java:53) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.io.readerwriter.RealtimeIndexOffHeapMemoryManager.allocate(RealtimeIndexOffHeapMemoryManager.java:79) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.realtime.impl.forward.FixedByteMVMutableForwardIndex.addDataBuffer(FixedByteMVMutableForwardIndex.java:162) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.realtime.impl.forward.FixedByteMVMutableForwardIndex.<init>(FixedByteMVMutableForwardIndex.java:137) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.indexsegment.mutable.MutableSegmentImpl.<init>(MutableSegmentImpl.java:307) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.data.manager.realtime.LLRealtimeSegmentDataManager.<init>(LLRealtimeSegmentDataManager.java:1270) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.core.data.manager.realtime.RealtimeTableDataManager.addSegment(RealtimeTableDataManager.java:324) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.server.starter.helix.HelixInstanceDataManager.addRealtimeSegment(HelixInstanceDataManager.java:132) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.server.starter.helix.SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel.onBecomeOnlineFromOffline(SegmentOnlineOfflineStateModelFactory.java:164) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
at org.apache.pinot.server.starter.helix.SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel.onBecomeConsumingFromOffline(SegmentOnlineOfflineStateModelFactory.java:88) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
... 12 more
Default rollback method invoked on error. Error Code: ERROR
Message execution failed. msgId: eed5b297-ea20-437e-a0b5-ad4d0be75c3c, errorMsg: java.lang.reflect.InvocationTargetException
Skip internal error. errCode: ERROR, errMsg: null
Event bcbad381_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
Event c910d226_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
Event d194950f_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
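A hedged note on the trace above: the root cause is java.lang.OutOfMemoryError: Direct buffer memory. The realtime multi-value forward index allocates off-heap direct buffers, and the JVM's direct-memory pool, capped by -XX:MaxDirectMemorySize (which defaults to roughly the heap size), ran out. Raising that flag on the Pinot server JVM, or lowering realtime.segment.flush.threshold.size, is the usual remedy. A minimal illustration of where this memory comes from (not Pinot code):

```java
import java.nio.ByteBuffer;

// Direct buffers are allocated outside the Java heap, from a pool capped
// by -XX:MaxDirectMemorySize; exhausting that pool throws the
// "Direct buffer memory" OutOfMemoryError seen in the stack trace above.
public class DirectBufferDemo {
    public static int allocate(int bytes) {
        ByteBuffer buf = ByteBuffer.allocateDirect(bytes);
        return buf.capacity();
    }
}
```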
Need help 🙂
Ravikumar Maddi
03/19/2021, 11:26 AM
{
"schemaName": "eventflowstats",
"dimensionFieldSpecs": [
{
"name": "_index",
"dataType": "STRING"
},
{
"name": "_type",
"dataType": "STRING",
"maxLength": 5
},
{
"name": "_id",
"dataType": "STRING"
},
{
"name": "_source.aExpIds",
"dataType": "INT",
"singleValueField": false
}
],
"dateTimeFieldSpecs": [
{
"name": "_source.sDate",
"dataType": "LONG",
"format": "1:SECONDS:SIMPLE_DATE_FORMAT:SECONDS:SIMPLE_DATE_FORMAT",
"granularity": "1:DAYS"
}
]
}
My Table Config like this:
{
"tableName": "mytable",
"tableType": "REALTIME",
"tenants": {},
"segmentsConfig": {
"timeColumnName": "_source.sDate",
"timeType": "MILLISECONDS",
"segmentPushType": "APPEND",
"replicasPerPartition": "1",
"retentionTimeUnit": "DAYS",
"retentionTimeValue": "1"
},
"tableIndexConfig": {
"loadMode": "MMAP",
"streamConfigs": {
"streamType": "kafka",
"stream.kafka.consumer.type": "lowLevel",
"stream.kafka.topic.name": "mytopic",
"stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
"stream.kafka.hlc.zk.connect.string": "localhost:2191/kafka",
"stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
"stream.kafka.zk.broker.url": "localhost:2191/kafka",
"stream.kafka.broker.list": "localhost:19092"
}
},
"metadata": {
"customConfigs": {}
}
}
And Data like this:
{
"_index": "dhfkdfkdsjfk",
"_type": "_doc",
"_id": "68767677989hjhjkhkjh",
"_source.aExpIds": [
815850,
815857,
821331
],
"_source.sDate": "2021-01-04 00:00:00"
}
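A hedged observation on the configs above: the schema declares "_source.sDate" as LONG with a format string that does not look valid ("1:SECONDS:SIMPLE_DATE_FORMAT:SECONDS:SIMPLE_DATE_FORMAT"), while the payload carries "yyyy-MM-dd HH:mm:ss" strings, which may be why no rows appear. One option is dataType STRING with format "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss"; another is converting the value to epoch millis before ingestion, roughly:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

// Sketch: convert the payload's "yyyy-MM-dd HH:mm:ss" strings to epoch
// millis, which a LONG time column with format "1:MILLISECONDS:EPOCH"
// could ingest directly. The UTC time zone here is an assumption.
public class SDateToMillis {
    public static long toEpochMillis(String sDate) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        try {
            return fmt.parse(sDate).getTime();
        } catch (ParseException e) {
            throw new IllegalArgumentException("Unparseable date: " + sDate, e);
        }
    }
}
```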
I checked all the logs and did not find any exceptions, but data is not appearing in the Pinot controller portal.
ayush sharma
03/19/2021, 3:16 PM
$ pinot-admin.sh LaunchDataIngestionJob -jobSpecFile /home/ayush/ayush_workspace/iVoyant/analytics/data/hospital_data/job-spec.yml
.....
Pushing segment: hospital to location: <http://localhost:9000> for table hospital
Sending request: <http://localhost:9000/v2/segments?tableName=hospital> to controller: 4cb684aaf215, version: Unknown
Response for pushing table hospital segment hospital to location <http://localhost:9000> - 200: {"status":"Successfully uploaded segment: hospital of table: hospital"}
But the table status on the UI turns BAD.
Here is the Error logged in pinot-server:
2021/03/19 15:03:24.082 ERROR [SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel] [HelixTaskExecutor-message_handle_thread] Caught exception in state transition from OFFLINE -> ONLINE for resource: hospital_OFFLINE, partition: hospital
java.lang.IllegalStateException: Key separator not found: APR, segment: /tmp/pinotServerData/hospital_OFFLINE/hospital/v3
at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
Any idea what could be wrong here?
I have attached the error log. Any help is appreciated!
Matt
03/20/2021, 12:40 AM
Sameera
03/20/2021, 7:01 PM