# troubleshooting

    Alexander Vivas

    03/04/2021, 3:21 PM
    Hey guys, for some reason every query I send to Pinot returns at most 10 records; it only returns more when I specify a LIMIT. Is there something I have to do to get the full set of records?
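    (For context: Pinot applies a default LIMIT of 10 when a query does not specify one, so the usual fix is an explicit LIMIT. The table name below is illustrative.)
    ```sql
    -- Returns at most 10 rows (implicit default limit)
    SELECT * FROM myTable;
    -- Returns up to 100000 rows
    SELECT * FROM myTable LIMIT 100000;
    ```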

    prabhu om

    03/09/2021, 6:20 AM
    Hi everyone, I am trying to ingest data from a Kerberos-enabled Kafka cluster. Could you please help me with how to pass the Kerberos principal and keytab in streamConfigs?
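    (A minimal sketch of the streamConfigs entries involved, assuming the Kafka 2.0 consumer plugin passes standard Kafka client security properties through to the consumer; the property names follow stock Kafka client config, but the keytab path, principal, and protocol are placeholders to verify against your cluster and Pinot version.)
    ```json
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "security.protocol": "SASL_PLAINTEXT",
      "sasl.mechanism": "GSSAPI",
      "sasl.kerberos.service.name": "kafka",
      "sasl.jaas.config": "com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab=\"/path/to/client.keytab\" principal=\"pinot-client@EXAMPLE.COM\";"
    }
    ```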

    Girish bhat m

    03/09/2021, 11:37 AM
    <!here> I am following the Pinot docs for batch importing from a GCS bucket and am getting the error below:
    ```
    java.lang.IllegalStateException: PinotFS for scheme: gs has not been initialized
    	at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:518) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a7193e941c3]
    	at org.apache.pinot.spi.filesystem.PinotFSFactory.create(PinotFSFactory.java:80) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a7193e941c3]
    	at org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner.init(SegmentGenerationJobRunner.java:94) ~[pinot-batch-ingestion-standalone-0.7.0-SNAPSHOT-shaded.jar:0.7.0-SNAPSHOT-513702582e620829419a93c322740a71
    ```
    I built the source code from the master branch. Is there any configuration required on the classpath?
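    (The error means no PinotFS implementation was registered for the gs scheme. A sketch of the ingestion job spec section that registers one, assuming the pinot-gcs plugin is available in the plugins directory; the projectId and key path are placeholders.)
    ```yaml
    pinotFSSpecs:
      - scheme: gs
        className: org.apache.pinot.plugin.filesystem.GcsPinotFS
        configs:
          projectId: 'my-gcp-project'                  # placeholder
          gcpKey: '/path/to/service-account-key.json'  # placeholder
    ```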

    Kishore G

    03/09/2021, 4:07 PM
    that's right, DISTINCT output represents multiple rows, and the existing post-aggregation functions can only act on one row

    Kishore G

    03/09/2021, 4:11 PM
    please file an issue and we can add some UDFs to support this

    Girish bhat m

    03/10/2021, 6:09 AM
    <!here> In org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner.init(SegmentGenerationJobSpec), the statement below throws the error
    ```
    _inputDirFS = PinotFSFactory.create(_inputDirURI.getScheme());
    ```
    if the scheme is not registered; the schemes are only registered in the run method. Wouldn't it be a good idea to add a method like the one below to the JobRunner's init method?
    ```
    void registerSchemesIfNotRegistered(_spec)
    ```
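    (A hedged sketch of what such a guard could look like, mirroring the registration loop that run() already performs; PinotFSFactory.register and the spec accessors exist in the codebase, but isSchemeSupported and the exact placement are assumptions to verify against the actual sources.)
    ```java
    import org.apache.pinot.spi.env.PinotConfiguration;
    import org.apache.pinot.spi.filesystem.PinotFSFactory;
    import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
    import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;

    // Register every PinotFS scheme declared in the job spec if it is not
    // registered yet, so init() can safely call PinotFSFactory.create().
    static void registerSchemesIfNotRegistered(SegmentGenerationJobSpec spec) {
      if (spec.getPinotFSSpecs() == null) {
        return;
      }
      for (PinotFSSpec pinotFSSpec : spec.getPinotFSSpecs()) {
        // isSchemeSupported(...) is assumed here; if the factory exposes a
        // different lookup, registering unconditionally should be equivalent,
        // since register() simply replaces the existing binding.
        if (!PinotFSFactory.isSchemeSupported(pinotFSSpec.getScheme())) {
          PinotFSFactory.register(pinotFSSpec.getScheme(), pinotFSSpec.getClassName(),
              new PinotConfiguration(pinotFSSpec));
        }
      }
    }
    ```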

    Manish Bhoge

    03/10/2021, 3:07 PM
    I'm trying to set up the Docker image of Pinot, and to do that I'm running the Maven build:
    ```
    # Build Pinot
    $ mvn clean install -DskipTests -Pbin-dist
    ```
    But it is failing with the error below; any idea what is causing it?
    ```
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.2.1:shade (default) on project pinot-yammer: Execution default of goal org.apache.maven.plugins:maven-shade-plugin:3.2.1:shade failed: Plugin org.apache.maven.plugins:maven-shade-plugin:3.2.1 or one of its dependencies could not be resolved: The following artifacts could not be resolved: org.apache.maven.shared:maven-artifact-transfer:jar:0.10.0, org.ow2.asm:asm:jar:7.0: Could not transfer artifact org.apache.maven.shared:maven-artifact-transfer:jar:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Connect to repo.maven.apache.org:443 [repo.maven.apache.org/151.101.12.215] failed: Connection timed out (Connection timed out) -> [Help 1]
    ```

    Josh Highley

    03/10/2021, 3:46 PM
    Using the CLI pinot-admin.sh to start the components, how do I get it to log to the individual log files (pinotServer, pinotBroker, etc.)? I see the files defined in the admin log4j config, but they don't appear to be used.
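    (One commonly suggested approach, sketched here with placeholder paths: point each component's JVM at its own log4j2 file explicitly through JAVA_OPTS, assuming your pinot-admin.sh honors that variable as recent releases do.)
    ```sh
    # Paths and file names are placeholders; adjust to your install layout.
    export JAVA_OPTS="-Dlog4j2.configurationFile=file:/opt/pinot/conf/pinot-server-log4j2.xml"
    bin/pinot-admin.sh StartServer -zkAddress localhost:2181
    ```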

    troywinter

    03/11/2021, 3:57 AM
    Hi team, I'm using Superset to query a Pinot table, but the Explore button is disabled for the Pinot table; other data sources work fine. Does anyone know the reason?

    Alexander Vivas

    03/11/2021, 4:15 PM
    @Daniel Lavoie, yeah... it's a cached schema registry, though I'm not sure it's the source of our error

    troywinter

    03/12/2021, 3:16 AM
    Hi team, when connecting to Pinot using the Python pinotdb driver, how should I route to a different tenant's broker? I have configured tables on different tenants; should I use a different connection string for each tenant?
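    (Since a tenant's tables are only served by that tenant's brokers, the usual pattern is one connection per tenant broker. A minimal sketch with the pinotdb connect API; hostnames, ports, and the table name are placeholders.)
    ```python
    from pinotdb import connect

    # One connection per tenant broker; hostnames/ports are placeholders.
    conn_tenant_a = connect(host="broker-tenant-a", port=8099, path="/query/sql", scheme="http")
    conn_tenant_b = connect(host="broker-tenant-b", port=8099, path="/query/sql", scheme="http")

    cur = conn_tenant_a.cursor()
    cur.execute("SELECT COUNT(*) FROM tableOnTenantA")
    print(cur.fetchall())
    ```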

    Matteo Satnero

    03/12/2021, 11:42 AM
    Hello, I am new to Pinot and Presto, so maybe my question has an obvious explanation I am not seeing. I have a table split into OFFLINE and REALTIME, with a defined key and time column. From the Pinot interface I get 1 record (the OFFLINE one), while from Presto via the DBeaver client I get 2 records (one from each table). Is there something I am missing in the configuration? Setup:
    ```
    upsertConfig mode true in REALTIME
    pinot 0.6.0
    presto 0.247
    dbeaver 21.0.0.202103021012
    ```
    Thank you very much in advance

    Josh Highley

    03/12/2021, 6:38 PM
    when ingesting large amounts of streaming data, is there any mechanism for capturing/notifying about data errors other than looking through the server log files? We're ingesting millions of records as fast as Pinot can ingest them, so looking through logs isn't really practical.

    Elon

    03/13/2021, 3:30 AM
    Hi, we noticed that for a segment on a table 2 out of 3 servers have the same data, but 1 of the servers has less data in the segment. External view == ideal state and in the cluster manager, when I click on the table it says it's in a "good" state. What would cause that? It's an offline table.

    minwoo jung

    03/16/2021, 4:09 AM
    To run ThirdEye locally, I followed the manual below: https://thirdeye.readthedocs.io/en/latest/quick_start.html. I used the master branch, and building ThirdEye was successful.
    1. An error occurs when executing ./run-frontend.sh:
    ```
    Error: Could not find or load main class org.apache.pinot.thirdeye.dashboard.ThirdEyeDashboardApplication
    ```
    I tested the same thing in several environments, and the problem occurred each time.
    2. The same problem occurs when executing the ./run-backend.sh script, though the cause seems to be different: there is no org.apache.pinot.thirdeye.anomaly.ThirdEyeAnomalyApplication class in the jar file. Tell me how to fix it, and I'll send you a PR.

    Ravikumar Maddi

    03/16/2021, 3:03 PM
    Hi All, I have three date columns, so I have written this:
    ```
    "dateTimeFieldSpecs": [
      {
        "name": "_source.startDate",
        "dataType": "STRING",
        "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
        "granularity": "1:DAYS"
      },
      {
        "name": "_source.lastUpdate",
        "dataType": "STRING",
        "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
        "granularity": "1:DAYS"
      },
      {
        "name": "_source.sDate",
        "dataType": "STRING",
        "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
        "granularity": "1:DAYS"
      }
    ]
    ```
    Can you please correct it? I am getting this error:
    ```
    {
      "code": 400,
      "error": "Cannot find valid fieldSpec for timeColumn: timestamp from the table config: eventflow_REALTIME, in the schema: eventflowstats"
    }
    ```
    Need your help 🙂
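    (The error indicates the table config's segmentsConfig.timeColumnName is set to "timestamp", which has no matching fieldSpec in this schema. A hedged fragment of the fix, assuming one of the declared date columns is the intended time column; the eventflow_REALTIME table config itself is not shown in the question.)
    ```json
    "segmentsConfig": {
      "timeColumnName": "_source.startDate",
      "schemaName": "eventflowstats"
    }
    ```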

    jiatao

    03/16/2021, 8:52 PM
    Hi, the "Pinot Quickstart on JDK 15-ea" check is failing for my PR (which only changes log messages). It seems the test is running Java 16 instead of 15: JAVA_HOME_16.0.0_x64=/opt/hostedtoolcache/jdk/16.0.0/x64. Any idea how to fix this? The PR test link for reference: https://github.com/apache/incubator-pinot/pull/6684/checks?check_run_id=2124725931

    Ravikumar Maddi

    03/17/2021, 7:20 AM
    Hi All, I started quick-start-batch instead of quick-start-streaming. How do I stop quick-start-batch, any idea?
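    (The quickstart runs as a single Java process in the foreground, so Ctrl+C stops it; if it was backgrounded, a sketch of finding and stopping it, with an illustrative grep pattern:)
    ```sh
    jps -lm | grep -i quickstart   # find the quickstart JVM's pid
    kill <pid>                     # then stop that pid
    ```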

    Ravikumar Maddi

    03/17/2021, 9:29 AM
    Hi All, I am trying to create a schema, but I am getting an error from this command:
    ```
    bin/pinot-admin.sh AddTable -tableConfigFile $PDATA_HOME/table-config.json -schemaFile schema-config.json -controllerPort 9000 -exec
    ```
    Output:
    ```
    Executing command: AddTable -tableConfigFile table-config.json -schemaFile schema-config.json -controllerProtocol http -controllerHost 172.31.10.219 -controllerPort 9000 -exec
    Sending request: http://172.31.10.219:9000/schemas to controller: localhost, version: Unknown
    Got Exception to upload Pinot Schema: myschema
    org.apache.pinot.common.exception.HttpErrorStatusException: Got error status code: 400 (Bad Request) with reason: "Cannot add invalid schema: myschema. Reason: null" while sending request: http://172.31.10.219:9000/schemas to controller: localhost, version: Unknown
    	at org.apache.pinot.common.utils.FileUploadDownloadClient.sendRequest(FileUploadDownloadClient.java:397) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    ```
    Need Help 🙂 Table Config:
    ```
    {
      "tableName": "mytable",
      "tableType": "REALTIME",
      "segmentsConfig": {
        "timeColumnName": "_source.sDate",
        "timeType": "MILLISECONDS",
        "schemaName": "myschema",
        "replicasPerPartition": "1"
      },
      "tenants": {},
      "tableIndexConfig": {
        "loadMode": "MMAP",
        "streamConfigs": {
          "streamType": "kafka",
          "stream.kafka.consumer.type": "lowlevel",
          "stream.kafka.topic.name": "mytopic",
          "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
          "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
          "stream.kafka.broker.list": "localhost:9876",
          "realtime.segment.flush.threshold.time": "3600000",
          "realtime.segment.flush.threshold.size": "50000",
          "stream.kafka.consumer.prop.auto.offset.reset": "smallest"
        }
      },
      "metadata": {
        "customConfigs": {}
      }
    }
    ```
    And Schema Config:
    ```
    {
      "schemaName": "myschema",
      "eventflow": [
        {
          "name": "_index",
          "dataType": "INT"
        },
        {
          "name": "_type",
          "dataType": "STRING"
        },
        {
          "name": "id",
          "dataType": "INT"
        },
       {
          "name": "_source.madids",
          "datatype": "INT",
          "singleValueField": false
        },
    
      ],
      "dateTimeFieldSpecs": [
        {
          "name": "_source.sDate",
          "dataType": "STRING",
          "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
          "granularity": "1:DAYS"
        }
      ]
    }
    ```
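    (The 400 "Cannot add invalid schema" likely comes from the schema JSON itself: "eventflow" is not a recognized section (dimension columns belong under "dimensionFieldSpecs"), "datatype" must be capitalized as "dataType", and there is a trailing comma before the closing bracket. A hedged corrected sketch:)
    ```json
    {
      "schemaName": "myschema",
      "dimensionFieldSpecs": [
        { "name": "_index", "dataType": "INT" },
        { "name": "_type", "dataType": "STRING" },
        { "name": "id", "dataType": "INT" },
        { "name": "_source.madids", "dataType": "INT", "singleValueField": false }
      ],
      "dateTimeFieldSpecs": [
        {
          "name": "_source.sDate",
          "dataType": "STRING",
          "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
          "granularity": "1:DAYS"
        }
      ]
    }
    ```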

    Matteo Satnero

    03/17/2021, 12:00 PM
    Hello, is there a document that explains the "cutoff" time in detail for data handled by time (and primary key, in this case)? I am asking because it seems I have a record that is present in both OFFLINE and REALTIME (with the same primary key), but when I look for it in the final table I do not find it at all. OFFLINE record TIME 1615891108000 (ms) — max 1615939199415 — min 1612137600000. REALTIME record TIME 1615723114000 (ms) — max 1615981903000 — min 1515496517000. FINAL record not present — max 1615981930000 — min 1612137600000.

    Yash Agarwal

    03/17/2021, 4:29 PM
    Hello, is there any performance difference between the following two queries in Pinot?
    ```
    select distinct city from transactions limit 100000
    select city from transactions group by city limit 100000
    ```

    Qiaochu Liu

    03/17/2021, 5:48 PM
    hello team, I got an error after rebasing onto the latest Pinot master when running mvn test. I think it's related to the org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test plugin; mvn clean install works perfectly. Is there a way to fix this error?
    ```
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project pinot-spi: There are test failures.
    [ERROR] 
    [ERROR] Please refer to /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire-reports for the individual test results.
    [ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
    [ERROR] The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
    [ERROR] Command was /bin/sh -c cd /Users/qiaochu/Fork/incubator-pinot/pinot-spi && /Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -javaagent:/Users/qiaochu/.m2/repository/org/jacoco/org.jacoco.agent/0.7.7.201606060606/org.jacoco.agent-0.7.7.201606060606-runtime.jar=destfile=/Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/jacoco.exec -Xms4g -Xmx4g -jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire/surefirebooter12276158737966275331.jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire 2021-03-17T10-07-59_551-jvmRun1 surefire6422534819305887743tmp surefire_05875304854941226633tmp
    [ERROR] Error occurred in starting fork, check output in log
    [ERROR] Process Exit Code: 134
    [ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
    [ERROR] Command was /bin/sh -c cd /Users/qiaochu/Fork/incubator-pinot/pinot-spi && /Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java -javaagent:/Users/qiaochu/.m2/repository/org/jacoco/org.jacoco.agent/0.7.7.201606060606/org.jacoco.agent-0.7.7.201606060606-runtime.jar=destfile=/Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/jacoco.exec -Xms4g -Xmx4g -jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire/surefirebooter12276158737966275331.jar /Users/qiaochu/Fork/incubator-pinot/pinot-spi/target/surefire 2021-03-17T10-07-59_551-jvmRun1 surefire6422534819305887743tmp surefire_05875304854941226633tmp
    [ERROR] Error occurred in starting fork, check output in log
    [ERROR] Process Exit Code: 134
    [ERROR] 	at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:748)
    [ERROR] 	at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:305)
    [ERROR] 	at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:265)
    [ERROR] 	at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1314)
    [ERROR] 	at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1159)
    [ERROR] 	at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:932)
    [ERROR] 	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
    [ERROR] 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
    [ERROR] 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
    [ERROR] 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
    [ERROR] 	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
    [ERROR] 	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
    [ERROR] 	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
    [ERROR] 	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    [ERROR] 	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
    [ERROR] 	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
    [ERROR] 	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
    [ERROR] 	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
    [ERROR] 	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
    [ERROR] 	at org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
    [ERROR] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [ERROR] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [ERROR] 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [ERROR] 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    [ERROR] 	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
    [ERROR] 	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
    [ERROR] 	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
    [ERROR] 	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
    [ERROR] 
    [ERROR] -> [Help 1]
    [ERROR] 
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR] 
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
    [ERROR] 
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <args> -rf :pinot-spi
    ```

    Phúc Huỳnh

    03/18/2021, 3:00 AM
    hi team. Is there any way to check a minion task's (RealtimeToOfflineSegmentsTask) status or error message? I found warning logs on the broker but cannot find any log information on the minion server. Are these logs related to the task?
    ```
    2021/03/17 09:26:49.639 WARN [TimeBoundaryManager] [HelixTaskExecutor-message_handle_thread] Failed to find segment with valid end time for table: RuleLogsUAT_OFFLINE, no time boundary generated
    2021/03/17 09:27:06.989 WARN [BaseBrokerRequestHandler] [jersey-server-managed-async-executor-0] Failed to find time boundary info for hybrid table: RuleLogsUAT
    ```
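    (For task status, the controller exposes REST endpoints under /tasks; a hedged sketch, with the host, port, and task name as placeholders to verify against your controller's API docs:)
    ```sh
    # List tasks of this type, then fetch one task's state.
    curl http://localhost:9000/tasks/RealtimeToOfflineSegmentsTask/tasks
    curl http://localhost:9000/tasks/task/Task_RealtimeToOfflineSegmentsTask_<timestamp>/state
    ```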

    Ravikumar Maddi

    03/18/2021, 6:38 PM
    kafka-console-producer — not getting any response. I am trying to push to Kafka, but I do not get anything as a response, and the data is not appearing in the consumer console either. Please help me trace this: where can I find the logs, and are there any options I can add to my command to see a more detailed log? I am running this command:
    ```
    bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic opt_flatten_json.json
    ```
    I am only getting this as output:
    ```
    >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
    ```
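    (A likely cause, offered as a guess from the command shown: without a < redirect, kafka-console-producer ignores the trailing file name and reads from stdin, which is why only > prompts appear. The corrected invocation:)
    ```sh
    bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic < opt_flatten_json.json
    ```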

    Matt

    03/19/2021, 1:55 AM
    Hello, I have a JSON data column (log) and want to extract values based on keys (urlpath). I tried to use a JSON index, but it fails during parsing. So I ingested it as a normal string and tried JSONEXTRACTSCALAR/json_extract_scalar, but this also fails during parsing. Finally I ended up using a Groovy function like
    ```
    GROOVY('{"returnType": "STRING", "isSingleValue": true}', 'java.util.regex.Pattern p = java.util.regex.Pattern.compile("(\"urlpath\":\")((?:\\\"|[^\\\"]*))"); java.util.regex.Matcher m = p.matcher(arg0); if(m.find()){ return m.group(2); } else { return "";}', log)
    ```
    and this works in SQL. Now I want to add this Groovy function to the table config as an ingestion transform that defines a new column. Is this possible? For an ingestion transform, can we use a multi-line, semicolon-separated script?
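    (Ingestion-time transforms live under ingestionConfig.transformConfigs in the table config, and the documented Groovy({script}, arguments...) form accepts arbitrary scripts. A hedged sketch reusing the idea above with a simpler Groovy regex; the column names come from the question, and the JSON escaping may need adjusting:)
    ```json
    "ingestionConfig": {
      "transformConfigs": [
        {
          "columnName": "urlpath",
          "transformFunction": "Groovy({def m = (log =~ /\"urlpath\":\"([^\"]*)/); m ? m[0][1] : ''}, log)"
        }
      ]
    }
    ```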

    Ravikumar Maddi

    03/19/2021, 5:15 AM
    Hi All, I am trying to ingest data through Kafka from a JSON file, running this command:
    ```
    bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic < $PDATA_HOME/opt_flatten_json.json
    ```
    But I am getting this error:
    ```
    Exception while executing a state transition task mystats__0__0__20210319T0430Z
    java.lang.reflect.InvocationTargetException: null
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_282]
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_282]
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_282]
    	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_282]
    	at org.apache.helix.messaging.handling.HelixStateTransitionHandler.invoke(HelixStateTransitionHandler.java:404) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.helix.messaging.handling.HelixStateTransitionHandler.handleMessage(HelixStateTransitionHandler.java:331) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.helix.messaging.handling.HelixTask.call(HelixTask.java:97) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.helix.messaging.handling.HelixTask.call(HelixTask.java:49) [pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_282]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_282]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_282]
    	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_282]
    Caused by: java.lang.OutOfMemoryError: Direct buffer memory
    	at java.nio.Bits.reserveMemory(Bits.java:695) ~[?:1.8.0_282]
    	at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123) ~[?:1.8.0_282]
    	at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311) ~[?:1.8.0_282]
    	at org.apache.pinot.core.segment.memory.PinotByteBuffer.allocateDirect(PinotByteBuffer.java:39) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.segment.memory.PinotDataBuffer.allocateDirect(PinotDataBuffer.java:116) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.io.writer.impl.DirectMemoryManager.allocateInternal(DirectMemoryManager.java:53) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.io.readerwriter.RealtimeIndexOffHeapMemoryManager.allocate(RealtimeIndexOffHeapMemoryManager.java:79) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.realtime.impl.forward.FixedByteMVMutableForwardIndex.addDataBuffer(FixedByteMVMutableForwardIndex.java:162) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.realtime.impl.forward.FixedByteMVMutableForwardIndex.<init>(FixedByteMVMutableForwardIndex.java:137) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.indexsegment.mutable.MutableSegmentImpl.<init>(MutableSegmentImpl.java:307) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.data.manager.realtime.LLRealtimeSegmentDataManager.<init>(LLRealtimeSegmentDataManager.java:1270) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.core.data.manager.realtime.RealtimeTableDataManager.addSegment(RealtimeTableDataManager.java:324) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.server.starter.helix.HelixInstanceDataManager.addRealtimeSegment(HelixInstanceDataManager.java:132) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.server.starter.helix.SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel.onBecomeOnlineFromOffline(SegmentOnlineOfflineStateModelFactory.java:164) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	at org.apache.pinot.server.starter.helix.SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel.onBecomeConsumingFromOffline(SegmentOnlineOfflineStateModelFactory.java:88) ~[pinot-all-0.7.0-SNAPSHOT-jar-with-dependencies.jar:0.7.0-SNAPSHOT-d87755899eccba3554e9cc39a1439d5ecb53aaac]
    	... 12 more
    Default rollback method invoked on error. Error Code: ERROR
    Message execution failed. msgId: eed5b297-ea20-437e-a0b5-ad4d0be75c3c, errorMsg: java.lang.reflect.InvocationTargetException
    Skip internal error. errCode: ERROR, errMsg: null
    Event bcbad381_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
    Event c910d226_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
    Event d194950f_DEFAULT : Unable to find a next state for resource: mystats_REALTIME partition: mystats__0__0__20210319T0430Z from stateModelDefinitionclass org.apache.helix.model.StateModelDefinition from:ERROR to:CONSUMING
    ```
    Need Help 🙂

    Ravikumar Maddi

    03/19/2021, 11:26 AM
    Hi Team, data is not appearing in the Pinot Query Console. I am pushing data to Pinot through Kafka with the command
    ```
    bin/kafka-console-producer.sh --broker-list localhost:19092 --topic mytopic < $PDATA_HOME/data.json
    ```
    I checked all the logs and there are no exceptions, but my data is not appearing in the query tool. Need Help 🙂 My schema looks like this:
    ```
    {
      "schemaName": "eventflowstats",
      "dimensionFieldSpecs": [
        {
          "name": "_index",
          "dataType": "STRING"
        },
        {
          "name": "_type",
          "dataType": "STRING",
          "maxLength": 5
        },
        {
          "name": "_id",
          "dataType": "STRING"
        },
         {
          "name": "_source.aExpIds",
          "dataType": "INT",
          "singleValueField": false
        }
    	]
      "dateTimeFieldSpecs": [
        {
          "name": "_source.sDate",
          "dataType": "LONG",
          "format": "1:SECONDS:SIMPLE_DATE_FORMAT:SECONDS:SIMPLE_DATE_FORMAT",
          "granularity": "1:DAYS"
        }
    	]
    }
    ```
    My table config looks like this:
    ```
    {
      "tableName": "mytable",
      "tableType": "REALTIME",
      "tenants": {},
      "segmentsConfig": {
        "timeColumnName": "_source.sDate",
        "timeType": "MILLISECONDS",
        "segmentPushType": "APPEND",
        "replicasPerPartition": "1",
        "retentionTimeUnit": "DAYS",
        "retentionTimeValue": "1"
      },
      "tableIndexConfig": {
        "loadMode": "MMAP",
        "streamConfigs": {
          "streamType": "kafka",
          "stream.kafka.consumer.type": "lowLevel",
          "stream.kafka.topic.name": "mytopic",
          "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder",
          "stream.kafka.hlc.zk.connect.string": "localhost:2191/kafka",
          "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
          "stream.kafka.zk.broker.url": "localhost:2191/kafka",
          "stream.kafka.broker.list": "localhost:19092"
        }
      },
      "metadata": {
        "customConfigs": {}
      }
    }
    ```
    And the data looks like this:
    ```
    {
      "_index": "dhfkdfkdsjfk",
      "_type": "_doc",
      "_id": "68767677989hjhjkhkjh",
      "_source.aExpIds": [
        815850,
        815857,
        821331
      ],
      "_source.sDate": "2021-01-04 00:00:00"
    }
    ```
    I checked all the logs and did not find any exceptions, but the data is not appearing in the Pinot controller portal.
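    (Two things in the schema above stand out, offered as guesses: a comma is missing between the closing bracket of dimensionFieldSpecs and "dateTimeFieldSpecs", which makes the JSON invalid, and the _source.sDate spec declares dataType LONG with a malformed format string even though the data carries strings like "2021-01-04 00:00:00". A hedged corrected fragment; also note the table config says timeType MILLISECONDS while this format is in SECONDS:)
    ```json
    "dateTimeFieldSpecs": [
      {
        "name": "_source.sDate",
        "dataType": "STRING",
        "format": "1:SECONDS:SIMPLE_DATE_FORMAT:yyyy-MM-dd HH:mm:ss",
        "granularity": "1:DAYS"
      }
    ]
    ```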

    ayush sharma

    03/19/2021, 3:16 PM
    Hi everyone, I am facing an issue while ingesting batch data into Pinot. The command to ingest the data executes successfully:
    ```
    $ pinot-admin.sh LaunchDataIngestionJob -jobSpecFile /home/ayush/ayush_workspace/iVoyant/analytics/data/hospital_data/job-spec.yml
    .....
    Pushing segment: hospital to location: http://localhost:9000 for table hospital
    Sending request: http://localhost:9000/v2/segments?tableName=hospital to controller: 4cb684aaf215, version: Unknown
    Response for pushing table hospital segment hospital to location http://localhost:9000 - 200: {"status":"Successfully uploaded segment: hospital of table: hospital"}
    ```
    But the table status in the UI turns BAD. Here is the error logged in pinot-server:
    ```
    2021/03/19 15:03:24.082 ERROR [SegmentOnlineOfflineStateModelFactory$SegmentOnlineOfflineStateModel] [HelixTaskExecutor-message_handle_thread] Caught exception in state transition from OFFLINE -> ONLINE for resource: hospital_OFFLINE, partition: hospital
    java.lang.IllegalStateException: Key separator not found: APR, segment: /tmp/pinotServerData/hospital_OFFLINE/hospital/v3
    at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
    ```
    Any idea what could be wrong here? I have attached the error log. Any help is appreciated!
    err.txt

    Matt

    03/20/2021, 12:40 AM
    Hello, just wondering, is it normal for MMAP usage to go very high? Also, does this mean I need ~1.5 TB of free space to hold the MMAP?

    Sameera

    03/20/2021, 7:01 PM
    Hey everyone 👋 I'm new here, still doing the installation of Pinot on my Windows-based machine, and I'm getting this error at the very final step; do you have any idea what could be wrong? Thanks in advance! After building Pinot successfully, I tried to execute bin/quick-start-batch.sh, and I get the following: