Danko Andruszkiw
06/12/2023, 1:55 PM
# Deep Store = WIP
realtime.segment.serverUploadToDeepStore=true
pinot.server.instance.segment.store.uri=<URI of segment store>   # e.g. hdfs://hdfs-svr01/data/hdfs/nn — or does this need to be local?
pinot.server.instance.enable.split.commit=true
pinot.server.storage.factory.class.hdfs=org.apache.pinot.plugin.filesystem.HadoopPinotFS
pinot.server.storage.factory.hdfs.hadoop.conf.path=/etc/hadoop/conf   # does this mean Hadoop (or the Hadoop client) must be installed on each server node?
pinot.server.segment.fetcher.protocols=file,http,hdfs
pinot.server.segment.fetcher.hdfs.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher
pinot.server.segment.fetcher.hdfs.hadoop.kerberos.principle=<your kerberos principal>   # only needed if Hadoop security is enabled?
pinot.server.segment.fetcher.hdfs.hadoop.kerberos.keytab=<your kerberos keytab>   # only needed if Hadoop security is enabled?
pinot.server.grpc.enable=true
pinot.server.grpc.port=8090
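For reference, the Pinot deep-store docs pair these server properties with a controller-side block that points controller.data.dir at the same deep-store URI. A sketch along the lines of the docs (all values below are placeholders, not a verified config):
# Controller-side counterpart (placeholder paths)
controller.data.dir=hdfs://hdfs-svr01/data/hdfs/nn
controller.local.temp.dir=/tmp/pinot/controller
controller.enable.split.commit=true
pinot.controller.storage.factory.class.hdfs=org.apache.pinot.plugin.filesystem.HadoopPinotFS
pinot.controller.storage.factory.hdfs.hadoop.conf.path=/etc/hadoop/conf
pinot.controller.segment.fetcher.protocols=file,http,hdfs
pinot.controller.segment.fetcher.hdfs.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher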
abhinav wagle
06/12/2023, 7:22 PM
When I drop an instance on the UI, I see the error below:
Failed to drop instance. Failed to drop instance Server_pinot-dev-server-19 - Instance Server_pinot-dev-server-19.pinot-dev-server-headless.de-nrt-pinot.svc.cluster.local_8098 exists in ideal state for table1_OFFLINE
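The error says the instance is still referenced in the ideal state of table1_OFFLINE, so it cannot be dropped yet. A hedged sketch of the usual sequence (the controller address is an assumption): untag the instance, rebalance the table off it, then drop it.
# 1. Remove the server's tags so no new segments land on it (empty tag list is an assumption; an unused tenant tag also works)
curl -X PUT "http://pinot-controller:9000/instances/Server_pinot-dev-server-19.pinot-dev-server-headless.de-nrt-pinot.svc.cluster.local_8098/updateTags?tags="
# 2. Rebalance the table so its segments move to the remaining servers
curl -X POST "http://pinot-controller:9000/tables/table1/rebalance?type=OFFLINE"
# 3. Once the instance no longer appears in the ideal state, the drop should succeed
curl -X DELETE "http://pinot-controller:9000/instances/Server_pinot-dev-server-19.pinot-dev-server-headless.de-nrt-pinot.svc.cluster.local_8098"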
parth
06/12/2023, 9:04 PM
abhinav wagle
06/13/2023, 2:01 AM
Ehsan Irshad
06/13/2023, 3:20 AM
Deena Dhayalan
06/13/2023, 11:14 AM
Tommaso Peresson
06/13/2023, 5:03 PM
Added or replaced segment: 265f53d53e8fe849ea0b245c6ed3ae5d_gz of table: TEST_HOURLY_OFFLINE
# There is insufficient memory for the Java Runtime Environment to continue
# Native memory allocation (malloc) failed to allocate 2097152 bytes for AllocateHeap
JAVA_OPTS for the servers are: -Xms15G -Xmx30G -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:ActiveProcessorCount=40 -Djute.maxbuffer=10000000 -Xlog:gc*:file=/opt/pinot/gc-pinot-server.log
I'm running Pinot in a GKE cluster.
This happens when the server tries to bring the OFFLINE segments back ONLINE. My total number of segments is around 100k × 2.
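One hedged reading: with -Xms15G but -Xmx30G, the heap is committed lazily, so the native malloc failure surfaces mid-load as the ~200k segments come ONLINE and the pod's memory limit is hit. A sketch of more container-friendly flags (the sizes are assumptions to tune against the pod limit, not a recommendation):
# Sketch: reserve the heap up front and cap off-heap direct buffers (sizes assumed)
-Xms30G -Xmx30G -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:ActiveProcessorCount=40
-XX:MaxDirectMemorySize=8G -Djute.maxbuffer=10000000 -Xlog:gc*:file=/opt/pinot/gc-pinot-server.log
Setting -Xms equal to -Xmx makes the reservation fail fast at startup rather than during segment loading, and capping direct memory bounds the off-heap buffers so the JVM's total footprint is more predictable inside a container.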
Ashwin Raja
06/13/2023, 7:21 PM
{
  "name": "blockTimestamp",
  "encodingType": "DICTIONARY",
  "indexType": "TIMESTAMP",
  "indexTypes": [
    "TIMESTAMP"
  ],
  "timestampConfig": {
    "granularities": [
      "SECOND",
      "MINUTE",
      "HOUR",
      "DAY",
      "WEEK",
      "MONTH",
      "YEAR"
    ]
  },
  "indexes": null
},
What we're trying to do is count the number of rows in a single day, in our case 2021-05-11.
So we do a query like this, with a date range:
select $segmentName, DATETRUNC('day', "blockTimestamp") "blockTimestamp_day", count("transactionFrom") "Count Transaction From Address"
from "13059af7-8eab-4196-a7ea-1a170d73c02e"
where blockTimestamp_day >= fromDateTime('2021-05-11', 'yyyy-MM-dd') and blockTimestamp_day < fromDateTime('2021-05-12', 'yyyy-MM-dd')
group by "blockTimestamp_day", $segmentName
order by "blockTimestamp_day"
We get these results, which look correct:
13059af7-8eab-4196-a7ea-1a170d73c02e_OFFLINE_1620454828000_1620702391000_61_2296cf5b-2896-4ff8-bb57-0405bd69a7dc 2021-05-11 00:00:00.0 34699
13059af7-8eab-4196-a7ea-1a170d73c02e_OFFLINE_1620702401000_1620964585000_62_9f7f4c0f-5d06-4909-99c0-7c009ff53385 2021-05-11 00:00:00.0 247050
But if instead we change this to an = query, which should return exactly the same results, we only pick up *1 segment*:
13059af7-8eab-4196-a7ea-1a170d73c02e_OFFLINE_1620702401000_1620964585000_62_9f7f4c0f-5d06-4909-99c0-7c009ff53385 2021-05-11 00:00:00.0 247050
Clearly the first query is correct and the second is incorrect, since we're not picking up segments/values that we should be?
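(The = variant isn't quoted above; presumably it swaps the range predicate for an equality on the truncated day, something like this reconstruction:)
select $segmentName, DATETRUNC('day', "blockTimestamp") "blockTimestamp_day", count("transactionFrom") "Count Transaction From Address"
from "13059af7-8eab-4196-a7ea-1a170d73c02e"
where blockTimestamp_day = fromDateTime('2021-05-11', 'yyyy-MM-dd')
group by "blockTimestamp_day", $segmentName
order by "blockTimestamp_day"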
eywek
06/14/2023, 1:49 PM
Alexander Vivas
06/14/2023, 1:53 PM
Santosh Kumar Sharma
06/14/2023, 6:02 PM
I'm using the standalone framework and the simple file scheme, but I am getting the exception below:
java.lang.RuntimeException: Failed to create IngestionJobRunner instance for class - null
Details below:
Tarring segment from: /var/folders/p9/0nhr33vd7m90vtvzzbx9t2x80000gn/T/pinot-056eaf50-cedc-4f14-9390-b083641c2622/output/events_OFFLINE_1633114228000_1633114232000_0 to: /var/folders/p9/0nhr33vd7m90vtvzzbx9t2x80000gn/T/pinot-056eaf50-cedc-4f14-9390-b083641c2622/output/events_OFFLINE_1633114228000_1633114232000_0.tar.gz
Size for segment: events_OFFLINE_1633114228000_1633114232000_0, uncompressed: 3.66K, compressed: 1.29K
Trying to create instance for class null
Got exception to kick off standalone data ingestion job -
java.lang.RuntimeException: Failed to create IngestionJobRunner instance for class - null
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.kickoffIngestionJob(IngestionJobLauncher.java:145) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.runIngestionJob(IngestionJobLauncher.java:130) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand.execute(LaunchDataIngestionJobCommand.java:130) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.Command.call(Command.java:33) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.Command.call(Command.java:29) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.executeUserObject(CommandLine.java:1953) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.access$1300(CommandLine.java:145) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2352) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2346) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2311) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.execute(CommandLine.java:2078) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.PinotAdministrator.execute(PinotAdministrator.java:171) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.PinotAdministrator.main(PinotAdministrator.java:202) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
Caused by: java.lang.NullPointerException
at org.apache.pinot.spi.plugin.PluginManager.createInstance(PluginManager.java:320) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.plugin.PluginManager.createInstance(PluginManager.java:306) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.kickoffIngestionJob(IngestionJobLauncher.java:143) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
... 13 more
Any help will be greatly appreciated, TIA.
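For context, "IngestionJobRunner instance for class - null" generally means the runner class name for the requested jobType resolved to null in executionFrameworkSpec; for example, a SegmentCreationAndTarPush job needs segmentTarPushJobRunnerClassName to be set. A minimal standalone sketch (class names as in the full spec quoted later in this channel; the jobType is an assumption):
executionFrameworkSpec:
  name: 'standalone'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner'
jobType: SegmentCreationAndTarPush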
Raveendra Yerraguntla
06/16/2023, 2:10 AM
Lee Wei Hern Jason
06/18/2023, 10:32 AM
Pham Duong
06/19/2023, 4:34 AM
Mingmin Xu
06/19/2023, 5:00 PM
mvn clean install -DskipTests
mvn clean install -Pbin-dist
However, some tests in pinot-segment-spi, pinot-segment-local, and pinot-core cannot pass, with an error like the one below:
java.lang.UnsatisfiedLinkError: 'long xerial.larray.impl.LArrayNative.mmap(long, int, long, long)'
at xerial.larray.impl.LArrayNative.mmap(Native Method)
at xerial.larray.mmap.MMapBuffer.<init>(MMapBuffer.java:94)
I suspect it's related to xerial/larray, which doesn't mention Mac M1/M2 support; I don't know how to find a workaround.
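A hedged workaround, assuming the failure is larray's missing aarch64 native library: run the build under an x86_64 JDK through Rosetta 2 so the bundled x86_64 library can load (the java_home arch flag and JDK version are assumptions about the setup):
# Point the build at an Intel JDK installed under Rosetta 2 (hypothetical version)
export JAVA_HOME=$(/usr/libexec/java_home -a x86_64 -v 11)
mvn clean install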
Ehsan Irshad
06/20/2023, 5:51 AM
Jessica Stewart
06/20/2023, 8:21 PM
Invalid table config: app_analytics_enterprise with error: Missing required creator property 'tableType' (index 1)
at [Source: (String)"{"tableName": "app_analytics_enterprise", "offline": {"tableName": "app_analytics_enterprise_OFFLINE", "tableType": "OFFLINE", "segmentsConfig": {"timeType": "DAYS", "schemaName": "app_analytics_enterprise", "retentionTimeUnit": "DAYS", "retentionTimeValue": "400", "replication": "1", "timeColumnName": "date_str", "minimizeDataMovement": true, "segmentPushType": "APPEND"}, "tenants": {"broker": "DefaultTenant", "server": "DefaultTenant"}, "tableIndexConfig": {"nullHandlingEnabled": true, "invert"[truncated 3445 chars]; line: 1, column: 3945] (through reference chain: org.apache.pinot.spi.config.table.TableConfig["tableType"])
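That payload has a top-level tableName plus a lowercase "offline" wrapper, which is the combined shape served by /tableConfigs, while the /tables endpoint deserializes a bare TableConfig, hence the missing 'tableType' complaint. A hedged sketch of unwrapping it (file names and controller address are assumptions):
# Keep only the inner OFFLINE table config and post that to /tables
jq '.offline' app_analytics_enterprise.json > app_analytics_enterprise_offline.json
curl -X POST -H "Content-Type: application/json" \
  -d @app_analytics_enterprise_offline.json "http://pinot-controller:9000/tables"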
Jatin
06/21/2023, 7:11 AM
Johan Venant
06/21/2023, 12:58 PM
Mike Beyer
06/22/2023, 1:00 AM
Ehsan Irshad
06/22/2023, 6:00 AM
Alexander Vivas
06/22/2023, 9:25 AM
# executionFrameworkSpec: Defines ingestion jobs to be running.
executionFrameworkSpec:
  name: 'standalone'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner'
  segmentUriPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentUriPushJobRunner'
  segmentMetadataPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentMetadataPushJobRunner'
jobType: SegmentCreationAndMetadataPush
inputDirURI: 'gs://$BUCKET_NAME/bigquery-loads/'
includeFileNamePattern: 'glob:**/*.csv'
outputDirURI: 'gs://$BUCKET_NAME/controller/data/'
overwriteOutput: true
segmentCreationJobParallelism: 4
pinotFSSpecs:
  - className: org.apache.pinot.plugin.filesystem.GcsPinotFS
    configs:
      projectId: '$GOOGLE_PROJECT_NAME'
      gcpKey: '/opt/pinot/deployment/gcs/key.json'
recordReaderSpec:
  dataFormat: 'csv'
  className: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReader'
  configClassName: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReaderConfig'
tableSpec:
  tableName: 'events'
  schemaURI: 'http://test-pinot-controller:9000/tables/events_REALTIME/schema'
  tableConfigURI: 'http://test-pinot-controller:9000/tables/events_REALTIME'
pinotClusterSpecs:
  - controllerURI: 'http://test-pinot-controller:9000'
pushJobSpec:
  pushParallelism: 4
  pushAttempts: 2
  pushRetryIntervalMillis: 1000
  copyToDeepStoreForMetadataPush: false
But then I see this error when executing the ingestion job:
ERROR [LaunchDataIngestionJobCommand] [main] Got exception to kick off standalone data ingestion job -
java.lang.RuntimeException: Failed to decode table config from JSON - '{"REALTIME":{"tableName":"events","tableType":"REALTIME","segmentsConfig":{...} ... '
at org.apache.pinot.common.segment.generation.SegmentGenerationUtils.getTableConfig(SegmentGenerationUtils.java:146) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner.init(SegmentGenerationJobRunner.java:158) ~[pinot-batch-ingestion-standalone-0.12.0-shaded.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.kickoffIngestionJob(IngestionJobLauncher.java:148) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.runIngestionJob(IngestionJobLauncher.java:129) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand.execute(LaunchDataIngestionJobCommand.java:130) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.Command.call(Command.java:33) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.Command.call(Command.java:29) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.executeUserObject(CommandLine.java:1953) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.access$1300(CommandLine.java:145) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2352) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2346) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$RunLast.handle(CommandLine.java:2311) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at picocli.CommandLine.execute(CommandLine.java:2078) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.PinotAdministrator.execute(PinotAdministrator.java:171) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.tools.admin.PinotAdministrator.main(PinotAdministrator.java:202) [pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
Caused by: org.apache.pinot.shaded.com.fasterxml.jackson.databind.exc.MismatchedInputException: Missing required creator property 'tableName' (index 0)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.pinot.spi.config.table.TableConfig["tableName"])
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1615) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:194) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1405) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:2033) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.shaded.com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1669) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.spi.utils.JsonUtils.jsonNodeToObject(JsonUtils.java:216) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
at org.apache.pinot.common.segment.generation.SegmentGenerationUtils.getTableConfig(SegmentGenerationUtils.java:144) ~[pinot-all-0.12.0-jar-with-dependencies.jar:0.12.0-118f5e065cb258c171d97a586183759fbc61e2bf]
... 15 more
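The root cause matches the tableConfigURI above: /tables/events_REALTIME returns the wrapped {"REALTIME": {...}} document, while the job expects a bare TableConfig with a top-level tableName. A hedged workaround, assuming the standalone runner accepts a local file URI here (an assumption about this version): unwrap the config and point the spec at the file.
# Strip the REALTIME wrapper into a local file (jq usage and paths are assumptions)
curl -s http://test-pinot-controller:9000/tables/events_REALTIME | jq '.REALTIME' > /tmp/events_table.json
# then in the job spec:
#   tableConfigURI: 'file:///tmp/events_table.json'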
Vivardhan Devaki
06/22/2023, 11:03 AM
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
ERROR StatusLogger Unrecognized format specifier [d]
ERROR StatusLogger Unrecognized conversion specifier [d] starting at position
ERROR StatusLogger Unrecognized format specifier [thread]
ERROR StatusLogger Unrecognized conversion specifier [thread] starting at position
ERROR StatusLogger Unrecognized format specifier [level]
ERROR StatusLogger Unrecognized conversion specifier [level] starting at position
ERROR StatusLogger Unrecognized format specifier [logger]
ERROR StatusLogger Unrecognized conversion specifier [logger] starting at position
ERROR StatusLogger Unrecognized format specifier [msg]
ERROR StatusLogger Unrecognized conversion specifier [msg] starting at position
ERROR StatusLogger Unrecognized format specifier [n]
ERROR StatusLogger Unrecognized conversion specifier [n] starting at position
ERROR StatusLogger Reconfiguration failed: No configuration found for 'Default' at 'null' in 'null'
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.pinot.tools.admin.command.StartKafkaCommand.<init>(StartKafkaCommand.java:
at org.apache.pinot.tools.admin.PinotAdministrator.<clinit>(PinotAdministrator.java:
Caused by: java.util.NoSuchElementException
at java.base/java.util.ServiceLoader$
at java.base/java.util.ServiceLoader$
at java.base/java.util.ServiceLoader$
at org.apache.pinot.tools.utils.KafkaStarterUtils.getKafkaConnectorPackageName(KafkaStarterUtils.java:
at org.apache.pinot.tools.utils.KafkaStarterUtils.<clinit>(KafkaStarterUtils.java:
... 2 more
I verified that the JAR is being pulled correctly from the Nexus repository by mounting it into the /opt/pinot/data directory instead, where I am able to see the JAR. I have also tried pulling both versions of the JARs - with dependencies and without. Not sure what I am missing here.
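The NoSuchElementException out of KafkaStarterUtils' ServiceLoader usually means no Kafka connector plugin was discovered on the plugin path. A hedged sketch using Pinot's plugins.dir and plugins.include system properties (paths, plugin name, and ports are assumptions about the image):
# Make sure the Kafka connector plugin is on the plugin path before starting
export JAVA_OPTS="-Dplugins.dir=/opt/pinot/plugins -Dplugins.include=pinot-kafka-2.0"
/opt/pinot/bin/pinot-admin.sh StartKafka -port 9092 -zkAddress localhost:2181/kafka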
Mike Beyer
06/22/2023, 5:19 PM
The Kafka scripts are in the /usr/bin/ directory, not /opt/kafka/bin/
Mike Beyer
06/22/2023, 5:24 PM
Mike Beyer
06/22/2023, 5:32 PM
docker exec -t main-kafka-1 /usr/bin/kafka-topics --bootstrap-server localhost:9092 --create --topic transcript-topic --partitions 1 --replication-factor 1
Sonit Rathi
06/23/2023, 5:50 AM
Abhishek Dubey
06/23/2023, 6:45 AM
전이섭
06/23/2023, 9:28 AM
java.lang.IllegalStateException: Forward index disabled column COLUMN_NAME must have a dictionary
I have some dimension fields that are strings, and I am getting this error for those fields.
I haven't created any indexes for testing. As far as I know, if I don't set any index, the forward index is created automatically by default. However, according to the error message, the forward index is disabled. Why?
How can I solve this error?
Table Configuration
{
  "OFFLINE": {
    "tableName": "xx_OFFLINE",
    "tableType": "OFFLINE",
    "segmentsConfig": {
      "schemaName": "xx",
      "replication": "1",
      "replicasPerPartition": "1",
      "timeColumnName": "order_created_at",
      "segmentPushFrequency": "HOURLY",
      "segmentPushType": "APPEND",
      "minimizeDataMovement": false
    },
    "tenants": {
      "broker": "DefaultTenant",
      "server": "DefaultTenant"
    },
    "tableIndexConfig": {
      "invertedIndexColumns": [],
      "noDictionaryColumns": [],
      "enableDynamicStarTreeCreation": false,
      "aggregateMetrics": false,
      "nullHandlingEnabled": false,
      "optimizeDictionary": false,
      "optimizeDictionaryForMetrics": false,
      "noDictionarySizeRatioThreshold": 0,
      "rangeIndexColumns": [],
      "rangeIndexVersion": 2,
      "autoGeneratedInvertedIndex": false,
      "createInvertedIndexDuringSegmentGeneration": false,
      "sortedColumn": [],
      "bloomFilterColumns": [],
      "loadMode": "MMAP",
      "onHeapDictionaryColumns": [],
      "varLengthDictionaryColumns": [],
      "enableDefaultStarTree": false
    },
    "metadata": {},
    "quota": {},
    "routing": {},
    "query": {},
    "ingestionConfig": {
      "segmentTimeValueCheck": true,
      "continueOnError": false,
      "rowTimeValueCheck": false
    },
    "isDimTable": false
  }
}
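For comparison, the forward index is only disabled explicitly, via a fieldConfigList entry; nothing in the config above does that, so it may be worth checking the uploaded segments or any earlier config version for an entry of this shape (COLUMN_NAME is a placeholder, the whole entry is hypothetical):
"fieldConfigList": [
  {
    "name": "COLUMN_NAME",
    "encodingType": "DICTIONARY",
    "indexTypes": ["INVERTED"],
    "properties": { "forwardIndexDisabled": "true" }
  }
]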
Soo
06/23/2023, 10:45 AM
Bad Connection: Tableau could not connect to the data source. Error Code: 1CA83880