Mayank
Yash Agarwal
07/22/2020, 12:48 PM
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.f$14 of type org.apache.spark.api.java.function.VoidFunction in instance of org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1
I am using
PINOT_VERSION=0.4.0
with the following overridden version configs to match the environment:
<scala.version>2.11.8</scala.version>
<spark.version>2.3.1.tgt.17</spark.version> (which is specific to target)
Env:
Spark version 2.3.1.tgt.17
Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_73
Run Command:
spark-submit \
--class org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand \
--master yarn \
--deploy-mode client \
--conf "spark.driver.extraJavaOptions=-Dplugins.dir=${PINOT_DISTRIBUTION_DIR}/plugins -Dlog4j2.configurationFile=${PINOT_DISTRIBUTION_DIR}/conf/pinot-ingestion-job-log4j2.xml" \
--conf "spark.driver.extraClassPath=${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar" \
local://${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar \
-jobSpecFile /home_dir/z00290g/guestSdrGstDataSgl_sparkIngestionJobSpec.yaml
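This ClassCastException on SerializedLambda usually means the executors deserialized the lambda against a different copy of the classes than the driver, i.e. the Pinot jar is only on the driver classpath. A hedged tweak, assuming the jar is present at the same path on the worker nodes (or can be shipped with --jars), is to add it on the executor side as well:
--conf "spark.executor.extraJavaOptions=-Dplugins.dir=${PINOT_DISTRIBUTION_DIR}/plugins" \
--conf "spark.executor.extraClassPath=${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar" \
--jars ${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar \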
Yash Agarwal
07/23/2020, 4:48 PM
{
"name": "sls_d",
"dataType": "STRING",
"format": "1:DAYS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
"granularity": "1:DAYS"
}
but I am getting
Caused by: java.lang.IllegalArgumentException: Invalid format: "null"
at org.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187) ~[pinot-all.jar:0.4.0-8355d2e0e489a8d127f2e32793671fba505628a8]
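For reference, that block is the shape of an entry in the schema's dateTimeFieldSpecs array; a minimal sketch with the values from above:
"dateTimeFieldSpecs": [
  {
    "name": "sls_d",
    "dataType": "STRING",
    "format": "1:DAYS:SIMPLE_DATE_FORMAT:yyyy-MM-dd",
    "granularity": "1:DAYS"
  }
]
The Joda "Invalid format: null" suggests the ingestion path is reading a time format from a place where none is set, so it is worth double-checking that the column the table config's timeColumnName points to is the one that actually carries this format.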
Xiang Fu
Pradeep
07/25/2020, 2:06 AM
Ravi Singal
07/25/2020, 12:05 PM
Pradeep
07/25/2020, 11:04 PM
KafkaConfluentSchemaRegistryAvroMessageDecoder
We have a schema registry set up with SSL authentication, and I am getting an SSLHandshakeException.
Wondering what the proper way is to pass the SSL cert config for the schema registry client?
I dug a bit into the code; it seems like Pinot needs to update the schema-registry-client
to include this (https://github.com/confluentinc/schema-registry/pull/957)
along with some code changes. It could be accomplished without it too.
Wanted to check first whether there is an alternative way to accomplish this?
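Purely as an illustrative sketch, not the Pinot decoder's actual API: what the registry client ultimately needs is a trust store (and a key store for mutual TLS) turned into an SSLContext. Everything below is standard JDK API; the paths, passwords, and the idea of wiring the result (or the equivalent schema.registry.ssl.* configs from the linked PR) into the decoder are assumptions.
import java.io.FileInputStream;
import java.security.KeyStore;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;

public class SchemaRegistrySslSketch {
  // Builds an SSLContext from JKS stores; this is the kind of object an
  // SSL-aware schema-registry client would have to be handed.
  public static SSLContext buildSslContext(String trustStorePath, char[] trustStorePassword,
                                           String keyStorePath, char[] keyStorePassword)
      throws Exception {
    // Trust store: CA certs used to verify the registry's server certificate.
    KeyStore trustStore = KeyStore.getInstance("JKS");
    try (FileInputStream in = new FileInputStream(trustStorePath)) {
      trustStore.load(in, trustStorePassword);
    }
    TrustManagerFactory tmf =
        TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
    tmf.init(trustStore);

    // Key store: client cert/key, only needed if the registry enforces mutual TLS.
    KeyStore keyStore = KeyStore.getInstance("JKS");
    try (FileInputStream in = new FileInputStream(keyStorePath)) {
      keyStore.load(in, keyStorePassword);
    }
    KeyManagerFactory kmf =
        KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
    kmf.init(keyStore, keyStorePassword);

    SSLContext sslContext = SSLContext.getInstance("TLS");
    sslContext.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);
    return sslContext;
  }
}
As an interim workaround, setting -Djavax.net.ssl.trustStore and -Djavax.net.ssl.trustStorePassword on the server JVMs may be enough for one-way TLS, since the default HTTPS socket factory reads those system properties.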
Pradeep
07/26/2020, 6:38 PM
Caught exception while processing instance request
java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at org.apache.pinot.core.common.datatable.DataTableBuilder.setColumn(DataTableBuilder.java:157) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.query.selection.SelectionOperatorUtils.getDataTableFromRows(SelectionOperatorUtils.java:261) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.blocks.IntermediateResultsBlock.getSelectionResultDataTable(IntermediateResultsBlock.java:348) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.blocks.IntermediateResultsBlock.getDataTable(IntermediateResultsBlock.java:262) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.blocks.InstanceResponseBlock.<init>(InstanceResponseBlock.java:43) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.InstanceResponseOperator.getNextBlock(InstanceResponseOperator.java:37) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.InstanceResponseOperator.getNextBlock(InstanceResponseOperator.java:26) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.operator.BaseOperator.nextBlock(BaseOperator.java:49) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
at org.apache.pinot.core.plan.GlobalPlanImplV0.execute(GlobalPlanImplV0.java:48) ~[pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar:0.5.0-SNAPSHOT-39a78f3df43ac613663975844e598f07b18bf623]
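This NoSuchMethodError is the classic JDK 8 vs JDK 9+ ByteBuffer mismatch: ByteBuffer.position(int) gained a covariant override returning ByteBuffer in Java 9, so a jar compiled on JDK 9/11 without --release 8 references a method descriptor that does not exist on a Java 8 JVM. A small illustration (class and method names are just for the example) of the source-level workaround libraries use when they must stay Java 8 compatible:
import java.nio.Buffer;
import java.nio.ByteBuffer;

public class ByteBufferCompat {
  // Calling position(int) through the Buffer supertype makes the compiler emit the
  // Java 8 descriptor (returning Buffer), regardless of which JDK compiles the code.
  static ByteBuffer seek(ByteBuffer buf, int newPosition) {
    ((Buffer) buf).position(newPosition);
    return buf;
  }
}
On the deployment side, the simpler fix is to run the snapshot jar on the same (or a newer) Java major version than the one that built it.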
Elon
07/27/2020, 5:00 PM
Elon
07/27/2020, 8:35 PM
Elon
07/27/2020, 9:35 PM
Elon
07/27/2020, 9:36 PM
Yash Agarwal
07/30/2020, 6:38 AM
Oguzhan Mangir
07/30/2020, 7:36 PM
pinot.core.transport.{AsyncQueryResponse, QueryRouter, ServerInstance} in Pinot 0.5.0-SNAPSHOT version.
I'm running locally now and using Pinot 0.4.0 locally (because when I try to bring Pinot up with the master branch, it cannot load the data; there may be a problem). Can there be any problem with backward compatibility?
Error message:
ERROR org.apache.pinot.core.transport.DataTableHandler - Caught exception while handling response from server: 192.168.2.154_O
java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at org.apache.pinot.core.common.datatable.DataTableImplV2.<init>(DataTableImplV2.java:122) ~[classes/:?]
at org.apache.pinot.core.common.datatable.DataTableFactory.getDataTable(DataTableFactory.java:35) ~[classes/:?]
at org.apache.pinot.core.transport.DataTableHandler.channelRead0(DataTableHandler.java:67) ~[classes/:?]
at org.apache.pinot.core.transport.DataTableHandler.channelRead0(DataTableHandler.java:36) ~[classes/:?]
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) ~[netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:328) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:302) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:700) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.42.Final.jar:4.1.42.Final]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_221]
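Same JDK mismatch as above: the trace shows the runtime is 1.8.0_221, so the classes were most likely compiled with JDK 9+. One way to confirm which JDK produced a class (the jar name below is an assumption; for classes built in the IDE, point -cp at the classes directory instead):
javap -verbose -cp pinot-all-0.5.0-SNAPSHOT-jar-with-dependencies.jar \
  org.apache.pinot.core.common.datatable.DataTableImplV2 | grep "major version"
A major version of 52 means the bytecode targets Java 8; 55 means Java 11, which would explain the failure under 1.8.0_221.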
Mayank
Mayank
Dan Hill
08/04/2020, 6:44 PM
select * from metrics limit 20000
and I see
numDocsScanned = 15400
totalDocs = 17118
totalDocs matches my raw data size (what I'd expect). The query results match numDocsScanned (which is wrong). I'm happy to share data and schema in a private message.
Mayank
Elon
08/05/2020, 12:30 AM
Dan Hill
08/05/2020, 5:27 AM
Apoorva Moghey
08/05/2020, 11:17 AM
RealtimeProvisioningHelperCommand
Elon
08/05/2020, 11:01 PM
Elon
08/05/2020, 11:41 PM
Sidd
08/05/2020, 11:43 PM
Pradeep
08/10/2020, 7:57 PM
Elon
08/11/2020, 10:56 PM
Dan Hill
08/13/2020, 3:38 AM
Andrew First
08/14/2020, 7:17 PM
$ kubectl -n pinot logs pinot-broker-6 broker -f
Xiang Fu
Pradeep
08/15/2020, 12:51 AM