#general

Satyam Raj

04/12/2022, 7:52 AM
Hey guys, I’m trying to do batch ingestion of ORC files from S3 into Pinot using the Spark batch job.
export PINOT_VERSION=0.10.0
export PINOT_DISTRIBUTION_DIR=/Users/satyam.raj/dataplatform/pinot-dist/apache-pinot-0.10.0-bin

bin/spark-submit \
  --class org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand \
  --master "local[8]" \
  --conf "spark.driver.extraJavaOptions=-Dplugins.dir=${PINOT_DISTRIBUTION_DIR}/plugins -Dlog4j2.configurationFile=${PINOT_DISTRIBUTION_DIR}/conf/pinot-ingestion-job-log4j2.xml" \
  --conf "spark.driver.extraClassPath=${PINOT_DISTRIBUTION_DIR}/plugins/pinot-batch-ingestion/pinot-batch-ingestion-spark/pinot-batch-ingestion-spark-${PINOT_VERSION}-shaded.jar:${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar:${PINOT_DISTRIBUTION_DIR}/plugins/pinot-file-system/pinot-s3/pinot-s3-${PINOT_VERSION}-shaded.jar:${PINOT_DISTRIBUTION_DIR}/plugins/pinot-input-format/pinot-parquet/pinot-parquet-${PINOT_VERSION}-shaded.jar:${PINOT_DISTRIBUTION_DIR}/plugins/pinot-file-system/pinot-hdfs/pinot-hdfs-${PINOT_VERSION}-shaded.jar" \
  ${PINOT_DISTRIBUTION_DIR}/lib/pinot-all-${PINOT_VERSION}-jar-with-dependencies.jar \
  -jobSpecFile '/Users/satyam.raj/dataplatform/pinot-dist/batchjob-spec/batch-job-spec.yaml'
I’m getting this weird error:
Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    org/apache/spark/metrics/sink/MetricsServlet.<init>(Ljava/util/Properties;Lcom/codahale/metrics/MetricRegistry;Lorg/apache/spark/SecurityManager;)V @116: invokevirtual
  Reason:
    Type 'com/codahale/metrics/json/MetricsModule' (current frame, stack[2]) is not assignable to 'com/fasterxml/jackson/databind/Module'
  Current Frame:
    bci: @116
    flags: { }
    locals: { 'org/apache/spark/metrics/sink/MetricsServlet', 'java/util/Properties', 'com/codahale/metrics/MetricRegistry', 'org/apache/spark/SecurityManager' }
    stack: { 'org/apache/spark/metrics/sink/MetricsServlet', 'com/fasterxml/jackson/databind/ObjectMapper', 'com/codahale/metrics/json/MetricsModule' }
  Bytecode:
    0000000: 2a2b b500 2a2a 2cb5 002f 2a2d b500 5c2a
    0000010: b700 7e2a 1280 b500 322a 1282 b500 342a
    0000020: 03b5 0037 2a2b 2ab6 0084 b600 8ab5 0039
    0000030: 2ab2 008f 2b2a b600 91b6 008a b600 95bb
    0000040: 0014 592a b700 96b6 009c bb00 1659 2ab7
    0000050: 009d b600 a1b8 00a7 b500 3b2a bb00 7159
    0000060: b700 a8bb 00aa 59b2 00b0 b200 b32a b600
    0000070: b5b7 00b8 b600 bcb5 003e b1

	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Class.java:398)
	at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:196)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:130)
	at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:196)
	at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:104)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2550)
	at org.apache.spark.SparkContext.getOrCreate(SparkContext.scala)
	at org.apache.pinot.plugin.ingestion.batch.spark.SparkSegmentGenerationJobRunner.run(SparkSegmentGenerationJobRunner.java:196)
	at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.kickoffIngestionJob(IngestionJobLauncher.java:146)
	at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.runIngestionJob(IngestionJobLauncher.java:125)
	at org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand.execute(LaunchDataIngestionJobCommand.java:121)
	at org.apache.pinot.tools.Command.call(Command.java:33)
	at org.apache.pinot.tools.Command.call(Command.java:29)
	at picocli.CommandLine.executeUserObject(CommandLine.java:1953)
	at picocli.CommandLine.access$1300(CommandLine.java:145)
	at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2352)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:2346)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:2311)
	at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179)
	at picocli.CommandLine.execute(CommandLine.java:2078)
	at org.apache.pinot.tools.admin.command.LaunchDataIngestionJobCommand.main(LaunchDataIngestionJobCommand.java:153)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Mayank

04/12/2022, 4:41 PM
What version of Spark? Also cc: @User for any inputs
@User ^^

Kartik Khare

04/12/2022, 7:26 PM
Hi Satyam, can you also mention the Java version you are using?

Satyam Raj

04/13/2022, 5:13 AM
Using JDK 11 and Spark 2.4.6.
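[Editor’s note] Spark 2.4.x officially supports Java 8; JDK 11 support only landed in Spark 3.0. Running this job on JDK 11 is therefore itself a plausible source of trouble. A hedged mitigation sketch (the JDK install path below is hypothetical — adjust to your machine):

```shell
# Assumption: switching the driver to a Java 8 runtime may avoid
# Spark 2.4.x incompatibilities with JDK 11.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk   # hypothetical path
export PATH="${JAVA_HOME}/bin:${PATH}"
java -version   # expect a 1.8.x runtime before re-running spark-submit
```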