# general
Seems like a Hadoop issue... I found this old conversation, which seems relevant: https://stackoverflow.com/questions/22138664/hbase-map-reduce-and-sequencefiles-mapred-output-format-class-is-incompatible How are you submitting this job? Can you share the load spec? Have you tried SQL-based ingestion as an alternative?
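For context, a minimal SQL-based ingestion statement (runnable from the Druid console's query view) looks roughly like the sketch below; the datasource name, HDFS path, and column names are placeholders, not details from this conversation:
```
-- Sketch of a SQL-based (multi-stage query) ingestion; all names and paths are hypothetical.
INSERT INTO "example_datasource"
SELECT
  TIME_PARSE("ts") AS "__time",   -- parse the source timestamp into Druid's primary time column
  "page",
  "user_id"
FROM TABLE(
  EXTERN(
    '{"type": "hdfs", "paths": "hdfs://namenode:8020/path/to/data.json"}',
    '{"type": "json"}',
    '[{"name": "ts", "type": "string"}, {"name": "page", "type": "string"}, {"name": "user_id", "type": "string"}]'
  )
)
PARTITIONED BY DAY
```
This route runs on Druid's own task engine rather than an external MapReduce cluster, so it sidesteps the Hadoop classpath entirely.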
Thanks for your input, @Sergio Ferragut. I am submitting through the web console. This is the MR job error:
```
Error: class com.fasterxml.jackson.datatype.guava.deser.HostAndPortDeserializer overrides final method deserialize.(Lcom/fasterxml/jackson/core/JsonParser;Lcom/fasterxml/jackson/databind/DeserializationContext;)Ljava/lang/Object;
```
```
org.apache.hadoop.io.serializer.SerializationFactory: Serialization class not found:
java.lang.ClassNotFoundException: Class org.apache.hadoop.io.serializer.avro.AvroReflectSerialization not found
```
Was this working before? If so, what changed? The error indicates a missing jar or something that is not on the classpath. I am not very familiar with the Hadoop functionality, but could it be that you are missing a Druid extension?
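For what it's worth, the Jackson "overrides final method deserialize" error usually points to the Hadoop cluster's older Jackson jars shadowing Druid's. One mitigation suggested in Druid's Hadoop ingestion docs is classloader isolation via jobProperties in the ingestion spec's tuningConfig; the fragment below is only a sketch of that single change, assuming a standard index_hadoop task:
```
{
  "tuningConfig": {
    "type": "hadoop",
    "jobProperties": {
      "mapreduce.job.classloader": "true"
    }
  }
}
```
The second error is a different matter: org.apache.hadoop.io.serializer.avro.AvroReflectSerialization normally ships with hadoop-common, so its absence may indicate incomplete or mismatched Hadoop client jars on the Druid classpath rather than a missing Druid extension such as druid-hdfs-storage in druid.extensions.loadList.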
Thanks for your input. The Druid extension has been added.