# troubleshooting
c
I'm getting an error when trying to read a scala case class from a datastream converted to a table:
```
java.lang.IllegalStateException: Could not find a setter for field 'seriesID' in class 'streams.transport.bostream.scala.booking.MVTAllocation'. Make sure that the field is writable (via public visibility, setter, or full constructor).
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.fieldNotWritableException(StructuredObjectConverter.java:282)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.lambda$setterExpr$6(StructuredObjectConverter.java:322)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.setterExpr(StructuredObjectConverter.java:320)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.generateCode(StructuredObjectConverter.java:233)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.createOrError(StructuredObjectConverter.java:161)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.create(StructuredObjectConverter.java:112)
```
The case class (generated by scalapb) in question:
```scala
final case class MVTAllocation(
    seriesID: _root_.scala.Predef.String = "",
    originalTaxiTypeID: _root_.scala.Long = 0L,
    taxiTypeIDs: Array[_root_.scala.Long] = Array.empty,
    previousBookingCodes: Array[_root_.scala.Predef.String] = Array.empty,
    isLastBooking: _root_.scala.Boolean = false
    ) extends scalapb.GeneratedMessage {[...]}
```
As you can clearly see, the seriesID field is writable via the full constructor. We have been using scalapb case classes on many occasions and this error is new. Usually we use a KafkaSource to create a DataStream[MVTAllocation] and then convert it to a table via `val table = tableEnv.fromDataStream(stream)` (rough sketch of the setup below the stack trace), and that has worked reliably ... so far. I don't understand why Flink is not able to find the writable field in the constructor here?
The above error is wrapped in this exception:
```
org.apache.flink.table.api.TableException: Could not create converter for structured type '*streams.transport.bostream.scala.booking.MVTAllocation<`seriesID` STRING, `originalTaxiTypeID` BIGINT NOT NULL, `taxiTypeIDs` ARRAY<BIGINT NOT NULL> NOT NULL, `previousBookingCodes` ARRAY<STRING> NOT NULL, `isLastBooking` BOOLEAN NOT NULL>* NOT NULL'.
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.create(StructuredObjectConverter.java:115)
	at org.apache.flink.table.data.conversion.DataStructureConverters.getConverterInternal(DataStructureConverters.java:232)
	at org.apache.flink.table.data.conversion.DataStructureConverters.getConverter(DataStructureConverters.java:202)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.lambda$createOrError$0(StructuredObjectConverter.java:135)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:546)
	at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
	at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:505)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.createOrError(StructuredObjectConverter.java:136)
	at org.apache.flink.table.data.conversion.StructuredObjectConverter.create(StructuredObjectConverter.java:112)
```
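For reference, this is roughly how the job is wired up. It's only a minimal sketch; the Kafka config, topic name, and `MVTAllocationDeserializer` are placeholders, not our actual code:

```scala
import org.apache.flink.api.common.eventtime.WatermarkStrategy
import org.apache.flink.connector.kafka.source.KafkaSource
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment

val env = StreamExecutionEnvironment.getExecutionEnvironment
val tableEnv = StreamTableEnvironment.create(env)

// Placeholder Kafka source; MVTAllocationDeserializer stands in for whatever
// DeserializationSchema[MVTAllocation] the real job uses.
val source = KafkaSource.builder[MVTAllocation]()
  .setBootstrapServers("kafka:9092")
  .setTopics("mvt-allocations")
  .setValueOnlyDeserializer(new MVTAllocationDeserializer)
  .build()

val stream: DataStream[MVTAllocation] =
  env.fromSource(source, WatermarkStrategy.noWatermarks(), "mvt-allocations")

// This is the call that now fails with the StructuredObjectConverter error:
val table = tableEnv.fromDataStream(stream)
```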
m
Which Scala version are you using?
c
2.12.18
m
Flink only supports Scala up to 2.12.7, unfortunately. I'm wondering if that could be related, especially since anything above 2.12.7 has a binary incompatibility (Scala broke binary compatibility in a patch release).
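If that is the cause, one quick way to test it would be pinning the project to the Scala version Flink was built against. A sketch, assuming an sbt build:

```scala
// build.sbt -- pin Scala to 2.12.7 to rule out the binary incompatibility
ThisBuild / scalaVersion := "2.12.7"
```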