# ask-for-help
Hi @Renz Abergos, yes it is possible! Do you mind sharing a bit more about what you're looking to achieve? e.g. input/output file types, models, etc.
Hi @Chaoyu, I'm trying batch inference using Spark and I'm running into this error:
`This IO descriptor does not currently support batch inference`
I got past that by providing the schema in `run_in_spark()`, but when I try to show the resulting dataframe, it fails with:
AttributeError: type object 'Client' has no attribute 'wait_until_server_is_ready'
I'm running this on Databricks btw