# general
I also see an issue where, if the previously executed spark-submit job is terminated before completion, the staging directory is not deleted. If we rerun the job, the old staged files are also loaded into the tables, creating bad segments. Shouldn't the staging directory be deleted each time before the job runs? Also, dropping the table doesn't delete the underlying data directory, which causes data duplication when you recreate the table and load it again.
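Until the loader cleans up after itself, a minimal workaround sketch using the Hadoop FileSystem API, run before the load starts. The paths here (`hdfs:///tmp/my_table/.staging` and the warehouse directory) are assumptions for illustration only, not the project's actual defaults; substitute whatever staging and table locations your job really uses.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

object StagingCleanup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("staging-cleanup")
      .getOrCreate()

    val hadoopConf: Configuration = spark.sparkContext.hadoopConfiguration

    // Recursively delete a directory if it exists.
    def deleteIfExists(p: Path): Unit = {
      val fs = FileSystem.get(p.toUri, hadoopConf)
      if (fs.exists(p)) fs.delete(p, true)
    }

    // Hypothetical staging location. If a previous run was killed mid-flight,
    // stale staged files may still be sitting here; removing them before the
    // load keeps the rerun from picking them up and creating bad segments.
    deleteIfExists(new Path("hdfs:///tmp/my_table/.staging"))

    // Hypothetical table data directory. Since DROP TABLE leaves the
    // underlying files behind, clearing this between a drop and a recreate
    // avoids duplicated data on the next load.
    // deleteIfExists(new Path("hdfs:///user/hive/warehouse/my_table"))

    // ... proceed with the actual load ...

    spark.stop()
  }
}
```

Obviously this only papers over the problem from the caller's side; the cleaner fix would be for the load job itself to clear (or uniquely name) its staging directory on startup.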