# integrate-databricks-datahub
Hi all, is there a way to change the Spark task name? Right now it just picks up my queries and uses those as the names.
Unfortunately, the task name isn't configurable at this time. We understand the current approach isn't ideal and will try to improve it. Please let us know if you have any thoughts on how this could be made more user-friendly.
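In the meantime, the Spark application name itself can still be set through the normal Spark session config. A minimal sketch, assuming PySpark (the app name and table here are placeholders, and this only affects the app-level name, not the per-task name derived from the queries):
```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # Standard Spark config: sets the application name. Assumption: the
    # lineage agent surfaces this at the app/pipeline level, while the
    # per-task name is still derived from the query text, per the reply above.
    .appName("nightly-orders-etl")
    .getOrCreate()
)

# Register a tiny demo table so the query below actually runs.
spark.createDataFrame([(1, "widget"), (2, "gadget")], ["id", "item"]) \
    .createOrReplaceTempView("orders")

# This action would still appear with a task name taken from the query
# itself, since that name is not configurable at the moment.
spark.sql("SELECT id, item FROM orders").show()
```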