# troubleshoot
s
I have a question, if anyone can answer please: when trying to ingest recipe.yaml, I am getting an error that resource is not defined. I ran the command with the --debug option and realized that resource is being prefixed with \ufeff. Has anyone seen this issue?
g
can you share the recipe, with the sensitive values masked?
s
Thanks @gifted-knife-16120 for the quick response. PSB:

```yaml
source:
  type: "csv-enricher"
  config:
    filename: "/path/to/csv"
    should_overwrite: false
    delimiter: ","
    array_delimiter: "|"
sink:
  type: "datahub-rest"
  config:
    server: "<DATAHUB_SERVER>:8080"
```
b
hey JT! can you post the error you're seeing here as well?
can you also share some info about your csv file? i'm wondering if that's adding something unexpected
s
@bulky-soccer-26729 Here you go:

```
resource,tags,owners,ownership_type,description,domain
"urn:li:dataset:(urn:li:dataPlatform:hive,<TBL>,DEV)",<SOME_DATA>,TECHNICAL_OWNER,"<DESC>",urn:li:domain:<DOMAIN>
```
The error is KeyError: 'resource', coming from csv_enricher.py.
That key is checked as a mandatory field. I ran the datahub ingest command with the --debug option and found that when the row dictionary is created for each row of the csv, the key has a '\ufeff' prefix, which is why resource is not found even though it is present.
Not sure if it is an issue with my CSV; I tried replacing this Unicode character with "" to double-check, but the error still persists.
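(For reference, here is a minimal sketch of how a UTF-8 byte order mark ends up as a '\ufeff' prefix on the first header when the file is decoded as plain utf-8. The sample bytes are illustrative, not the actual CSV from this thread:)

```python
import csv
import io

# Illustrative bytes: a UTF-8 BOM (EF BB BF) followed by a header row,
# the way tools like Excel often write "UTF-8" CSVs.
data = b'\xef\xbb\xbfresource,tags\n"urn:li:dataset:(...)",pii\n'

# Decoding as plain utf-8 keeps the BOM as U+FEFF, so csv.DictReader's
# first fieldname becomes '\ufeffresource' instead of 'resource'.
row = next(csv.DictReader(io.StringIO(data.decode("utf-8"))))
print('resource' in row)         # False -> KeyError: 'resource' downstream
print('\ufeffresource' in row)   # True

# Decoding with utf-8-sig strips the BOM, and the key is found as expected.
row = next(csv.DictReader(io.StringIO(data.decode("utf-8-sig"))))
print('resource' in row)         # True
```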
b
yeah this is a tough one.. it doesn't appear like anything should be off in the csv itself, but after some googling this might be an issue with utf encoding in the file's header (seeing some stuff on stack overflow: https://stackoverflow.com/questions/48085319/python-reading-and-writing-csv-files-with-utf-8-encoding). are you storing this file somewhere that the path has utf-encoded characters, or something else that could be causing the '\ufeff' prefix (or byte order mark) to appear? or are there any other encoded characters in the csv itself?
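(if it is a BOM, one common workaround is to rewrite the file using Python's utf-8-sig codec, which strips the mark on read. a minimal sketch; the file names are placeholders, not from this thread:)

```python
# Rewrite the CSV without a leading BOM before ingesting.
# "input.csv" / "clean.csv" are hypothetical placeholder names.
with open("input.csv", encoding="utf-8-sig") as src, \
        open("clean.csv", "w", encoding="utf-8", newline="") as dst:
    dst.write(src.read())
```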
s
Yeah, I tried replacing the unicode characters but no luck; will dig in more. For now I can use other ways to populate the dataset's metadata. Thanks for the help @bulky-soccer-26729
b
okay gotcha, sorry for the difficulty here. and of course!
d
Hi, was this ever figured out? I am getting this error too. Any help is appreciated. Thanks!