# ingestion
c
Hello! I'm noticing that when ingesting data from BigQuery, if the active service account doesn't actually have the necessary permissions on the GCP project being ingested, there's no error message, just a "Pipeline finished successfully" message and no data ingested. Is that expected behavior?
Relatedly, I think it would be helpful to add the exact Google scopes needed to the BQ ingestion guide: https://datahubproject.io/docs/metadata-ingestion/source_docs/bigquery. I can see that the required scopes are specified for `bigquery-usage` but not for plain `bigquery`. My team has had success with a service account that had `BigQuery Metadata Viewer` permissions for the projects in question, but I don't know if that is the only permission that's needed.
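For anyone else hitting this, a quick way to sanity-check that the service account can actually see the project before running the pipeline. This is just a rough sketch using the standard `google-cloud-bigquery` client; the key path, project ID, and the read-only BigQuery scope below are placeholders/assumptions, not the confirmed requirements:

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Placeholders: swap in your own key file and project.
KEY_PATH = "/path/to/service-account-key.json"
PROJECT_ID = "my-gcp-project"

# Assumption: the read-only BigQuery scope; the exact scope DataHub
# needs hasn't been confirmed in this thread.
credentials = service_account.Credentials.from_service_account_file(
    KEY_PATH,
    scopes=["https://www.googleapis.com/auth/bigquery.readonly"],
)

client = bigquery.Client(project=PROJECT_ID, credentials=credentials)

# If the account lacks metadata permissions, this typically raises a 403
# (or returns nothing), which is a clearer signal than a
# "Pipeline finished successfully" run that ingested zero tables.
datasets = list(client.list_datasets())
if not datasets:
    print(f"No datasets visible in {PROJECT_ID} -- check IAM roles before ingesting.")
for dataset in datasets:
    tables = list(client.list_tables(dataset))
    print(f"{dataset.dataset_id}: {len(tables)} tables visible")
```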
m
Thanks for letting us know, @colossal-account-65055! We'll update the docs once we figure out the exact scope.
c
Awesome. Thank you!
b
From my experience, `BigQuery Metadata Viewer` permissions are enough. @colossal-account-65055 One thing to mention here: be aware that you can only fetch native tables. External tables are not visible for ingestion 😕
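If it helps, here's a rough way to see which tables in a dataset are native vs. external before ingesting (again just a sketch with the `google-cloud-bigquery` client; the project and dataset names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

# TableListItem.table_type distinguishes native tables ("TABLE") from
# external ones ("EXTERNAL"), plus "VIEW" etc. Per the note above, only
# the native tables will show up in the ingest.
for table in client.list_tables("my_dataset"):  # placeholder dataset
    kind = table.table_type
    label = "native" if kind == "TABLE" else kind.lower()
    print(f"{table.table_id}: {label}")
```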