# troubleshoot
b
Hello, I'm unable to make metadata ingestion for BigQuery, the command works well without an error message but I don't have the data in the Datahub front end
m
That’s strange, what does `datahub ingest list-runs` give you?
b
```
+--------------------+--------+---------------------+
| runId              |   rows | created at          |
+====================+========+=====================+
| no-run-id-provided |     32 | 2021-10-16 17:56:38 |
+--------------------+--------+---------------------+
```
m
how about `datahub --version`?
b
I downgraded to 0.8.14 since it worked two days ago
```
acryl-datahub, version 0.8.14.0
```
m
can you change the `sink` to
```yml
sink:
  type: file
  config:
    filename: bq_mce.json
```
and re-run the BigQuery ingestion?
b
yes, let me try
m
that way you can check what metadata is being produced before it is sent to datahub
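Once a run with the file sink finishes, the emitted events can be inspected directly. A minimal sketch of that check (the sample event below is hypothetical and only illustrates the shape of the file; a real `bq_mce.json` is produced by the ingestion run):

```shell
# The file sink writes a JSON array of metadata events. This sample
# is a hypothetical stand-in for a real bq_mce.json.
cat > bq_mce.json <<'EOF'
[
  {"proposedSnapshot": {"urn": "urn:li:dataset:(urn:li:dataPlatform:bigquery,project.dataset.table,DEV)"}}
]
EOF
# Count how many events were emitted; zero events would explain why
# nothing shows up in the front end.
python3 -c "import json; print(len(json.load(open('bq_mce.json'))), 'events')"
```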
b
like this?
```yml
source:
  type: "bigquery"
  config:
    env: "DEV"
sink:
  type:file
  config:
    filename: bq_mce.json
```
If yes, I get this error:
```
ScannerError: mapping values are not allowed here
  in "<file>", line 7, column 9
```
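Editor's note: this ScannerError matches the missing space in `type:file` above; YAML requires a space after the colon for a mapping entry. A minimal sketch reproducing both forms (file names are illustrative, and PyYAML is assumed to be installed):

```shell
# "type:file" (no space) parses as a plain scalar, so the following
# "config:" line triggers "mapping values are not allowed here".
cat > recipe_bad.yml <<'EOF'
sink:
  type:file
  config:
    filename: bq_mce.json
EOF
python3 -c "import yaml; yaml.safe_load(open('recipe_bad.yml'))" 2>&1 | tail -1

# With the space after the colon, the recipe parses cleanly.
cat > recipe_ok.yml <<'EOF'
sink:
  type: file
  config:
    filename: bq_mce.json
EOF
python3 -c "import yaml; yaml.safe_load(open('recipe_ok.yml')); print('valid YAML')"
```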
Hi, I found the solution. My cloud architect deployed DataHub on a VM with 10 GB of disk space. It seems that running `datahub docker quickstart` with a previous version of DataHub installed failed silently because of the lack of free space
I don't have permission to extend the disk, so I removed the whole installation (volumes, images, packages) to clean up and free space
After that, I ran `datahub docker quickstart` on the VM with at least 5 GB free, and voilà
It's obvious in hindsight, but maybe you should tell people to have a drive with at least 10 GB free after the OS install? My cloud architect used the smallest VM disk available on GCP, and between the OS and the need to upgrade, there wasn't enough free space over time
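The disk-space pitfall described above can be guarded against before running quickstart. A hedged sketch, assuming a GNU/Linux VM and a ~10 GB threshold taken from this thread (not an official DataHub requirement):

```shell
# Assumed threshold from the discussion above; not an official requirement.
REQUIRED_GB=10
# Free space on the root filesystem, in whole gigabytes (GNU df).
AVAIL_GB=$(df -BG --output=avail / | tail -1 | tr -dc '0-9')
if [ "$AVAIL_GB" -ge "$REQUIRED_GB" ]; then
  echo "OK: ${AVAIL_GB}G free"
else
  echo "WARNING: only ${AVAIL_GB}G free; 'datahub docker quickstart' may fail silently"
  # Reclaim space before retrying, e.g.:
  #   datahub docker nuke                  # remove DataHub containers/volumes/images
  #   docker system prune -af --volumes    # prune remaining Docker data
fi
```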
m
Great to hear @blue-zoo-89533