# ingestion
f
Hi guys, I tried to ingest bigquery-usage for 2 projects. It started, but after 2-3 minutes I get this error:
Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'logging.googleapis.com' for consumer 'project_number:491986273194'. [{'@type': 'type.googleapis.com/google.rpc.ErrorInfo', 'reason': 'RATE_LIMIT_EXCEEDED', 'domain': 'googleapis.com', 'metadata': {'consumer': 'projects/491986273194', 'quota_metric': 'logging.googleapis.com/read_requests', 'quota_limit': 'ReadRequestsPerMinutePerProject', 'service': 'logging.googleapis.com'}}]
This is the recipe:
source:
  type: bigquery-usage
  config:
    # Coordinates
    projects:
      - <project1>
      - <project2>
    max_query_duration: 5

sink:
  type: "datahub-rest"
  config:
    server: <ip>
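Since the RATE_LIMIT_EXCEEDED error comes from the Cloud Logging read API, one way to reduce pressure on it is to narrow the time window the source scans (by default it can look much further back, which multiplies read requests). A sketch of that, assuming the bigquery-usage connector accepts `start_time`/`end_time` fields in your version (check the docs for your release; the dates here are just illustrative):

```yaml
source:
  type: bigquery-usage
  config:
    projects:
      - <project1>
      - <project2>
    max_query_duration: 5
    # Assumed fields: restrict the audit-log window the source reads,
    # e.g. only the last week instead of the full default lookback.
    start_time: "2022-01-27T00:00:00Z"
    end_time: "2022-02-03T00:00:00Z"

sink:
  type: "datahub-rest"
  config:
    server: <ip>
```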
I use a k8s cronjob and this image
linkedin/datahub-ingestion:v0.8.24
with this command
args: ["ingest", "-c", "file"]
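For context, a minimal sketch of how such a CronJob might wire the recipe file into the container (the names `bigquery-usage-ingest` and `datahub-ingestion-recipe`, the schedule, and the mount path are all made up for illustration; only the image and the `ingest -c` args come from the thread):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: bigquery-usage-ingest        # hypothetical name
spec:
  schedule: "0 2 * * *"              # example: daily at 02:00
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: ingest
              image: linkedin/datahub-ingestion:v0.8.24
              # Point -c at wherever the recipe file is mounted
              args: ["ingest", "-c", "/etc/recipe/recipe.yaml"]
              volumeMounts:
                - name: recipe
                  mountPath: /etc/recipe
          volumes:
            - name: recipe
              configMap:
                name: datahub-ingestion-recipe  # hypothetical ConfigMap holding the recipe
```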
Thx 😄.
Also, I think if I run it locally via the CLI it will work, but I will test it.
d
Are you saying that it works if you run locally but you get rate limited if you run from cronjob? Do you use the same service account? Do you ingest the same data?
f
I remember that I ran it via the CLI on my PC and it worked, but I will run a test to be sure.
Yea, I use a service account on k8s and locally (the same service account)
I believe it is this API.
I also started a local run; I will come back after it finishes.
Yep, locally it works:
{'records_written': 2356,
 'warnings': [],
 'failures': [],
 'downstream_start_time': datetime.datetime(2022, 2, 3, 9, 22, 22, 506673),
 'downstream_end_time': datetime.datetime(2022, 2, 3, 9, 28, 35, 830540),
 'downstream_total_latency_in_seconds': 373.323867}
strange 😂
d
wow
Is this the same ingestion recipe and the same service account?
f
yep
I think the problem is in the project where I have DataHub; probably Google has some limit and I need to increase it.
d
It can be; it seems like you can edit quotas -> https://cloud.google.com/logging/quotas
f
Yea, right now I have only 60.
I will edit it, but I need to wait for Google to approve it.
b
Did updating the quota fix this? Also how much did you increase it?
Nvm, I never realized I had my recipe looking way back in the past. I set it to check back only a week and it's fine.