few-air-56117
02/03/2022, 7:13 AM
Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'logging.googleapis.com' for consumer 'project_number:491986273194'. [{'@type': 'type.googleapis.com/google.rpc.ErrorInfo', 'reason': 'RATE_LIMIT_EXCEEDED', 'domain': 'googleapis.com', 'metadata': {'consumer': 'projects/491986273194', 'quota_metric': 'logging.googleapis.com/read_requests', 'quota_limit': 'ReadRequestsPerMinutePerProject', 'service': 'logging.googleapis.com'}}]
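The RATE_LIMIT_EXCEEDED above means the source is issuing reads against logging.googleapis.com faster than the per-minute quota allows. A generic way to soften such errors (a sketch of the usual pattern, not DataHub's actual internal behaviour) is exponential backoff around the failing call; the `with_backoff` helper and the commented usage line are hypothetical, not taken from the connector:

```python
import time
import random


def with_backoff(call, is_rate_limited, max_retries=5, base_delay=1.0):
    """Retry `call` with exponential backoff while `is_rate_limited`
    says the raised exception is a quota/rate-limit error."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception as exc:
            if not is_rate_limited(exc) or attempt == max_retries - 1:
                raise
            # Sleep base, 2*base, 4*base, ... plus jitter so that
            # parallel workers do not retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))


# Hypothetical usage: wrap a read against logging.googleapis.com.
# entries = with_backoff(lambda: client.list_entries(),
#                        lambda e: "RATE_LIMIT_EXCEEDED" in str(e))
```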
This is the recipe:
source:
  type: bigquery-usage
  config:
    # Coordinates
    projects:
      - <project1>
      - <project2>
    max_query_duration: 5
sink:
  type: "datahub-rest"
  config:
    server: <ip>
I use a k8s CronJob with the image linkedin/datahub-ingestion:v0.8.24 and this command:
args: ["ingest", "-c", "file"]
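For context, a CronJob wired up this way might look roughly like the sketch below. The schedule, ConfigMap name, and recipe path are assumptions for illustration, not details from the thread:

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: datahub-bigquery-usage
spec:
  schedule: "0 2 * * *"            # assumed: run once a day at 02:00
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: ingest
              image: linkedin/datahub-ingestion:v0.8.24
              args: ["ingest", "-c", "/etc/recipe/recipe.yaml"]  # assumed recipe path
              volumeMounts:
                - name: recipe
                  mountPath: /etc/recipe
          volumes:
            - name: recipe
              configMap:
                name: datahub-recipe  # assumed ConfigMap holding the recipe above
```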
Thx 😄.

few-air-56117
02/03/2022, 7:13 AM

dazzling-judge-80093
02/03/2022, 7:16 AM

few-air-56117
02/03/2022, 7:17 AM

few-air-56117
02/03/2022, 7:17 AM

few-air-56117
02/03/2022, 7:22 AM

few-air-56117
02/03/2022, 7:24 AM

few-air-56117
02/03/2022, 7:28 AM
{'records_written': 2356,
 'warnings': [],
 'failures': [],
 'downstream_start_time': datetime.datetime(2022, 2, 3, 9, 22, 22, 506673),
 'downstream_end_time': datetime.datetime(2022, 2, 3, 9, 28, 35, 830540),
 'downstream_total_latency_in_seconds': 373.323867}

few-air-56117
02/03/2022, 7:29 AM

dazzling-judge-80093
02/03/2022, 7:35 AM

dazzling-judge-80093
02/03/2022, 7:35 AM

few-air-56117
02/03/2022, 7:37 AM

few-air-56117
02/03/2022, 7:39 AM

dazzling-judge-80093
02/03/2022, 7:41 AM

few-air-56117
02/03/2022, 7:42 AM

few-air-56117
02/03/2022, 7:42 AM

bland-lighter-26751
12/12/2022, 6:19 PM

bland-lighter-26751
12/12/2022, 6:36 PM