# ask-ai
e
Hi there! What is the most efficient way to go about loading log data into BigQuery from a cloud function? Is it better to do it in two steps: store the logs in the most cost-efficient way first, and then replicate the data (maybe filtered) into BQ in batches?
r
Are these logs from https://cloud.google.com/logging, or application-specific logs that you want to push to GBQ?
e
It's the logs from Cloud Logging yes
👍 1
n
https://cloud.google.com/logging/docs/export/bigquery — maybe consider using a sink with BQ as the destination rather than trying to make something custom.
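For reference, a sink like that can be set up with `gcloud` along these lines — a minimal sketch, assuming the project, dataset, and filter below are placeholders you'd swap for your own:

```shell
# Create a BigQuery dataset to receive the routed log entries
# ("my-project" and "cf_logs" are placeholder names).
bq mk --project_id=my-project --dataset cf_logs

# Create a Cloud Logging sink that routes matching entries to that dataset.
# The --log-filter narrows the volume (and cost) before anything lands in BQ;
# this example filter keeps only Cloud Functions logs.
gcloud logging sinks create cf-logs-to-bq \
  bigquery.googleapis.com/projects/my-project/datasets/cf_logs \
  --log-filter='resource.type="cloud_function"'

# The create command prints a service account for the sink; grant it
# write access to the dataset so routing actually succeeds, e.g. the
# BigQuery Data Editor role on the dataset or project.
```

Filtering at the sink level is usually the cheapest lever, since you only pay BQ storage/ingest for entries that match.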
👍 2
r
+1 for the above approach
e
Thanks 🙂 I did that and routed them directly into BQ. I'll monitor the volume and associated cost.