# releases-and-early-demos
koji matsumoto
Hi everyone 👋 I wrote a blog post about operating Airbyte in production on GCP. It covers implementing monitoring and config backups using GCP resources. https://dev.to/kojim/starting-small-airbyte-on-gcp-1j19
m
Sooooo cool!!!!
Thank you for sharing
Jonas Bolin
@koji matsumoto would you share your Cloud Function for backing up the config YAML?
Great article btw!
koji matsumoto
I'll share just an excerpt of the main process. It's written in Python, as follows:
import urllib.request

from google.cloud import storage

# export Airbyte config
req = urllib.request.Request(AIRBYTE_CONFIG_ENDPOINT, method='POST')
with urllib.request.urlopen(req) as data:
    conf_data = data.read()

# upload Airbyte config to GCS
client = storage.Client()
bucket = client.get_bucket(BUCKET)
blob = bucket.blob(GCS_PATH)
blob.upload_from_string(conf_data, content_type='application/x-gzip')
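For anyone wondering where that excerpt lives: here's a minimal sketch of the surrounding function, assuming a Pub/Sub-triggered entry point (e.g. fired by Cloud Scheduler). The constants, the entry-point name, the endpoint path, and the internal IP below are illustrative placeholders, not the exact values from my deployment.

import datetime
import urllib.request

from google.cloud import storage

# Illustrative placeholders -- substitute your own values.
# The endpoint assumes Airbyte's config export API is reachable on the
# VM's internal VPC IP (more on that below).
AIRBYTE_CONFIG_ENDPOINT = 'http://10.128.0.5:8001/api/v1/deployment/export'
BUCKET = 'my-airbyte-backups'


def backup_airbyte_config(event, context):
    """Pub/Sub-triggered entry point, e.g. invoked daily by Cloud Scheduler."""
    # export Airbyte config (the API returns a gzipped archive)
    req = urllib.request.Request(AIRBYTE_CONFIG_ENDPOINT, method='POST')
    with urllib.request.urlopen(req) as data:
        conf_data = data.read()

    # timestamped object name so older backups are retained
    gcs_path = datetime.datetime.now(datetime.timezone.utc).strftime(
        'airbyte_config/%Y%m%d%H%M%S.tar.gz')

    # upload the archive to GCS
    client = storage.Client()
    bucket = client.get_bucket(BUCKET)
    blob = bucket.blob(gcs_path)
    blob.upload_from_string(conf_data, content_type='application/x-gzip')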
Jonas Bolin
Cheers! @koji matsumoto If you don't mind another question from a GCP beginner: when calling the Airbyte endpoint from a Cloud Function within the same GCP project, do you use the IP of the Compute Engine instance, or how do you access the API?
m
@Jonas Bolin without having the full context (planning on reading the post tomorrow): Yes, that's basically it. To make it secure you can put the CF and the VM on the same VPC.
koji matsumoto
@Jonas Bolin Yes, exactly. I have the Cloud Function access the internal VPC IP address attached to the Compute Engine instance. For that I'm using Serverless VPC Access to connect the Cloud Function to the VPC.
Thank you for your question; I forgot to cover that point in the article. I've updated the post!
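In case it helps anyone wiring this up: a small sketch for sanity-checking that the connector works from inside the function, assuming the Airbyte server's health endpoint is reachable on the VM's internal IP (the IP, port, and path here are placeholders for your own setup):

import urllib.request

# Placeholder internal VPC IP of the Compute Engine VM running Airbyte;
# reachable from the Cloud Function only through the Serverless VPC
# Access connector.
AIRBYTE_HEALTH_ENDPOINT = 'http://10.128.0.5:8001/api/v1/health'


def check_airbyte_reachable() -> bool:
    """Return True if the Airbyte API answers over the VPC connector."""
    try:
        with urllib.request.urlopen(AIRBYTE_HEALTH_ENDPOINT, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False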