# ask-ai
m
No matter how I configure my values.yaml file, I cannot get Airbyte to resolve the correct storage bucket. I am using Spaces object storage in DigitalOcean. My credentials are correct, but it keeps changing my path to domain.s3.nyc3.amazonaws.com instead of domain.nyc3.digitaloceanspaces.com. How do I configure my values.yaml file so that it will resolve the correct endpoint? Until I get this fixed I cannot add sources or destinations.
k
To configure Airbyte to use DigitalOcean Spaces (or any S3-compatible storage) and have it resolve the correct endpoint (e.g., domain.nyc3.digitaloceanspaces.com), you need to set a custom S3 endpoint in your `values.yaml`. However, according to the knowledge sources, Airbyte's Helm chart currently does not natively support setting a custom S3 endpoint when using the `s3` storage type; this is a known limitation and a frequently requested feature in the community. The documentation suggests that if you are using an S3-compatible solution you should "use the S3 type and provide an endpoint key/value as needed," but users have reported that this does not work as expected in practice, and the endpoint defaults to AWS S3 instead of your custom DigitalOcean Spaces endpoint [State and Logging Storage](https://docs.airbyte.com/platform/deploying-airbyte/integrations/storage) [Github Issue 44174](https://github.com/airbytehq/airbyte/issues/44174).
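For context, the `s3`-type configuration that people typically try looks roughly like the sketch below; the placement of the `endpoint` key is an assumption based on the docs' "endpoint key/value" wording, and this is the setup that in practice keeps resolving to `*.s3.amazonaws.com` rather than the Spaces endpoint:

```yaml
# Sketch only: key placement for "endpoint" is an assumption based on the storage docs.
global:
  storage:
    type: "s3"
    secretName: airbyte-config-secrets
    bucket:
      log: airbyte-prod
      state: airbyte-prod
      workloadOutput: airbyte-prod
    s3:
      region: "nyc3"                                    # Spaces region, not an AWS region
      endpoint: "https://nyc3.digitaloceanspaces.com"   # reportedly ignored; requests still go to AWS S3
      authenticationType: credentials
```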
Workaround: Some users have successfully configured DigitalOcean Spaces by using the `minio` storage type instead of `s3`, as the `minio` type allows you to specify a custom endpoint. Here's an example configuration that has worked for others:
```yaml
global:
  storage:
    type: "minio"
    minio:
      endpoint: "https://nyc3.digitaloceanspaces.com"
    secretName: airbyte-config-secrets
    bucket:
      log: airbyte-prod
      state: airbyte-prod
      workloadOutput: airbyte-prod
```
And your Kubernetes secret should include the relevant keys:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-config-secrets
  namespace: airbyte
type: Opaque
stringData:
  MINIO_ACCESS_KEY_ID: <your-access-key>
  MINIO_SECRET_ACCESS_KEY: <your-secret-key>
  minio-access-key-id: <your-access-key>
  minio-secret-access-key: <your-secret-key>
  s3-access-key-id: <your-access-key>
  s3-secret-access-key: <your-secret-key>
```
Limitations:
- When using the `minio` type, you may not be able to set the region parameter, which could be a limitation for some S3-compatible providers.
- The official documentation and issues indicate that full support for custom S3 endpoints with the `s3` type is not yet available, and this workaround is the best available option until the feature is implemented [Github Issue 44174](https://github.com/airbytehq/airbyte/issues/44174).

Summary:
- Use `type: minio` and set the `endpoint` to your DigitalOcean Spaces endpoint.
- Place your credentials in the secret as shown above.
- This should direct Airbyte to use the correct endpoint for DigitalOcean Spaces.

If you need both custom region and endpoint support, you may be limited by current Airbyte Helm chart capabilities. The community is actively requesting this feature, but as of now, this workaround is the most reliable approach. For more details and user experiences, see the discussion in [Github Issue 44174](https://github.com/airbytehq/airbyte/issues/44174).