Ali Kelkawi
05/28/2024, 3:19 PM

kapa.ai
05/28/2024, 3:19 PM
1. Create a Kubernetes Secret: Create a file named airbyte-logs-secrets.yaml to store the AWS keys and other information:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-logs-secrets
type: Opaque
stringData:
  AWS_KEY: <AWS_KEY>
  AWS_SECRET_KEY: <AWS_SECRET_KEY>
  S3_LOG_BUCKET: <BUCKET_NAME>
  S3_LOG_BUCKET_REGION: <REGION>
```
2. Apply the Secret: Run the following command to create the secret in the namespace where you're running Airbyte:
```sh
kubectl apply -f airbyte-logs-secrets.yaml -n <NAMESPACE>
```
3. Update Helm Values: Modify your values.yaml file to use S3 for external logs:
```yaml
global:
  state:
    storage:
      type: "S3"
  logs:
    accessKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_KEY"
    secretKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_SECRET_KEY"
  storage:
    type: "S3"
  minio:
    enabled: false
```
4. Add Extra Environment Variables: Add extra environment variables to the worker and server blocks:
```yaml
worker:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: STATE_STORAGE_S3_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_BUCKET_NAME
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET
    - name: STATE_STORAGE_S3_REGION
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET_REGION
```
Similarly, update the server block:
```yaml
server:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
```
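After upgrading the release with these values, a quick sanity check is to confirm the secret exists and that the state-storage variables actually reached the pods. A minimal sketch, assuming the worker Deployment is named airbyte-worker (the name may differ in your install):
```sh
# Confirm the secret exists and lists the expected keys
kubectl describe secret airbyte-logs-secrets -n <NAMESPACE>

# Check that the worker picked up the state-storage variables
# (deployment name "airbyte-worker" is an assumption; adjust to your release)
kubectl exec -n <NAMESPACE> deploy/airbyte-worker -- env | grep -E 'STATE_STORAGE_S3|AWS_'
```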
Ali Kelkawi
05/28/2024, 3:21 PM
"Caused by: software.amazon.awssdk.core.exception.SdkClientException: Unable to execute HTTP request: airbyte-dev-logs.s3.<bucket_name>.amazonaws.com"

kapa.ai
05/28/2024, 3:21 PM
Caused by: software.amazon.awssdk.core.exception.SdkClientException: Unable to execute HTTP request: airbyte-dev-logs.s3.<bucket_name>.amazonaws.com
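The hostname in the error has the bucket name where the region normally sits (regional S3 endpoints follow <bucket>.s3.<region>.amazonaws.com), so the value stored as S3_LOG_BUCKET_REGION is worth double-checking. A minimal sketch for verifying the credentials, bucket, and region outside of Airbyte with the AWS CLI (assuming the CLI is installed and the placeholders match your setup):
```sh
# List the log bucket with the same key pair Airbyte is using
AWS_ACCESS_KEY_ID=<AWS_KEY> AWS_SECRET_ACCESS_KEY=<AWS_SECRET_KEY> \
  aws s3 ls s3://<BUCKET_NAME> --region <REGION>

# Confirm which region the bucket actually lives in
aws s3api get-bucket-location --bucket <BUCKET_NAME>
```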
### Example Configuration for S3 Logging
Ensure your Kubernetes secret and Helm values are correctly set up as follows:
Kubernetes Secret (airbyte-logs-secrets.yaml):
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-logs-secrets
type: Opaque
stringData:
  AWS_KEY: <AWS_KEY>
  AWS_SECRET_KEY: <AWS_SECRET_KEY>
  S3_LOG_BUCKET: <BUCKET_NAME>
  S3_LOG_BUCKET_REGION: <REGION>
```
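If logging still fails, it can help to decode what was actually stored in the secret; a minimal sketch, assuming kubectl access to the namespace:
```sh
# Decode the stored values to make sure bucket and region were not swapped
kubectl get secret airbyte-logs-secrets -n <NAMESPACE> -o jsonpath='{.data.S3_LOG_BUCKET}' | base64 --decode; echo
kubectl get secret airbyte-logs-secrets -n <NAMESPACE> -o jsonpath='{.data.S3_LOG_BUCKET_REGION}' | base64 --decode; echo
```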
Helm Values (values.yaml):
```yaml
global:
  state:
    storage:
      type: "S3"
  logs:
    accessKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_KEY"
    secretKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_SECRET_KEY"
  storage:
    type: "S3"
  minio:
    enabled: false
```
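Before upgrading, the rendered manifests can be inspected to see how these values are wired into the pods. A minimal sketch, assuming the release is named airbyte, the chart is airbyte/airbyte, and the Airbyte Helm repo is already added (adjust names to your install):
```sh
# Render locally and look for the S3/state-storage settings
helm template airbyte airbyte/airbyte -n <NAMESPACE> -f values.yaml | grep -iE 'state_storage|s3_log|minio'

# Apply the updated values
helm upgrade --install airbyte airbyte/airbyte -n <NAMESPACE> -f values.yaml
```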
Extra Environment Variables for Worker and Server:
```yaml
worker:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: STATE_STORAGE_S3_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_BUCKET_NAME
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET
    - name: STATE_STORAGE_S3_REGION
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET_REGION
```
Similarly, update the server block:
```yaml
server:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
```
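Once the pods restart with the new configuration, the worker logs can be watched to see whether the SdkClientException goes away. A minimal sketch (the Deployment name airbyte-worker is an assumption; adjust to your release):
```sh
# Look for recent S3 client errors in the worker
kubectl logs -n <NAMESPACE> deploy/airbyte-worker --tail=200 | grep -i sdkclientexception
```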
Ali Kelkawi
05/28/2024, 3:31 PM

kapa.ai
05/28/2024, 3:32 PM
1. Create a Kubernetes Secret: Create a file named airbyte-logs-secrets.yaml to store the AWS keys and other information, including the custom endpoint:
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-logs-secrets
type: Opaque
stringData:
  AWS_KEY: <AWS_KEY>
  AWS_SECRET_KEY: <AWS_SECRET_KEY>
  S3_LOG_BUCKET: <BUCKET_NAME>
  S3_LOG_BUCKET_REGION: <REGION>
  S3_ENDPOINT: <CUSTOM_ENDPOINT> # Add your custom endpoint here
```
2. Apply the Secret: Run the following command to create the secret in the namespace where you're running Airbyte:
```sh
kubectl apply -f airbyte-logs-secrets.yaml -n <NAMESPACE>
```
3. Update Helm Values: Modify your values.yaml file to use S3 for external logs and include the custom endpoint:
```yaml
global:
  state:
    storage:
      type: "S3"
  logs:
    accessKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_KEY"
    secretKey:
      password: ""
      existingSecret: "airbyte-logs-secrets"
      existingSecretKey: "AWS_SECRET_KEY"
  storage:
    type: "S3"
  minio:
    enabled: false
```
4. Add Extra Environment Variables: Add extra environment variables to the worker and server blocks, including the custom endpoint:
```yaml
worker:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: AWS_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_KEY
    - name: STATE_STORAGE_S3_SECRET_ACCESS_KEY
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: AWS_SECRET_KEY
    - name: STATE_STORAGE_S3_BUCKET_NAME
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET
    - name: STATE_STORAGE_S3_REGION
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_LOG_BUCKET_REGION
    - name: STATE_STORAGE_S3_ENDPOINT # Add the custom endpoint here
      valueFrom:
        secretKeyRef:
          name: airbyte-logs-secrets
          key: S3_ENDPOINT
```
Similarly, update the server block:
```yaml
server:
  extraEnv:
    - name: AWS_ACCESS_KEY_ID
```
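To rule out the endpoint itself, the same credentials can be tested directly against the custom endpoint with the AWS CLI; a minimal sketch, assuming the CLI is installed and the placeholders match the secret above:
```sh
# List the log bucket through the custom S3-compatible endpoint (e.g. MinIO or another gateway)
AWS_ACCESS_KEY_ID=<AWS_KEY> AWS_SECRET_ACCESS_KEY=<AWS_SECRET_KEY> \
  aws s3 ls s3://<BUCKET_NAME> --endpoint-url <CUSTOM_ENDPOINT> --region <REGION>
```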