# ask-ai
p
Hi, I'm trying to import existing sources into terraform but, after import, plan shows a lot of changes in configuration, is this expected?
k
When importing existing sources into Terraform, it is possible to see a lot of changes in the configuration during the `terraform plan` phase. This can happen due to differences in how configurations are represented or managed between the Airbyte UI and Terraform. Here are a few steps you can take to address this:

1. **Review the Plan**: Before applying any changes, always review the plan to understand what Terraform will do. This helps in identifying any unexpected changes.

```bash
terraform plan
```
2. **Configuration Differences**: Ensure that the configurations in Terraform match those in the Airbyte UI. Sometimes, default values or optional parameters might be set differently, leading to discrepancies.
3. **Manual Adjustments**: You might need to manually adjust the Terraform configuration to align with the existing setup in Airbyte. For example, setting optional parameters to empty strings if they are set to `null` by default.
4. **Use the Official Terraform Provider**: If you are using an unofficial CLI or method, consider switching to the official Terraform provider for Airbyte, which might handle configurations more consistently.
   - [Airbyte Terraform Provider](https://registry.terraform.io/providers/airbytehq/airbyte/0.1.0)
5. **Validation Errors**: Be aware of potential validation errors that might arise due to format changes. For instance, date formats might be transformed incorrectly, leading to validation issues. Here is an example of a validation error that might occur:
```
Request:
POST /v1/sources HTTP/1.1
Host: airbyte.development.myurl.com:8006
Accept: application/json
Authorization: Basic YWlyYnl0ZTpwYXNzd29yZA==
Content-Type: application/json
User-Agent: speakeasy-sdk/go 0.3.5 2.210.3 1.0.0 airbyte

{"configuration":{"bucket":"data-development","endpoint":null,"schema":"{}","sourceType":"s3","start_date":"2023-10-01T00:00:00Z","streams":[{"days_to_sync_if_history_is_full":3,"format":{"filetype":"jsonl"},"globs":["*/*/**/final.json"],"name":"video_jsons_s3_data","schemaless":true,"validation_policy":"Emit Record"}]},"name":"video_jsons_s3","workspaceId":"40648797-0936-4145-ad22-031e23b363ae"}

Response:
HTTP/1.1 422 Unprocessable Entity
Content-Length: 443
Connection: keep-alive
Content-Type: application/problem+json
Date: Fri, 16 Feb 2024 13:14:45 GMT
Server: nginx/1.25.3
```
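As a concrete workflow for reconciling an imported resource, something like the following can help (a hedged sketch: the resource address and `<source-id>` are placeholders based on the S3 example above, not values from your state):

```sh
# Import the existing Airbyte source into Terraform state
# (<source-id> is the UUID of the source in Airbyte; placeholder here).
terraform import airbyte_source_s3.video_jsons_s3 <source-id>

# Dump what Terraform recorded for the resource, then adjust the HCL
# until a fresh plan no longer reports spurious changes.
terraform state show airbyte_source_s3.video_jsons_s3
terraform plan
```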
For more detailed guidance, you can refer to the [Airbyte documentation](https://airbyte.com/tutorials/version-control-airbyte-configurations) on version control and configuration management. If you continue to face issues, consider reaching out to the Airbyte community or support for further assistance.
p
@kapa.ai, but it is saying, in some cases, that all configuration is new. example:
```
~ resource "airbyte_source_facebook_marketing" "airbyte_source_facebook_marketing" {
      ~ configuration = {
          + access_token                  = (sensitive value)
          + account_ids                   = [
              + "sensitive",
            ]
          + action_breakdowns_allow_empty = true
          + fetch_thumbnail_images        = false
          + include_deleted               = false
          + insights_job_timeout          = 60
          + insights_lookback_window      = 28
          + page_size                     = 100
          + start_date                    = "2023-01-01T00:00:00Z"
        }
        name          = "Facebook Marketing"
        # (3 unchanged attributes hidden)
    }
```
k
It looks like you are encountering an issue where the configuration for the `airbyte_source_facebook_marketing` resource is being marked as entirely new, even though some attributes are unchanged. This can happen for various reasons, such as changes in sensitive values or differences in how the configuration is interpreted. Here are a few steps you can take to troubleshoot and resolve this issue:

1. **Check Sensitive Values**: Ensure that the sensitive values (like `access_token` and `account_ids`) are correctly managed. Sensitive values can sometimes cause the entire configuration to appear as new if they are not handled consistently.
2. **Review Configuration Changes**: Compare the current configuration with the previous one to identify any subtle differences. Even minor changes can cause the configuration to be marked as new.
3. **Set Default Values Explicitly**: Ensure that all optional fields have explicit values set, even if they are defaults. This can help avoid discrepancies.
4. **Use Version Control**: If you are using version control, review the changes in the configuration file to see if there are any unintended modifications.

Here is an example of how the configuration might look:
```hcl
resource "airbyte_source_facebook_marketing" "airbyte_source_facebook_marketing" {
  configuration = {
    access_token                  = (sensitive value)
    account_ids                   = [
      "sensitive",
    ]
    action_breakdowns_allow_empty = true
    fetch_thumbnail_images        = false
    include_deleted               = false
    insights_job_timeout          = 60
    insights_lookback_window      = 28
    page_size                     = 100
    start_date                    = "2023-01-01T00:00:00Z"
  }
  name = "Facebook Marketing"
  # (3 unchanged attributes hidden)
}
```
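One way to see exactly which attributes Terraform holds in state (and therefore why it treats the whole `configuration` block as new) is to print the resource from state and compare it field by field with your HCL. A hedged sketch, reusing the resource address from the example above:

```sh
# Print the attributes currently recorded in state for this resource.
# Sensitive values are redacted in the human-readable output.
terraform state show \
  'airbyte_source_facebook_marketing.airbyte_source_facebook_marketing'
```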
For more detailed information on configuring the Facebook Marketing source in Airbyte, you can refer to the [setup guide](https://docs.airbyte.com/integrations/sources/facebook-marketing#set-up-facebook-marketing-as-a-source-in-airbyte) and the [source settings](https://docs.airbyte.com/integrations/sources/facebook-marketing#facebook-marketing-source-settings). If the issue persists, you may want to reach out to the Airbyte community or support for further assistance.
p
@kapa.ai I am now getting this error when trying to upgrade Airbyte:

```
Normal   Scheduled  15s               default-scheduler  Successfully assigned airbyte/airbyte-airbyte-bootloader to gke-prod-cluster-v2-deployments-pool-33aafffb-uqoi
Normal   Pulled     2s (x3 over 14s)  kubelet            Container image "airbyte/bootloader:0.61.0" already present on machine
Warning  Failed     2s (x3 over 14s)  kubelet            Error: couldn't find key DATABASE_USER in Secret airbyte/airbyte-airbyte-secrets
```

Please provide me the proper configuration I should pass.
k
It looks like you are encountering an error related to the `DATABASE_USER` key not being found in the `airbyte-airbyte-secrets` secret when trying to upgrade Airbyte. This issue is likely due to missing or incorrect configuration in your `values.yaml` file. To resolve this, you need to ensure that `DATABASE_USER` and the other related database credentials are correctly specified in your `values.yaml` file. Here is an example configuration based on the provided knowledge sources:
```yaml
postgresql:
  enabled: false

externalDatabase:
  host: "<your-database-host>"
  port: "<your-database-port>"
  database: "<your-database-name>"
  user: "<your-database-user>"
  existingSecret: "airbyte-airbyte-secrets"
  existingSecretPasswordKey: "DATABASE_PASSWORD"

global:
  database:
    secretName: "airbyte-airbyte-secrets"
    secretValue: "DATABASE_PASSWORD"
    host: "<your-database-host>"
    port: "<your-database-port>"
    user: "<your-database-user>"
```
Make sure to replace the placeholders (`<your-database-host>`, `<your-database-port>`, `<your-database-name>`, `<your-database-user>`) with your actual database details. Additionally, ensure that the `airbyte-airbyte-secrets` secret in your Kubernetes cluster contains the necessary keys (`DATABASE_USER`, `DATABASE_PASSWORD`, etc.). You can create or update the secret using the following command:
```sh
kubectl create secret generic airbyte-airbyte-secrets \
  --from-literal=DATABASE_USER=<your-database-user> \
  --from-literal=DATABASE_PASSWORD=<your-database-password> \
  --from-literal=DATABASE_HOST=<your-database-host> \
  --from-literal=DATABASE_PORT=<your-database-port>
```

Replace the placeholders with your actual database credentials.
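To confirm the key exists under the exact name the bootloader expects, you can inspect the secret directly. A hedged sketch, assuming the release lives in the `airbyte` namespace:

```sh
# List the keys stored in the secret; DATABASE_USER must appear here.
kubectl describe secret airbyte-airbyte-secrets -n airbyte

# Decode a single key to double-check its value.
kubectl get secret airbyte-airbyte-secrets -n airbyte \
  -o jsonpath='{.data.DATABASE_USER}' | base64 --decode
```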
For more detailed information on configuring the Airbyte database, you can refer to the [documentation](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-the-airbyte-database). If you continue to experience issues, you may want to review the related GitHub issues and discussions for additional context and potential solutions:

- [Issue #18637](https://github.com/airbytehq/airbyte/issues/18637)
- [Discussion #33835](https://github.com/airbytehq/airbyte/discussions/33835)
- [Issue #18539](https://github.com/airbytehq/airbyte/issues/18539)
- [Issue #28914](https://github.com/airbytehq/airbyte/issues/28914)

These resources provide insights into similar problems and their resolutions.
p
@kapa.ai I have it configured to use another secret - existingSecret: "airbyte-db-secrets", but still facing the same error
k
If you have configured Airbyte to use another secret (`existingSecret: "airbyte-db-secrets"`) but are still facing the same error, you need to ensure that the `airbyte-db-secrets` secret contains all the necessary keys, including `DATABASE_USER`. Here is an example configuration for your `values.yaml` file:
```yaml
postgresql:
  enabled: false

externalDatabase:
  host: "<your-database-host>"
  port: "<your-database-port>"
  database: "<your-database-name>"
  user: "<your-database-user>"
  existingSecret: "airbyte-db-secrets"
  existingSecretPasswordKey: "DATABASE_PASSWORD"

global:
  database:
    secretName: "airbyte-db-secrets"
    secretValue: "DATABASE_PASSWORD"
    host: "<your-database-host>"
    port: "<your-database-port>"
    user: "<your-database-user>"
```
Make sure to replace the placeholders (`<your-database-host>`, `<your-database-port>`, `<your-database-name>`, `<your-database-user>`) with your actual database details. Next, ensure that the `airbyte-db-secrets` secret in your Kubernetes cluster contains the necessary keys (`DATABASE_USER`, `DATABASE_PASSWORD`, etc.). You can create or update the secret using the following command:
```sh
kubectl create secret generic airbyte-db-secrets \
  --from-literal=DATABASE_USER=<your-database-user> \
  --from-literal=DATABASE_PASSWORD=<your-database-password> \
  --from-literal=DATABASE_HOST=<your-database-host> \
  --from-literal=DATABASE_PORT=<your-database-port>
```

Replace the placeholders with your actual database credentials.
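Before retrying the upgrade, it is worth verifying that the chart and the secret agree on key names. A hedged sketch, again assuming the `airbyte` namespace:

```sh
# Show which keys the secret actually contains; the bootloader error means
# it looked for a key named exactly DATABASE_USER and did not find one.
kubectl describe secret airbyte-db-secrets -n airbyte
```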
For more detailed information on configuring the Airbyte database, you can refer to the [documentation](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-the-airbyte-database). If you continue to experience issues, you may want to review the related GitHub issues and discussions for additional context and potential solutions:

- [Issue #18637](https://github.com/airbytehq/airbyte/issues/18637)
- [Issue #18539](https://github.com/airbytehq/airbyte/issues/18539)
- [Issue #36238](https://github.com/airbytehq/airbyte/issues/36238)

These resources provide insights into similar problems and their resolutions.
p
@kapa.ai - I am now getting this error:

```
io.temporal.failure.ActivityFailure: Activity with activityType='RunWithJobOutput' failed: 'Activity task failed'. scheduledEventId=12, startedEventId=13, activityId=ed73e42d-a3af-3253-b163-07b74011410d, identity='1@airbyte-worker-5dbc588585-4fwv5', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
    at java.base/java.lang.Thread.getStackTrace(Thread.java:2450)
    at io.temporal.internal.sync.ActivityStubBase.execute(ActivityStubBase.java:49)
    at io.temporal.internal.sync.ActivityInvocationHandler.lambda$getActivityFunc$0(ActivityInvocationHandler.java:83)
    at io.temporal.internal.sync.ActivityInvocationHandlerBase.invoke(ActivityInvocationHandlerBase.java:60)
    at jdk.proxy2/jdk.proxy2.$Proxy119.runWithJobOutput(Unknown Source)
    at io.airbyte.workers.temporal.check.connection.CheckConnectionWorkflowImpl.run(CheckConnectionWorkflowImpl.java:55)
    at CheckConnectionWorkflowImplProxy.run$accessor$IB6TxDYv(Unknown Source)
    at CheckConnectionWorkflowImplProxy$auxiliary$nWwpDs2D.call(Unknown Source)
    at io.airbyte.micronaut.temporal.TemporalActivityStubInterceptor.execute(TemporalActivityStubInterceptor.java:79)
    at CheckConnectionWorkflowImplProxy.run(Unknown Source)
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
    at java.base/java.lang.reflect.Method.invoke(Method.java:580)
    at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation$RootWorkflowInboundCallsInterceptor.execute(POJOWorkflowImplementationFactory.java:339)
    at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation.execute(POJOWorkflowImplementationFactory.java:314)
    at io.temporal.internal.sync.WorkflowExecutionHandler.runWorkflowMethod(WorkflowExecutionHandler.java:70)
    at io.temporal.internal.sync.SyncWorkflow.lambda$start$0(SyncWorkflow.java:135)
    at io.temporal.internal.sync.CancellationScopeImpl.run(CancellationScopeImpl.java:102)
    at io.temporal.internal.sync.WorkflowThreadImpl$RunnableWrapper.run(WorkflowThreadImpl.java:107)
    at io.temporal.worker.ActiveThreadReportingExecutor.lambda$submit$0(ActiveThreadReportingExecutor.java:53)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
```
k
It looks like you are encountering an `ActivityFailure` error related to the `RunWithJobOutput` activity in the `CheckConnectionWorkflowImpl`. This error indicates that the activity task has failed and reached the maximum number of retry attempts. Here are some steps you can take to troubleshoot and resolve this issue:

1. **Check the Logs**: Review the logs for more detailed error messages that can provide insights into why the `RunWithJobOutput` activity is failing. Look for any specific exceptions or error messages that can help identify the root cause.
2. **Review Configuration**: Ensure that the configuration for the `CheckConnectionWorkflowImpl` is correct. Verify that all required parameters and settings are properly configured.
3. **Increase Retry Attempts**: If the failure is due to transient issues, you might want to increase the number of retry attempts for the activity. This can be done by adjusting the retry policy in your workflow configuration.
4. **Inspect Dependencies**: Ensure that all dependencies and services required by the `RunWithJobOutput` activity are available and functioning correctly. This includes checking database connections, network availability, and any external services.
5. **Update Airbyte Version**: Ensure that you are using the latest version of Airbyte, as updates often include bug fixes and improvements. If you are not on the latest version, consider upgrading.

Here is an example of how the error might appear in the logs:

```plaintext
io.temporal.failure.ActivityFailure: Activity with activityType='RunWithJobOutput' failed: 'Activity task failed'. scheduledEventId=12, startedEventId=13, activityId=ed73e42d-a3af-3253-b163-07b74011410d, identity='1@airbyte-worker-5dbc588585-4fwv5', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
    at java.base/java.lang.Thread.getStackTrace(Thread.java:2450)
    at io.temporal.internal.sync.ActivityStubBase.execute(ActivityStubBase.java:49)
    ...
```
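For step 1, the worker logs usually contain the underlying exception just before Temporal gives up retrying. A hedged sketch of pulling them; the deployment name and namespace are assumptions based on a default Helm install:

```sh
# Search recent worker logs for the failing activity, with some context.
kubectl logs -n airbyte deploy/airbyte-worker --since=1h \
  | grep -i -B 2 -A 10 'RunWithJobOutput'
```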
p
@kapa.ai it seems access to the external persistence store I've configured is missing. What do I need to do to configure an external secret store in GCP?
k
To configure an external secret store in GCP for Airbyte, you need to follow these steps:

1. **Create a Service Account**: Create a service account in GCP with the necessary permissions to access Google Secret Manager.
2. **Generate a JSON Key**: Generate a JSON key for the service account.
3. **Create a Kubernetes Secret**: Store the JSON key in a Kubernetes secret.
4. **Configure Airbyte to Use Google Secret Manager**: Update the Airbyte configuration to use Google Secret Manager for storing secrets.

Here are the detailed steps:

### Step 1: Create a Service Account

1. Go to the [Google Cloud Console](https://console.cloud.google.com/).
2. Navigate to **IAM & Admin** > **Service Accounts**.
3. Click **Create Service Account**.
4. Provide a name and description for the service account.
5. Click **Create** and then **Continue**.
6. Assign the **Secret Manager Secret Accessor** role to the service account.
7. Click **Done**.

### Step 2: Generate a JSON Key

1. In the **Service Accounts** page, find the service account you created.
2. Click on the service account name.
3. Go to the **Keys** tab.
4. Click **Add Key** > **Create New Key**.
5. Select **JSON** and click **Create**.
6. Save the JSON key file to your local machine.
### Step 3: Create a Kubernetes Secret

1. Create a new file `gcp.json` containing the credentials JSON blob for the service account.
2. Use `kubectl` to create the secret directly from the `gcp.json` file:

```sh
kubectl create secret generic gcp-cred-secrets --from-file=gcp.json --namespace airbyte
```
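If you prefer the CLI to the console, steps 1 and 2 can also be scripted with `gcloud`. A hedged sketch in which the service-account name `airbyte-secrets` and the project ID are placeholders:

```sh
# Create the service account.
gcloud iam service-accounts create airbyte-secrets \
  --project <your-gcp-project-id>

# Grant it read access to Secret Manager.
gcloud projects add-iam-policy-binding <your-gcp-project-id> \
  --member "serviceAccount:airbyte-secrets@<your-gcp-project-id>.iam.gserviceaccount.com" \
  --role "roles/secretmanager.secretAccessor"

# Generate the JSON key used in step 3.
gcloud iam service-accounts keys create gcp.json \
  --iam-account "airbyte-secrets@<your-gcp-project-id>.iam.gserviceaccount.com"
```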
### Step 4: Configure Airbyte to Use Google Secret Manager

Update your `values.yaml` file to configure Airbyte to use Google Secret Manager:
```yaml
secretsManager:
  type: googleSecretManager
  storageSecretName: gcp-cred-secrets
  googleSecretManager:
    projectId: <your-gcp-project-id>
    credentialsSecretKey: gcp.json
```
Replace `<your-gcp-project-id>` with your actual GCP project ID.
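After updating `values.yaml`, re-deploy so the new settings are picked up. A hedged sketch, assuming the standard Airbyte Helm chart and a release named `airbyte`:

```sh
# Apply the updated values to the existing release.
helm repo add airbyte https://airbytehq.github.io/helm-charts
helm upgrade --install airbyte airbyte/airbyte \
  --namespace airbyte --values values.yaml
```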
### Additional Configuration

Ensure that the environment variables are set correctly in the Worker and Server applications:

```sh
export SECRET_STORE_GCP_PROJECT_ID=<your-gcp-project-id>
export SECRET_STORE_GCP_CREDENTIALS=$(cat gcp.json)
export SECRET_PERSISTENCE=GOOGLE_SECRET_MANAGER
```

### Important Notes

- Modifying the configuration of connector secret storage will cause all existing connectors to fail. You will need to recreate these connectors to ensure they are reading from the appropriate secret store.
- This process is not reversible. Once you have converted to a secret store, you won't be able to reverse it.

For more detailed information, you can refer to the [Airbyte documentation on configuring external connector secret management](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-external-connector-secret-management).
p
@kapa.ai I don't see this field in the values.yaml - `secretsManager:`
k
If you don't see the `secretsManager` field in your `values.yaml` file, you can add it manually. Here is how you can configure Airbyte to use Google Secret Manager for storing secrets:

### Step-by-Step Configuration

1. **Create a Service Account**: Create a service account in GCP with the necessary permissions to access Google Secret Manager.
2. **Generate a JSON Key**: Generate a JSON key for the service account.
3. **Create a Kubernetes Secret**: Store the JSON key in a Kubernetes secret.
4. **Update `values.yaml`**: Add the `secretsManager` configuration to your `values.yaml` file.

### Detailed Steps

#### Step 1: Create a Service Account

1. Go to the [Google Cloud Console](https://console.cloud.google.com/).
2. Navigate to **IAM & Admin** > **Service Accounts**.
3. Click **Create Service Account**.
4. Provide a name and description for the service account.
5. Click **Create** and then **Continue**.
6. Assign the **Secret Manager Secret Accessor** role to the service account.
7. Click **Done**.

#### Step 2: Generate a JSON Key

1. In the **Service Accounts** page, find the service account you created.
2. Click on the service account name.
3. Go to the **Keys** tab.
4. Click **Add Key** > **Create New Key**.
5. Select **JSON** and click **Create**.
6. Save the JSON key file to your local machine.
#### Step 3: Create a Kubernetes Secret

1. Create a new file `gcp.json` containing the credentials JSON blob for the service account.
2. Use `kubectl` to create the secret directly from the `gcp.json` file:

```sh
kubectl create secret generic gcp-cred-secrets --from-file=gcp.json --namespace airbyte
```
#### Step 4: Update `values.yaml`

Add the following configuration to your `values.yaml` file:

```yaml
secretsManager:
  type: googleSecretManager
  storageSecretName: gcp-cred-secrets
  googleSecretManager:
    projectId: <your-gcp-project-id>
    credentialsSecretKey: gcp.json
```
Replace `<your-gcp-project-id>` with your actual GCP project ID.

### Additional Configuration

Ensure that the environment variables are set correctly in the Worker and Server applications:
```sh
export SECRET_STORE_GCP_PROJECT_ID=<your-gcp-project-id>
export SECRET_STORE_GCP_CREDENTIALS=$(cat gcp.json)
export SECRET_PERSISTENCE=GOOGLE_SECRET_MANAGER
```
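Once the release has been re-deployed, you can confirm the variables actually reached the pods. A hedged sketch, assuming default deployment names in the `airbyte` namespace:

```sh
# Both the worker and the server must see the secret-persistence settings.
for d in airbyte-worker airbyte-server; do
  kubectl exec -n airbyte deploy/$d -- printenv | grep -E '^SECRET_'
done
```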
### Important Notes

- Modifying the configuration of connector secret storage will cause all existing connectors to fail. You will need to recreate these connectors to ensure they are reading from the appropriate secret store.
- This process is not reversible. Once you have converted to a secret store, you won't be able to reverse it.

For more detailed information, you can refer to the [Airbyte documentation on configuring external connector secret management](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-external-connector-secret-management).
p
@kapa.ai - same issue:

```
java.lang.RuntimeException: That secret was not found in the store! Coordinate: airbyte_workspace_fa3402a5-ecd8-45ff-9dc9-02c15e0847ab_secret_4f0c1a1e-66d9-4205-9d80-4db828ab141c_v1
    at io.airbyte.config.secrets.SecretsHelpers.getOrThrowSecretValue(SecretsHelpers.kt:288) ~[io.airbyte.airbyte-config-config-secrets-0.61.0.jar:?]
    at io.airbyte.config.secrets.SecretsHelpers.combineConfig(SecretsHelpers.kt:173) ~[io.airbyte.airbyte-config-config-secrets-0.61.0.jar:?]
    at io.airbyte.config.secrets.SecretsHelpers$combineConfig$1.invoke(SecretsHelpers.kt:183) ~[io.airbyte.airbyte-config-config-secrets-0.61.0.jar:?]
    at io.airbyte.config.secrets.SecretsHelpers$combineConfig$1.invoke(SecretsHelpers.kt:177) ~[io.airbyte.airbyte-config-config-secrets-0.61.0.jar:?]
```
k
It looks like you are still encountering an issue where a secret is not found in the store, specifically for the coordinate `airbyte_workspace_fa3402a5-ecd8-45ff-9dc9-02c15e0847ab_secret_4f0c1a1e-66d9-4205-9d80-4db828ab141c_v1`. This indicates that the secret is either missing or not correctly configured in Google Secret Manager. Here are the steps to ensure that the secret is correctly set up in Google Secret Manager and referenced in your Airbyte configuration:

### Step 1: Verify the Secret in Google Secret Manager

1. **Go to the Google Cloud Console**: Navigate to the [Secret Manager](https://console.cloud.google.com/security/secret-manager).
2. **Check for the Secret**: Ensure that the secret `airbyte_workspace_fa3402a5-ecd8-45ff-9dc9-02c15e0847ab_secret_4f0c1a1e-66d9-4205-9d80-4db828ab141c` exists and has the correct version.
3. **Add the Secret if Missing**: If the secret is missing, add it manually with the correct value. You can also check for the coordinate from the command line, as in the sketch below.
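A hedged sketch of that command-line check, assuming `gcloud` is authenticated against the correct project:

```sh
# Look for any secret whose name contains the failing workspace coordinate.
gcloud secrets list --project <your-gcp-project-id> \
  --filter 'name:airbyte_workspace_fa3402a5-ecd8-45ff-9dc9-02c15e0847ab'
```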
### Step 2: Update Kubernetes Secret

Ensure that the Kubernetes secret containing the GCP credentials is correctly set up:

```sh
kubectl create secret generic gcp-cred-secrets --from-file=gcp.json --namespace airbyte
```
### Step 3: Update `values.yaml`

Ensure that your `values.yaml` file is correctly configured to use Google Secret Manager:

```yaml
secretsManager:
  type: googleSecretManager
  storageSecretName: gcp-cred-secrets
  googleSecretManager:
    projectId: <your-gcp-project-id>
    credentialsSecretKey: gcp.json
```
Replace `<your-gcp-project-id>` with your actual GCP project ID.

### Step 4: Set Environment Variables

Ensure that the environment variables are set correctly in the Worker and Server applications:

```sh
export SECRET_STORE_GCP_PROJECT_ID=<your-gcp-project-id>
export SECRET_STORE_GCP_CREDENTIALS=$(cat gcp.json)
export SECRET_PERSISTENCE=GOOGLE_SECRET_MANAGER
```
### Additional Resources

For more detailed information, you can refer to the [Airbyte documentation on configuring external connector secret management](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-external-connector-secret-management).

### Important Notes

- Modifying the configuration of connector secret storage will cause all existing connectors to fail. You will need to recreate these connectors to ensure they are reading from the appropriate secret store.
- This process is not reversible. Once you have converted to a secret store, you won't be able to reverse it.

By following these steps, you should be able to resolve the issue with the missing secret. If you continue to experience issues, reviewing the logs and ensuring that all configurations are correct will help in troubleshooting.
p
@kapa.ai I am not running the enterprise version, do I still need to set the `secretsManager` in the yaml? I am following this:

> **Credential management**
> To allow you to configure syncs and connectors, Airbyte stores the credentials (like API Keys and passwords) you provide in the Airbyte application database. Make sure you protect the configuration management routes. If you're deploying Airbyte Open Source on GCP, you may use Google Secret Manager to store credentials instead of in the database:
> 1. Create a service account with Google Secret Manager with read/write access. Generate a JSON key.
> 2. In the Worker and Server applications, set the `SECRET_STORE_GCP_PROJECT_ID` environment variable to the GCP project to which the credentials have access and secrets will be located.
> 3. In the Worker and Server applications, set the `SECRET_STORE_GCP_CREDENTIALS` environment variable to the JSON key created in step 1.
> 4. In the Worker and Server applications, set the `SECRET_PERSISTENCE` environment variable to `GOOGLE_SECRET_MANAGER`.
>
> Note that this process is not reversible. Once you have converted to a secret store, you won't be able to reverse it.
k
If you are running the open-source version of Airbyte and want to use Google Secret Manager to store credentials instead of the database, you do not need to set the `secretsManager` field in the `values.yaml` file. Instead, you should follow the steps outlined in the documentation for credential management. Here are the steps to configure Airbyte Open Source to use Google Secret Manager:

### Step-by-Step Configuration

1. **Create a Service Account**: Create a service account in GCP with the necessary permissions to access Google Secret Manager.
2. **Generate a JSON Key**: Generate a JSON key for the service account.
3. **Set Environment Variables**: Configure the Worker and Server applications to use Google Secret Manager by setting the appropriate environment variables.

### Detailed Steps

#### Step 1: Create a Service Account

1. Go to the [Google Cloud Console](https://console.cloud.google.com/).
2. Navigate to **IAM & Admin** > **Service Accounts**.
3. Click **Create Service Account**.
4. Provide a name and description for the service account.
5. Click **Create** and then **Continue**.
6. Assign the **Secret Manager Secret Accessor** role to the service account.
7. Click **Done**.

#### Step 2: Generate a JSON Key

1. In the **Service Accounts** page, find the service account you created.
2. Click on the service account name.
3. Go to the **Keys** tab.
4. Click **Add Key** > **Create New Key**.
5. Select **JSON** and click **Create**.
6. Save the JSON key file to your local machine.

#### Step 3: Set Environment Variables

Set the following environment variables in the Worker and Server applications:
```sh
export SECRET_STORE_GCP_PROJECT_ID=<your-gcp-project-id>
export SECRET_STORE_GCP_CREDENTIALS=$(cat path/to/your/gcp.json)
export SECRET_PERSISTENCE=GOOGLE_SECRET_MANAGER
```
Replace `<your-gcp-project-id>` with your actual GCP project ID and `path/to/your/gcp.json` with the path to the JSON key file you generated.

### Important Notes

- Modifying the configuration of connector secret storage will cause all existing connectors to fail. You will need to recreate these connectors to ensure they are reading from the appropriate secret store.
- This process is not reversible. Once you have converted to a secret store, you won't be able to reverse it.

For more detailed information, you can refer to the [Airbyte documentation on credential management](https://docs.airbyte.com/operating-airbyte/security#credential-management). By following these steps, you should be able to configure Airbyte Open Source to use Google Secret Manager for storing credentials. If you encounter any issues, reviewing the logs and ensuring that all configurations are correct will help in troubleshooting.
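How you set these variables depends on how you deploy. On Kubernetes with the Helm chart, one commonly used route is the chart's `extraEnv` hooks; a hedged, illustrative sketch only, since the exact values keys vary between chart versions:

```sh
# Illustrative only: inject SECRET_PERSISTENCE into the worker via Helm.
# Check your chart version's values schema before relying on these paths.
helm upgrade airbyte airbyte/airbyte -n airbyte \
  --reuse-values \
  --set 'worker.extraEnv[0].name=SECRET_PERSISTENCE' \
  --set 'worker.extraEnv[0].value=GOOGLE_SECRET_MANAGER'
```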
c
same issue
h
Hi @Connor Beauchamp I just succeeded in deploying with the latest Postgres 16.3. I posted a solution for this issue at https://github.com/hongbo-miao/hongbomiao.com/issues/17463 After this issue there are many other issues; the workarounds and solutions in the community, especially the parts related to Temporal, are out of date. I will post all the deployment code soon as well. Just follow this ticket 😃
For this specific issue, I posted a solution at https://stackoverflow.com/a/78644975/2000548