Luis Espinoza
11/04/2024, 4:12 PM
Nivedita Baliga
11/04/2024, 4:12 PM
Patrick Henry
11/04/2024, 4:14 PM
Vineeth ER
11/04/2024, 4:15 PM
Poorna Premachandra
11/04/2024, 4:19 PM
helm install airbyte airbyte/airbyte --namespace airbyte --values ./values.yaml
I'm getting the following error.
coalesce.go:286: warning: cannot overwrite table with non table for airbyte.keycloak-setup.initContainers.keycloakReadinessCheck.image (map[])
coalesce.go:286: warning: cannot overwrite table with non table for airbyte.keycloak.initContainers.initDb.image (map[])
coalesce.go:286: warning: cannot overwrite table with non table for airbyte.keycloak-setup.initContainers.keycloakReadinessCheck.image (map[])
coalesce.go:286: warning: cannot overwrite table with non table for airbyte.keycloak.initContainers.initDb.image (map[])
Error: INSTALLATION FAILED: failed pre-install: 1 error occurred:
* timed out waiting for the condition
During the process airbyte-airbyte-bootloader starts, but after the error it gets stopped. I double-checked the YAML and it seems OK, and I didn't change any properties relating to keycloak or keycloak-setup. Also, I'm using Helm version "v3.16.2". Any idea what is causing this? Thanks.
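For reference, Helm's coalesce warning usually means one layer of values defines `image` at that path as a map (a "table") while another layer supplies a non-map value such as a string, so the two shapes can't be merged. A minimal sketch of the map shape for one of the paths named in the warning; the repository and tag values here are hypothetical:

```yaml
# Map ("table") shape for the path from the warning. A plain string like
# image: "airbyte/keycloak-init:1.0.0" at the same path is the "non table"
# that coalesce.go complains about. Repository/tag are hypothetical values.
keycloak:
  initContainers:
    initDb:
      image:
        repository: airbyte/keycloak-init
        tag: "1.0.0"
```

Note that these warnings can also come from the chart's own layered defaults and may be unrelated to the actual failure: `failed pre-install: timed out waiting for the condition` points at the pre-install hook (the bootloader), so its pod logs (`kubectl logs -n airbyte <bootloader-pod>`) are the more likely place to find the root cause.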
Anna Bogo
11/04/2024, 4:43 PM
Gideon Stowell
11/04/2024, 6:18 PM
Thomas Shin
11/04/2024, 6:31 PM
Leo Salayog
11/04/2024, 6:31 PM
2024-11-04 18:28:29 platform > failures: [ {
"failureOrigin" : "source",
"failureType" : "system_error",
"internalMessage" : "The request failed due to an unknown error.",
"externalMessage" : "Something went wrong in the connector. See the logs for more details.",
"metadata" : {
"attemptNumber" : 0,
"jobId" : 327,
"from_trace_message" : true,
"connector_command" : "read"
},
How do I set up a response filter so that I can ignore these errors? I need info for:
Error Message Substring
Predicate
HTTP Codes
Action (ignore)
Error Message
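For context, those fields correspond to the low-code CDK's `HttpResponseFilter`, which attaches to a requester's error handler. A minimal sketch of a filter that ignores matching responses, assuming a declarative (Connector Builder / manifest.yaml) source; the URL, status codes, and substring are placeholders:

```yaml
requester:
  type: HttpRequester
  url_base: "https://api.example.com"   # placeholder
  error_handler:
    type: DefaultErrorHandler
    response_filters:
      - type: HttpResponseFilter
        action: IGNORE                            # skip instead of failing the sync
        http_codes: [500]                         # placeholder status codes to match
        error_message_contains: "unknown error"   # substring matched in the error message
        # predicate: "{{ 'error' in response }}"  # optional interpolated condition
        error_message: "Ignored a transient upstream error"   # message emitted instead
```

Any filter that matches turns the failing response into an ignored one, so the sync should no longer surface the `system_error` trace above for that case.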
Leo Salayog
11/04/2024, 6:54 PM
Ravi Nathwani
11/04/2024, 6:57 PM
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main c.z.h.HikariDataSource(close):349 HikariPool-1 - Shutdown initiated...
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main c.z.h.HikariDataSource(close):351 HikariPool-1 - Shutdown completed.
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main c.z.h.HikariDataSource(<init>):79 HikariPool-2 - Starting...
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main c.z.h.HikariDataSource(<init>):81 HikariPool-2 - Start completed.
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main i.a.c.i.b.IntegrationRunner(runInternal):224 Completed integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
2024-11-04 18:46:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - INFO main i.a.i.s.m.MssqlSource(main):577 completed source: class io.airbyte.integrations.source.mssql.MssqlSource
2024-11-04 18:46:04 INFO i.a.c.ConnectorMessageProcessor(updateConfigFromControlMessage):231 - Checking for optional control message...
2024-11-04 18:46:04 INFO i.a.c.ConnectorMessageProcessor(setOutput):176 - Writing catalog result to API...
2024-11-04 18:46:05 INFO i.a.c.ConnectorMessageProcessor(setOutput):180 - Finished writing catalog result to API.
2024-11-04 18:46:05 INFO i.a.c.ConnectorWatcher(saveConnectorOutput):162 - Writing output of b5ea17b1-f170-46dc-bc31-cc744ca984c1_4a1e274f-a661-441b-a3ce-8b08c0912ecd_0_discover to the doc store
2024-11-04 18:46:06 INFO i.a.c.ConnectorWatcher(markWorkloadSuccess):167 - Marking workload b5ea17b1-f170-46dc-bc31-cc744ca984c1_4a1e274f-a661-441b-a3ce-8b08c0912ecd_0_discover as successful
2024-11-04 18:46:06 INFO i.a.c.ConnectorWatcher(exitProperly):215 - Deliberately exiting process with code 0.
2024-11-04 18:46:06 INFO i.a.c.i.LineGobbler(voidCall):166 -
2024-11-04 18:46:06 INFO i.a.c.i.LineGobbler(voidCall):166 - ----- END DISCOVER -----
2024-11-04 18:46:06 INFO i.a.c.i.LineGobbler(voidCall):166 -
Diako
11/04/2024, 6:58 PM
david balli
11/04/2024, 8:08 PM
david balli
11/04/2024, 8:49 PM
Ethan Brown
11/04/2024, 10:28 PM
vijay vijay
11/04/2024, 11:15 PM
Jordi Crespo Guzman
11/05/2024, 12:04 AM
Huy Đỗ Văn
11/05/2024, 3:56 AM
```python
class BigQueryTableStream(IncrementalMixin, Stream):
    source_defined_cursor = False
    cursor_field = "updated_at"
    _cursor_value = ""
```
I got this error. How do I get the cursor that the user defined in the web app?
Caused by: io.temporal.failure.ApplicationFailure: message='Source defined cursor validation failed for stream: table_50mb1. Error: key: 'updated_at' of path: '[updated_at]' not found in schema,Source defined cursor validation failed for stream: table_data_20gb. Error: key: 'updated_at' of path: '[updated_at]' not found in schema,Source defined cursor validation failed for stream: table_test2. Error: key: 'updated_at' of path: '[updated_at]' not found in schema,Source defined cursor validation failed for stream: assignment. Error: key: 'updated_at' of path: '[updated_at]' not found in schema,Source defined cursor validation failed for stream: user_magazine_count. Error: key: 'updated_at' of path: '[updated_at]' not found in schema', type='io.airbyte.workers.exception.WorkerException', nonRetryable=false
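Since `source_defined_cursor = False`, the cursor the user picks in the web app travels through the configured catalog and arrives as the `cursor_field` argument of `read_records` in the Python CDK, so hardcoding `cursor_field = "updated_at"` at class level makes every stream advertise a default cursor that some schemas (per the error above) don't contain. A minimal sketch of the dynamic pattern; the `_query_rows` helper is hypothetical:

```python
from typing import Any, Iterable, List, Mapping, MutableMapping, Optional

from airbyte_cdk.models import SyncMode
from airbyte_cdk.sources.streams import IncrementalMixin, Stream


class BigQueryTableStream(IncrementalMixin, Stream):
    # No class-level cursor_field: with source_defined_cursor = False the
    # user chooses the cursor in the web app, and whatever field they pick
    # must exist in the stream's JSON schema to pass platform validation.
    source_defined_cursor = False
    primary_key = None

    def __init__(self) -> None:
        super().__init__()
        self._state: MutableMapping[str, Any] = {}

    @property
    def state(self) -> Mapping[str, Any]:
        return self._state

    @state.setter
    def state(self, value: Mapping[str, Any]) -> None:
        self._state = dict(value)

    def read_records(
        self,
        sync_mode: SyncMode,
        cursor_field: Optional[List[str]] = None,
        stream_slice: Optional[Mapping[str, Any]] = None,
        stream_state: Optional[Mapping[str, Any]] = None,
    ) -> Iterable[Mapping[str, Any]]:
        # cursor_field carries the user's choice from the configured
        # catalog, e.g. ["updated_at"]; it is None on full refresh.
        cursor = cursor_field[0] if cursor_field else None
        for row in self._query_rows(cursor, self._state.get(cursor)):  # hypothetical helper
            if cursor:
                self._state[cursor] = max(self._state.get(cursor, ""), row[cursor])
            yield row
```

Alternatively, keep a fixed cursor only for streams whose schema actually includes `updated_at`; the validation in the error fires because the field is advertised but missing from those schemas.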
Huy Đỗ Văn
11/05/2024, 4:00 AM
Ishan Anilbhai Koradiya
11/05/2024, 5:11 AM
Tom Montgomery
11/05/2024, 6:43 AM
Pramod Bhat
11/05/2024, 7:32 AM
Pramod Bhat
11/05/2024, 7:43 AM
Katalina Villamil
11/05/2024, 9:02 AM
Mor Iluz
11/05/2024, 9:46 AM
Duong Le
11/05/2024, 9:48 AM
Vitalik Badion
11/05/2024, 10:09 AM
11/05/2024, 10:09 AMhttps://{domain}.<http://chargebee.com/api/v2/plans?sort_by%5Basc%5D=updated_at&include_deleted=true&updated_at%5Bbetween%5D=%5B1641027600%2C+1643705999%5D|chargebee.com/api/v2/plans?sort_by%5Basc%5D=updated_at&include_deleted=true&updated_at%5Bbetween%5D=%5B1641027600%2C+1643705999%5D> failed with status code 400 and error message The API endpoint is incompatible with the product catalog version. You are calling product catalog 1.0 API endpoint but you are using product catalog 2.0
I guess it's related to the latest commit: https://github.com/airbytehq/airbyte/commit/7c6b528e1f057921d24b48faee4ec31a5782a145 & PR
We are still observing this issue for the plans and addons endpoints. Could you please help us understand whether this is a known issue?
Tom Montgomery
11/05/2024, 11:01 AM
abctl on an EC2 instance. I would like to migrate over to using Terraform to manage the instance. Are there any steps to do this?
user
11/05/2024, 11:33 AM
1. Configure GCS storage: Add the following to your `values.yaml` file:
```yaml
global:
  storage:
    type: "GCS"
    storageSecretName: airbyte-config-secrets
    bucket:
      log: airbyte-bucket
      state: airbyte-bucket
      workloadOutput: airbyte-bucket
    gcs:
      projectId: <project-id>
```
You'll also need to create a Kubernetes secret containing the GCP credentials for accessing GCS [Implementation Guide](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-external-logging).
2. Configure Cloud SQL for the database: To use Cloud SQL instead of the default Postgres, you'll need to disable the internal Postgres and configure the external database. Here's an example configuration for your `values.yaml`:
```yaml
postgresql:
  enabled: false
global:
  database:
    secretName: "airbyte-config-secrets"
    host: ""
    port: ""
    database: ""
    user: ""
    passwordSecretKey: ""
```
Make sure to create and configure your Cloud SQL instance separately and provide the necessary credentials in the Kubernetes secret [Implementation Guide](https://docs.airbyte.com/enterprise-setup/implementation-guide#configuring-the-airbyte-database).
3. Set up Terraform configuration: Create the necessary Terraform files in your project directory:
- `provider.tf`: Define the Airbyte provider (see the sketch after step 4)
- `main.tf`: Configure Airbyte resources
- `variables.tf`: Define variables, including credentials
You may also want to create separate modules for connections, destinations, and sources [API to Warehouse Tutorial](https://airbyte.com/tutorials/api-to-warehouse-basic-stack-with-airbyte).
4. Initialize and apply Terraform: Run the following commands to set up your Airbyte resources [API to Warehouse Tutorial](https://airbyte.com/tutorials/api-to-warehouse-basic-stack-with-airbyte):
```
terraform init
terraform plan
terraform apply
```
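A minimal `provider.tf` sketch for step 3, assuming the `airbytehq/airbyte` Terraform provider with its `server_url`/`bearer_auth` settings; the URL and token variable are placeholders:

```hcl
terraform {
  required_providers {
    airbyte = {
      source = "airbytehq/airbyte"
    }
  }
}

variable "airbyte_api_token" {
  type      = string
  sensitive = true
}

# Point server_url at your Airbyte instance's public API; the token is
# whatever credential your deployment uses (both are placeholders here).
provider "airbyte" {
  server_url  = "http://localhost:8000/api/public/v1"  # placeholder URL
  bearer_auth = var.airbyte_api_token
}
```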
Remember to adjust the configurations to match your specific GCP project settings and requirements. Also, ensure that you've created the necessary GCP resources (GKE cluster, Cloud SQL instance, GCS buckets) before applying the Terraform configuration. For a more detailed example of setting up Airbyte on GKE with external storage and database, you can refer to this [GitHub discussion](https://github.com/airbytehq/airbyte/discussions/47256) which provides additional Terraform configuration snippets.
Rafael Anizio
11/05/2024, 11:34 AM