# ask-community-for-troubleshooting
  • u

    user

    09/09/2024, 3:24 PM
    #45340 [Feature Request] Support Associations APIs (HubSpot) natively with incremental mode (append+dedup) New discussion created by carlos-yolo Hello! As the title says, I was wondering if this could be done. It would help us avoid having to build a custom connector and maintain it in the future (cf. https://developers.hubspot.com/beta-docs/reference/api/crm/associations/association-details). Please let me know if I can help somehow. Have a great day! Carlos airbytehq/airbyte
  • s

    Steven Herweijer

    09/09/2024, 8:09 AM
    I set up a new Postgres-to-Postgres connection, but I'm getting this error:
    Copy code
    2024-09-09 08:07:05 destination > ERROR main i.a.c.i.b.AirbyteExceptionHandler(uncaughtException):31 Something went wrong in the connector. See the logs for more details. org.postgresql.util.PSQLException: ERROR: cannot open relation "snip_raw__stream_user_organizationplatformprofile_categories_"
      Detail: This operation is not supported for indexes.
    What can I do about this?
  • b

    ben amor elyes

    09/09/2024, 3:43 PM
    any updates on this error?
  • t

    Thambz Z

    09/09/2024, 3:43 PM
    Hi Team, I'm encountering an issue when deploying through Helm on AWS EKS. The error I receive is:
    Copy code
    Caused by: java.lang.RuntimeException: Cannot publish to S3: Unable to load AWS credentials from any provider in the chain: 
    [EnvironmentVariableCredentialsProvider: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY)), 
    SystemPropertiesCredentialsProvider: Unable to load AWS credentials from Java system properties (aws.accessKeyId and aws.secretKey), 
    WebIdentityTokenCredentialsProvider: You must specify a value for roleArn and roleSessionName, 
    com.amazonaws.auth.profile.ProfileCredentialsProvider@7a6aa30b: profile file cannot be null, 
    com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper@423612fa: Unauthorized (Service: null; Status Code: 401; Error Code: null; Request ID: null; Proxy: null)]
    Here is the relevant section of my `values.yaml`:
    Copy code
    global: 
      serviceAccountName: "dev-sa"
      storage:
        type: s3
        bucket:
          log: dev-s3
          state: dev-s3
          workloadOutput: dev-s3  
        s3:
          region: us-east-2
          authenticationType: "instanceProfile"
    App Version: 0.63.6, EKS Version: 1.30. Any help or advice on resolving this issue would be greatly appreciated!
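    For reference, a minimal sketch of what an IRSA-based setup can look like with this chart. It assumes the `dev-sa` service account is annotated with an IAM role that can reach the `dev-s3` bucket; the role ARN and the `serviceAccount` block below are placeholders/assumptions, not a confirmed fix:
    Copy code
    global:
      serviceAccountName: "dev-sa"
      storage:
        type: s3
        bucket:
          log: dev-s3
          state: dev-s3
          workloadOutput: dev-s3
        s3:
          region: us-east-2
          authenticationType: instanceProfile
    serviceAccount:
      create: true
      name: "dev-sa"
      annotations:
        # Hypothetical role ARN - replace with the IAM role bound to this
        # service account via IRSA (IAM Roles for Service Accounts)
        eks.amazonaws.com/role-arn: "arn:aws:iam::123456789012:role/airbyte-dev-s3-access"
    If neither a node instance profile nor an IRSA role is visible to the pod, the AWS SDK runs out of providers, which is what the credential-chain error above shows.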
  • p

    Pranay Mule

    09/09/2024, 4:34 PM
    Hello #C021JANJ6TY, I have an Airbyte connection with Klaviyo as the source and BigQuery as the destination. I've noticed that the 'airbyte_internal' tables keep accumulating data with each sync and don't delete old data. How can I manage the accumulation of data in 'airbyte_internal' to avoid escalating storage costs? Are there best practices or configurations to ensure old data is removed or managed effectively? Thanks in advance for any insights or suggestions!
  • m

    Matias Miranda

    09/05/2024, 10:41 AM
    Hi everyone. I have Airbyte running in GKE and I've upgraded from v0.60.1 to v0.64.1 using https://artifacthub.io/packages/helm/airbyte/airbyte, and now the connections are stuck. The syncs sit in Starting status and hang there for hours.
  • n

    Nihal V Velpula

    09/09/2024, 6:41 PM
    Hello everyone, I'm trying to start an Airbyte instance locally on Ubuntu. Everything is working except that I'm unable to get credentials:
    Copy code
    abctl local credentials
    INFO Using Kubernetes provider: Provider: kind Kubeconfig: /home/nihal/.airbyte/abctl/abctl.kubeconfig Context: kind-airbyte-abctl
    ERROR unable to inspect container: Error response from daemon: No such container: airbyte-abctl-control-plane
    The airbyte-abctl-control-plane container is up and running, so I can't figure out why we're getting this error. Could someone please help me out?
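    (A few hedged checks that may help narrow this down, assuming abctl and Docker run on the same host and user: abctl looks for the container through whatever Docker endpoint its client resolves, so a context mismatch - rootless Docker, Docker Desktop, or a remote DOCKER_HOST - can make a running container appear missing.)
    Copy code
    # Is the kind node container visible under the current Docker context?
    docker ps --format '{{.Names}}' | grep airbyte-abctl-control-plane
    # List Docker contexts and check which endpoint is current
    docker context ls
    # A non-empty DOCKER_HOST pointing elsewhere would also explain "No such container"
    echo "$DOCKER_HOST"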
  • u

    user

    09/09/2024, 6:46 PM
    #45346 S3 storage set-up failing New discussion created by KCAmplify Hello, I followed the set-up instructions here: https://docs.airbyte.com/deploying-airbyte/integrations/storage What I wanted was for Airbyte to not store anything new on the server (I noticed the storage usage kept going up), to use an external database to manage state (I've successfully set that up), and to use S3 for anything else it needs to store (such as logs). The instructions above result in the following error message. How can I set it up to use an external S3 bucket for everything other than state, which can stay in the external database? Thank you.
    unable to install airbyte chart: unable to install helm: cannot patch "airbyte-abctl-server" with kind Deployment: Deployment.apps "airbyte-abctl-server" is invalid: [spec.template.spec.containers[0].env[44].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), spec.template.spec.containers[0].env[45].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')] && cannot patch "airbyte-abctl-worker" with kind Deployment: Deployment.apps "airbyte-abctl-worker" is invalid: [spec.template.spec.containers[0].env[46].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), spec.template.spec.containers[0].env[47].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')] && cannot patch "airbyte-abctl-workload-launcher" with kind Deployment: Deployment.apps "airbyte-abctl-workload-launcher" is invalid: [spec.template.spec.containers[0].env[57].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), spec.template.spec.containers[0].env[58].valueFrom.secretKeyRef.name: Invalid value: "": a lowercase RFC 1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. '<http://example.com|example.com>', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')]
    airbytehq/airbyte
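    The empty `secretKeyRef.name` values in the error suggest the chart is being told to use credential-based S3 auth but was never given the name of the Kubernetes secret holding the keys. A hedged sketch of the two usual shapes of the storage block (secret name, bucket, and region below are placeholders; the exact secret keys are listed on the storage docs page linked above):
    Copy code
    global:
      storage:
        type: s3
        # Option A: credentials stored in a pre-created Kubernetes secret
        secretName: airbyte-config-secrets   # assumed name; the secret must actually exist
        bucket:
          log: my-airbyte-bucket
          state: my-airbyte-bucket
          workloadOutput: my-airbyte-bucket
        s3:
          region: us-east-1
          authenticationType: credentials
          # Option B: no secret at all - rely on an instance profile / IRSA role instead
          # authenticationType: instanceProfile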
  • t

    Thiago

    09/09/2024, 7:49 PM
    Hello #C021JANJ6TY, I'm trying to get a list of all source configurations. Since this is not available directly, I'm using:
    Copy code
    curl --request POST --url 'http://localhost:8001/api/v1/source_definitions/list_latest' --header 'accept: application/json'
    to get all sources and then wanted to use:
    Copy code
    curl -X POST http://localhost:8001/api/v1/source_definition_specifications/get \
    -H 'Content-Type: application/json' \
    -d '{
      "sourceDefinitionId": "<id>"
    }'
    to get each configuration, but when I run the above I get:
    Copy code
    curl -X POST http://localhost:8001/api/v1/source_definition_specifications/get \
    -H 'Content-Type: application/json' \
    -d '{
      "sourceDefinitionId": "9da77001-af33-4bcd-be46-6252bf9342b9"
    }'
    {"message":"Internal Server Error: null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException","\tat java.base/java.util.Objects.requireNonNull(Objects.java:233)","\tat java.base/java.util.ImmutableCollections$List12.<init>(ImmutableCollections.java:563)","\tat java.base/java.util.List.of(List.java:937)","\tat io.airbyte.data.services.impls.jooq.WorkspaceServiceJooqImpl.getStandardWorkspaceNoSecrets(WorkspaceServiceJooqImpl.java:108)","\tat io.airbyte.config.persistence.version_overrides.ConfigurationDefinitionVersionOverrideProvider.getOrganizationId(ConfigurationDefinitionVersionOverrideProvider.java:48)","\tat io.airbyte.config.persistence.version_overrides.ConfigurationDefinitionVersionOverrideProvider.getScopedConfig(ConfigurationDefinitionVersionOverrideProvider.java:56)","\tat io.airbyte.config.persistence.version_overrides.ConfigurationDefinitionVersionOverrideProvider.getOverride(ConfigurationDefinitionVersionOverrideProvider.java:82)","\tat io.airbyte.config.persistence.ActorDefinitionVersionHelper.getSourceVersionWithOverrideStatus(ActorDefinitionVersionHelper.java:97)","\tat io.airbyte.config.persistence.ActorDefinitionVersionHelper.getSourceVersion(ActorDefinitionVersionHelper.java:119)","\tat io.airbyte.config.persistence.ActorDefinitionVersionHelper.getSourceVersion(ActorDefinitionVersionHelper.java:131)","\tat io.airbyte.commons.server.handlers.ConnectorDefinitionSpecificationHandler.getSourceDefinitionSpecification(ConnectorDefinitionSpecificationHandler.java:88)","\tat io.airbyte.server.apis.SourceDefinitionSpecificationApiController.lambda$getSourceDefinitionSpecification$0(SourceDefinitionSpecificationApiController.java:38)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:28)","\tat io.airbyte.server.apis.SourceDefinitionSpecificationApiController.getSourceDefinitionSpecification(SourceDefinitionSpecificationApiController.java:38)","\tat io.airbyte.server.apis.$SourceDefinitionSpecificationApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)","\tat io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4325)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:271)","\tat io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:39)","\tat io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:490)","\tat io.micronaut.http.server.RouteExecutor.lambda$callRoute$6(RouteExecutor.java:467)","\tat io.micronaut.core.execution.ExecutionFlow.lambda$async$1(ExecutionFlow.java:87)","\tat io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:211)","\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)","\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)","\tat java.base/java.lang.Thread.run(Thread.java:1583)"],"rootCauseExceptionStack":[]}%
    any idea?
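    (One hedged observation: the trace dies in `WorkspaceServiceJooqImpl.getStandardWorkspaceNoSecrets`, which points at a missing workspace scope rather than a bad definition id. The request body for this endpoint can also carry a `workspaceId`; the value below is a placeholder - you can list your workspace ids via the `/api/v1/workspaces/list` endpoint.)
    Copy code
    curl -X POST http://localhost:8001/api/v1/source_definition_specifications/get \
    -H 'Content-Type: application/json' \
    -d '{
      "sourceDefinitionId": "9da77001-af33-4bcd-be46-6252bf9342b9",
      "workspaceId": "<your-workspace-uuid>"
    }'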
  • b

    ben amor elyes

    09/09/2024, 7:49 PM
    Hello, my Airbyte workload-launcher and server log the following error, but I can still access the platform. What does this error mean, and how can it be fixed, please?
    Copy code
    ERROR i.m.h.s.RouteExecutor(logException):282 - Unexpected error occurred: Cannot invoke "io.micronaut.web.router.RouteMatch.isAnnotationPresent(java.lang.Class)" because "routeMatch" is null
  • p

    Peter

    09/09/2024, 7:58 PM
    Hey team. I'm trying to install the Airbyte CLI on Fedora/Debian, but I'm getting that the command is not found. Besides Docker, do I need to have another component installed?
  • j

    Jean Olivier Butron

    09/09/2024, 8:54 PM
    Hi, I'm trying to connect Airbyte with Kafka as a destination. Using the Airbyte UI I get this error message. Could you help me with this, please?
  • c

    Clinton Berry

    09/09/2024, 11:57 PM
    I have a weird one. I'm on self-hosted Airbyte trying to add an S3 destination. I want to use the role-based authentication method, but the docs ask me to set an environment variable in the worker: https://docs.airbyte.com/integrations/destinations/s3#authentication-option-1-using-an-iam-role-most-secure The variable it asks me to set is
    AWS_ASSUME_ROLE_EXTERNAL_ID="{your-external-id}"
    and if I run the destination manually with a docker run command and include it as a variable, the check passes. But as far as I can see there is no way in the user interface to add a custom environment variable to a destination. The docs make it sound like it is super obvious... so I am confused. I am on the latest version (installed today).
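    (As noted above, there doesn't appear to be a per-destination env-var field in the UI; the variable has to be injected on the platform side so the connector containers inherit it. A rough sketch for a Helm install, assuming the chart's `extraEnv` hook on the worker; for a docker-compose install the equivalent is adding the variable to the worker service's environment:)
    Copy code
    worker:
      extraEnv:
        # Hypothetical placement - the goal is for the process that launches the
        # destination container to see this variable
        - name: AWS_ASSUME_ROLE_EXTERNAL_ID
          value: "your-external-id"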
  • l

    Leandro Roubert

    09/04/2024, 5:29 PM
    Hi, I just migrated to Airbyte 0.64.1, but when running a job the orchestrator-repl-job-* pods cannot be scheduled on a node. It seems the k8s tolerations we use are not being propagated to this pod, even though we have set them in the worker and job sections of the Helm chart values. Does anyone know what this could be?
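    (A hedged note for anyone hitting the same thing: the orchestrator-repl-job pods are launched as job pods, so they take their scheduling settings from the job-level values rather than from the worker deployment. Something along these lines - the exact keys can vary by chart version, so treat it as a sketch:)
    Copy code
    global:
      jobs:
        kube:
          tolerations:
            - key: "airbyte"
              operator: "Equal"
              value: "jobs"
              effect: "NoSchedule"
          nodeSelector:
            pool: airbyte-jobs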
  • t

    Tonja Rand

    09/10/2024, 8:15 AM
    Hi hi, hope all is well! A pull request was merged (https://github.com/airbytehq/airbyte/pull/45116) and since then I have had trouble with my Airbyte connectors for Shopify. What can I do to solve it? Thank you in advance for the help.
  • u

    user

    09/10/2024, 8:15 AM
    #45355 [source-metabase] Add API Key auth New discussion created by kempspo The preferred way of authenticating to the Metabase API is via API keys rather than the currently implemented way (user login / session token). airbytehq/airbyte
  • h

    Hari Gudla

    09/10/2024, 8:47 AM
    For MSSQL CDC replication onto a target Snowflake warehouse, I noticed a field "_ab_cdc_event_serial_no" in the SCD table. What is the purpose of this field? kapa.ai could not get any answer from the documentation base.
  • a

    Ahmed Hamid

    09/10/2024, 9:07 AM
    Hey everyone, I’m trying to understand the lookback window in the Google Analytics 4 connector. My understanding is that if I set the lookback window to 7 days and I load data on September 7th, I should only see data from September 1st and earlier—nothing after September 1st should be included. However, I’m still seeing data from September 6th, even though the window should have cut off at September 1st. Am I misunderstanding how the lookback window is supposed to work, or is this an issue with the connector? Any clarification would be appreciated!
  • r

    Rytis Zolubas

    09/10/2024, 9:15 AM
    Hello, when running a sync I get an error:
    Failed to read the output of a successful workload e6a9a525-6de9-4126-b80e-06acb70eeb92_22725_1_check
  • m

    Mohamed Necib

    09/10/2024, 10:22 AM
    Hello everyone, while trying to sync to a destination I get the message below. I already pushed this update but it doesn't seem to work. I use Airbyte on DigitalOcean:
    Copy code
    2024-09-10 10:14:15 platform > airbyte/source-shopify:2.5.0 not found locally. Attempting to pull the image...
    2024-09-10 10:14:15 platform > Image does not exist.
    2024-09-10 10:14:15 platform > Unexpected error while checking connection:
  • u

    user

    09/10/2024, 12:23 PM
    #45358 Unable to install Airbyte using abctl on a local Windows machine New discussion created by Jackolas126 We are currently investigating whether Airbyte can be used for our goals, but we are running into a bit of trouble when trying to install it on a local Windows machine. Multiple errors are showing and it seems that it is unable to pull the images. We are unable to figure out what is causing the error. (Also, we don't have many people who are familiar with Linux and Docker; we are learning as we go.)
    << Encountered an issue deploying Airbyte: Pod: airbyte-db-0.17f3dfee3e2d446d Reason: Failed Message: Failed to pull image "airbyte/db:0.64.3": failed to pull and unpack image "docker.io/airbyte/db:0.64.3": failed to resolve reference "docker.io/airbyte/db:0.64.3": failed to do request: Head "https://registry-1.docker.io/v2/airbyte/db/manifests/0.64.3": tls: failed to verify certificate: x509: certificate signed by unknown authority Count: 1 DEBUG Encountered an issue deploying Airbyte: Pod: airbyte-db-0.17f3dfee3e2f9077 Reason: Failed Message: Error: ErrImagePull Count: 1 DEBUG Pulling image "airbyte/bootloader:0.64.3" DEBUG Encountered an issue deploying Airbyte: Pod: airbyte-minio-0.17f3dfee417bcac6 Reason: Failed Message: Failed to pull image "minio/minio:RELEASE.2023-11-20T22-40-07Z": failed to pull and unpack image "docker.io/minio/minio:RELEASE.2023-11-20T22-40-07Z": failed to resolve reference "docker.io/minio/minio:RELEASE.2023-11-20T22-40-07Z": failed to do request: Head "https://registry-1.docker.io/v2/minio/minio/manifests/RELEASE.2023-11-20T22-40-07Z": tls: failed to verify certificate: x509: certificate signed by unknown authority Count: 1 >>
    Maybe someone can point us in the right direction. We already tested Docker by running the "Hello world" image and that worked without any problems. airbytehq/airbyte
  • c

    Christophe Di Prima

    09/10/2024, 12:24 PM
    Hi there! I think people have already asked, but is there a tool to create a connector out of an OpenAPI definition? If not, is there any active development in progress on this?
  • r

    Rytis Zolubas

    09/10/2024, 1:42 PM
    How can I see what is happening inside the worker? The job has been running for 5 hours now for a simple one-day Shopify orders sync...
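    (For a Kubernetes install, the sync itself runs in dedicated job pods, so their logs are usually more informative than the worker's. A rough sketch - the namespace, pod, and deployment names are assumptions, adjust to your release:)
    Copy code
    # Find the job pods spawned for the running sync (orchestrator / source / destination)
    kubectl -n airbyte get pods --sort-by=.metadata.creationTimestamp
    # Tail the replication orchestrator pod for the current attempt
    kubectl -n airbyte logs -f <orchestrator-repl-job-pod-name>
    # The worker itself can be tailed for scheduling/launch problems
    kubectl -n airbyte logs -f deploy/airbyte-worker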
  • m

    Matias Miranda

    09/10/2024, 3:08 PM
    Hi! Does anyone else have this problem with the GitHub source? I'm trying to look at the logs but they aren't clear enough.
    Copy code
    replication-orchestrator > failures: [ {
      "failureOrigin" : "source",
      "failureType" : "system_error",
      "internalMessage" : "Conflict.",
      "externalMessage" : "Git Repository is empty.",
      "metadata" : {
        "attemptNumber" : 1,
        "jobId" : 429,
        "from_trace_message" : true,
        "connector_command" : "read"
      },
      "stacktrace" : "Traceback (most recent call last):\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/abstract_source.py\", line 133, in read\n    yield from self._read_stream(\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/abstract_source.py\", line 239, in _read_stream\n    for record_data_or_message in record_iterator:\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/core.py\", line 192, in read\n    for record_data_or_message in records:\n  File \"/airbyte/integration_code/source_github/streams.py\", line 307, in read_records\n    for record in super().read_records(\n  File \"/airbyte/integration_code/source_github/streams.py\", line 130, in read_records\n    yield from super().read_records(stream_slice=stream_slice, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 335, in read_records\n    yield from self._read_pages(\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 387, in _read_pages\n    request, response = self._fetch_next_page(stream_slice, stream_state, next_page_token)\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 452, in _fetch_next_page\n    request, response = self._http_client.send_request(\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http_client.py\", line 382, in send_request\n    response: requests.Response = self._send_with_retry(\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http_client.py\", line 228, in _send_with_retry\n    response = backoff_handler(rate_limit_backoff_handler(user_backoff_handler))(request, request_kwargs, log_formatter=log_formatter, exit_on_rate_limit=exit_on_rate_limit)  # type: ignore # mypy can't infer that backoff_handler wraps _send\n  File \"/usr/local/lib/python3.10/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http_client.py\", line 334, in _send\n    raise UserDefinedBackoffException(\nairbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Conflict.\n",
      "timestamp" : 1725980299262,
      "streamDescriptor" : {
        "name" : "commits"
      }
    }, {
      "failureOrigin" : "source",
      "failureType" : "config_error",
      "externalMessage" : "During the sync, the following streams did not sync successfully: commits: AirbyteTracedException('Conflict.')",
      "metadata" : {
        "attemptNumber" : 1,
        "jobId" : 429,
        "from_trace_message" : true,
        "connector_command" : "read"
      },
      "stacktrace" : "Traceback (most recent call last):\n  File \"/airbyte/integration_code/main.py\", line 8, in <module>\n    run()\n  File \"/airbyte/integration_code/source_github/run.py\", line 17, in run\n    launch(source, sys.argv[1:])\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 234, in launch\n    for message in source_entrypoint.run(parsed_args):\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 122, in run\n    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 164, in read\n    for message in self.source.read(self.logger, config, catalog, state):\n  File \"/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/abstract_source.py\", line 177, in read\n    raise AirbyteTracedException(message=error_message, failure_type=FailureType.config_error)\nairbyte_cdk.utils.traced_exception.AirbyteTracedException: None\n",
      "timestamp" : 1725980299265
    }, {
      "failureOrigin" : "source",
      "internalMessage" : "Source process exited with non-zero exit code 1",
      "externalMessage" : "Something went wrong within the source connector",
      "metadata" : {
        "attemptNumber" : 1,
        "jobId" : 429,
        "connector_command" : "read"
      },
      "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process exited with non-zero exit code 1\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:378)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
      "timestamp" : 1725980300572
    } ]
  • e

    Euan Blackledge

    09/10/2024, 3:08 PM
    Hey folks, we were trying to upgrade Airbyte today to the latest version from 0.50.55. However, the upgrade failed with the error `column actor.default_version_id does not exist`. Has anyone encountered or resolved this? We are still encountering it after downgrading back to our previous version.
  • t

    Tigran Zalyan

    09/06/2024, 5:34 PM
    Hey guys! Could anyone please share a good setup guide for Airbyte + GKE + Identity-Aware Proxy? I'm at the point where I'm struggling to set up a basic ingress and access Airbyte via a public IP. Thanks in advance!
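    (Not a full guide, but a hedged sketch of the common GKE pattern: expose the webapp Service through a GCE Ingress and attach a BackendConfig that enables IAP. The host, secret, and service names below are placeholders - check `kubectl get svc` for the real webapp service name, and the OAuth client secret must contain `client_id` / `client_secret`:)
    Copy code
    apiVersion: cloud.google.com/v1
    kind: BackendConfig
    metadata:
      name: airbyte-iap
    spec:
      iap:
        enabled: true
        oauthclientCredentials:
          secretName: iap-oauth-client
    ---
    apiVersion: networking.k8s.io/v1
    kind: Ingress
    metadata:
      name: airbyte-ingress
      annotations:
        kubernetes.io/ingress.class: "gce"
    spec:
      rules:
        - host: airbyte.example.com
          http:
            paths:
              - path: /
                pathType: Prefix
                backend:
                  service:
                    name: airbyte-airbyte-webapp-svc   # assumed service name
                    port:
                      number: 80
    The webapp Service also needs a cloud.google.com/backend-config: '{"default": "airbyte-iap"}' annotation so the load balancer backend picks up the IAP config, and it generally has to be NodePort or use NEGs (cloud.google.com/neg: '{"ingress": true}') for the GCE Ingress to route to it.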
  • j

    Juan Guillen

    09/06/2024, 6:59 PM
    Hi there, we are going to update the Postgres DB version of one of our sources, and I wonder if anyone has had issues with the incremental syncs afterwards? Thanks!
  • r

    Rytis Zolubas

    09/10/2024, 3:39 PM
    what is happening with the pods? shouldn't they be killed?
  • k

    Kevin Wang

    09/10/2024, 4:04 PM
    We need to use a cursor of 'time - 2 days' to get late-arriving data. Do we need to use a custom connector to do this? I'm having this issue on Mixpanel -> BigQuery. I cannot change the source-defined cursor of `time` to something like `mp_processing_time_ms`, which would be a better cursor and would not require me to recheck 'older' dates. I found @Roberto Walter's thread, but that requires running open-source and not Cloud?