# ask-community-for-troubleshooting

    Madison Mae

    07/06/2023, 5:31 PM
    I have a connection that has been failing every time and I can't seem to figure out the issue. All my other Stripe connectors are working, but not this one, and I can't figure out why. Can I get someone to take a look for me on the backend?

    Richard Anthony Hein (Auxon)

    07/06/2023, 5:40 PM
    Hi! I am getting this error trying to configure an MSSQL connection: Configuration check failed. Could not connect with provided configuration. Error: Cannot invoke "com.fasterxml.jackson.databind.JsonNode.asText()" because the return value of "com.fasterxml.jackson.databind.JsonNode.get(String)" is null. Any ideas?

    Eduardo Aviles

    07/06/2023, 6:55 PM
    Hi, I'm using the Elasticsearch source connector to get data from OpenSearch. I don't know if there is a way to use it for OpenSearch; I'm currently getting this error:
    Copy code
    Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [<https://vpc-bi-sp-opensearch-bwb5kptvvi6tzbfkyqxhetnquy.us-east-1.es.amazonaws.com>], URI [/_search/scroll], status line [HTTP/1.1 404 Not Found]
    {"error":{"root_cause":[{"type":"search_context_missing_exception","reason":"No search context found for id [65883]"},{"type":"search_context_missing_exception","reason":"No search context found for id [65882]"},{"type":"search_context_missing_exception","reason":"No search context found for id [62223]"},{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [65883]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [65882]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [62223]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}}],"caused_by":{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}},"status":404}
    		at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326)
    		at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296)
    		at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270)
    		at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1621)
    		... 25 more
    Has anyone tried this before?

    Fariha Baloch

    07/06/2023, 7:15 PM
    I am trying to use airbyte-local-cli to connect a GHE server to an S3 bucket. How would I change these commands so that the destination is S3 rather than Faros?
    Copy code
    ./airbyte-local.sh \
    --src 'farosai/airbyte-faros-feeds-source' \
    --src.feed_cfg.feed_name 'github-feed' \
    --src.feed_cfg.feed_path 'vcs/github-feed' \
    --src.feed_cfg.auth_cfg '{"auth":"token", "personal_access_token":"<my_token>"}' \
    --src.feed_cfg.repos_query_mode '{"query_mode":"GitHubOrg"}' \
     --src.feed_cfg.org_repo_list '["org/repo"]' \
    --src.feed_cfg.cutoff_days '90' \
    --src.feed_args '["--github-api-url", "<github api link>", "--debug"]' \
    --dst.edition_configs.hasura_url '<http://host.docker.internal:8080/>'  \
    --dst.edition_configs.hasura_admin_secret 'admin' \
    --dst.edition_configs.edition 'community' \
    --dst 'farosai/airbyte-faros-destination' \
    --debug
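    One possible rewrite with an S3 destination (a hedged sketch, not a verified command: it assumes the wrapper accepts a plain Airbyte destination image, and the --dst keys mirror the airbyte/destination-s3 spec fields as I recall them, so double-check them against the connector's current spec; the bucket, region and credentials are placeholders):
    Copy code
    ./airbyte-local.sh \
    --src 'farosai/airbyte-faros-feeds-source' \
    --src.feed_cfg.feed_name 'github-feed' \
    --src.feed_cfg.feed_path 'vcs/github-feed' \
    --src.feed_cfg.auth_cfg '{"auth":"token", "personal_access_token":"<my_token>"}' \
    --src.feed_cfg.repos_query_mode '{"query_mode":"GitHubOrg"}' \
    --src.feed_cfg.org_repo_list '["org/repo"]' \
    --src.feed_cfg.cutoff_days '90' \
    --src.feed_args '["--github-api-url", "<github api link>", "--debug"]' \
    --dst 'airbyte/destination-s3' \
    --dst.s3_bucket_name '<my-bucket>' \
    --dst.s3_bucket_path 'github-feed' \
    --dst.s3_bucket_region 'us-east-1' \
    --dst.access_key_id '<aws-access-key-id>' \
    --dst.secret_access_key '<aws-secret-access-key>' \
    --dst.format '{"format_type": "JSONL"}' \
    --debug
    The --dst.edition_configs.* flags are dropped here because they only apply to the farosai/airbyte-faros-destination image.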

    Patrick Kompier

    07/06/2023, 7:40 PM
    Hello, I'm running Airbyte locally with Docker. When I want to add an S3 destination I'm getting an authentication error that the AWS_ACCESS_KEY and AWS_SECRET_ACCESS_KEY are not set. I've added these values in the docker-compose.yaml file, added the values in the .env.dev file, and rebuilt the Docker containers. The access and secret key are working; I've checked them separately with the AWS CLI. What am I doing wrong?
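    For the S3 destination the credentials are normally entered in the destination's setup form (the Access Key ID / Secret Access Key fields) rather than read from platform environment variables, so it is worth checking the connector config itself first. To verify whether the .env / docker-compose.yaml edits actually reached the containers, a hedged quick check (the airbyte-worker / airbyte-server container names below are the docker-compose defaults and may differ in your setup):
    Copy code
    # Did docker compose resolve the variables from .env / docker-compose.yaml?
    docker compose config | grep -n "AWS_"

    # Are the variables actually present inside the running containers?
    docker exec airbyte-worker env | grep "AWS_"
    docker exec airbyte-server env | grep "AWS_"

    # Changes to .env only take effect after recreating the containers
    docker compose up -d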

    Gabriel Martelloti

    07/06/2023, 8:57 PM
    Hey guys. Can someone explain to me how the env variable
    TEMPORAL_HISTORY_RETENTION_IN_DAYS
    works in k8s when installing Airbyte through Helm? I tried setting it for airbyte-cron, airbyte-worker and airbyte-server, but it seems the deletion job is never triggered.
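    A hedged sketch of one way to pass that variable through the Helm chart, assuming your chart version exposes extraEnv for the relevant components (inspect the chart values first); whether the retention change actually takes effect also depends on the Temporal namespace picking it up, so treat this as a starting point rather than a confirmed fix:
    Copy code
    # See which env hooks your chart version actually exposes
    helm show values airbyte/airbyte | grep -n -B2 -A4 "extraEnv"

    # Set the variable on the components you want (values passed as strings)
    helm upgrade --install airbyte airbyte/airbyte -n airbyte \
      --set-string 'worker.extraEnv[0].name=TEMPORAL_HISTORY_RETENTION_IN_DAYS' \
      --set-string 'worker.extraEnv[0].value=7' \
      --set-string 'server.extraEnv[0].name=TEMPORAL_HISTORY_RETENTION_IN_DAYS' \
      --set-string 'server.extraEnv[0].value=7'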

    Gabriel Levine

    07/06/2023, 8:59 PM
    I receive a message “Failed to start sync: A sync is already running” when attempting to trigger a sync despite the previous job being marked as Succeeded and no pods running for that connection. I believe this may be related to this error:
    Copy code
    io.airbyte.workers.internal.exception.StreamStatusException: Invalid stream status transition to COMPLETE.
    I’m running Airbyte OSS 0.50.6 on GKE
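    One way to clear the phantom "running" job is to cancel it explicitly through the Configuration API (a hedged sketch: the service name and port below are the usual Helm-chart defaults and may differ in your release, and the job id is a placeholder taken from the UI / jobs list):
    Copy code
    # Reach the internal API from your machine
    kubectl -n airbyte port-forward svc/airbyte-airbyte-server-svc 8001:8001 &

    # Cancel the job the platform still thinks is running
    curl -X POST http://localhost:8001/api/v1/jobs/cancel \
      -H "Content-Type: application/json" \
      -d '{"id": <stuck-job-id>}'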

    Jason Sorensen

    07/06/2023, 9:16 PM
    Hello - I just upgraded our connectors and our Redshift integration is failing to parse a cursor with this error:
    "stacktrace" : "java.time.format.DateTimeParseException: Text '2023-07-06T17:50:24.296000Z' could not be parsed, unparsed text found at index 26
    It looks like the 'Z' is causing this error. I did not set up this Airbyte instance and only use the web UI. Is there a way to fix this cursor value so that the connection can run?

    Chidambara Ganapathy

    07/07/2023, 6:11 AM
    Hi Team,

    Chidambara Ganapathy

    07/07/2023, 6:15 AM
    Hi Team, I am trying to add Shopify as a source but I am getting this error: 2023-07-07 05:45:12 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):450 - failures: [ { "failureOrigin" : "source", "failureType" : "system_error", "internalMessage" : "402 Client Error: Payment Required for url: https://nock-4618.myshopify.com/admin/api/2022-10/articles.json?limit=250&order=id+asc&since_id=0", "externalMessage" : "Something went wrong in the connector. See the logs for more details.", "metadata" : { "attemptNumber" : 2, "jobId" : 24172, "from_trace_message" : true, "connector_command" : "read" } Can you please check and let me know if there is any known issue or how it can be sorted? Thanks

    Haki Dere

    07/07/2023, 10:49 AM
    We are getting timeouts after 5 minutes when trying to connect an Oracle source to BigQuery. This is because the check_schema Ajax POST request is timing out. Is there a way to work around this? Airbyte version 0.50.5, Helm deployment on GKE.
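    The 5-minute cutoff looks like an HTTP idle timeout in front of the discover/check request rather than the check itself failing. A very hedged thing to try, assuming your chart version exposes server env overrides and that the HTTP_IDLE_TIMEOUT variable is supported by your Airbyte version (please confirm both in the docs for your release before relying on this):
    Copy code
    helm upgrade --install airbyte airbyte/airbyte -n airbyte \
      --set-string 'server.extraEnv[0].name=HTTP_IDLE_TIMEOUT' \
      --set-string 'server.extraEnv[0].value=20m'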

    Lily Tian

    07/07/2023, 2:26 PM
    Hi there, I am trying to sync my connection, but I got this error; the log is not giving me any more details on where it failed or what the actual error is... any help?
    Copy code
    2023-07-07 14:17:24 INFO i.a.w.p.KubePodProcess(close):797 - (pod: airbyte / normalization-normalize-468-0-ycujg) - Closed all resources for pod
    2023-07-07 14:17:24 INFO i.a.w.n.DefaultNormalizationRunner(close):196 - Terminating normalization process...
    2023-07-07 14:17:24 ERROR i.a.w.g.DefaultNormalizationWorker(run):86 - Normalization failed for job

    ramiro barraco

    07/07/2023, 3:07 PM
    Hi, I'm trying to use Airbyte with the S3 connectors but I'm getting:
    Copy code
    Configuration check failed 
    AirbyteTracedException('Access Denied')
    In the worker pods I get this in the logs:
    Copy code
    Log4j2Appender says: Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/stream.py", line 60, in filepath_iterator
        response = client.list_objects_v2(**kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 530, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 964, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 63, in check_connection
        for file_info in stream.filepath_iterator():
      File "/airbyte/integration_code/source_s3/stream.py", line 64, in filepath_iterator
        raise AirbyteTracedException(message, message, failure_type=FailureType.config_error)
    airbyte_cdk.utils.traced_exception.AirbyteTracedException: Access Denied
    
    2023-07-07 15:00:55 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/stream.py", line 60, in filepath_iterator
        response = client.list_objects_v2(**kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 530, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 964, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 63, in check_connection
        for file_info in stream.filepath_iterator():
      File "/airbyte/integration_code/source_s3/stream.py", line 64, in filepath_iterator
        raise AirbyteTracedException(message, message, failure_type=FailureType.config_error)
    airbyte_cdk.utils.traced_exception.AirbyteTracedException: Access Denied
    
    Log4j2Appender says: Check failed
    2023-07-07 15:00:55 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Check failed
    2023-07-07 15:00:55 INFO i.a.w.p.ExitCodeWatcher(persistExitCode):117 - Received exit code 0 for pod source-s3-check-bf89656d-54b0-4e0d-9393-790344798210-0-vqedb
    Log4j2Appender says: (pod: airbyte / source-s3-check-bf89656d-54b0-4e0d-9393-790344798210-0-vqedb) - Closed all resources for pod
    The cluster is currently running in EKS, and by running a pod with the AWS CLI installed in the namespace I could check that the SA is being used and that I can access the bucket. Am I missing something here?
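    Since a manually launched pod can reach the bucket, one common suspect is that the short-lived source-s3-check pod is not running under the annotated service account, so the IRSA credentials never get injected into the connector itself. A hedged way to confirm (the pod name is a placeholder; run this while the connection test is in progress):
    Copy code
    # Which service account do the connector job pods actually use?
    kubectl -n airbyte get pods -o custom-columns=NAME:.metadata.name,SA:.spec.serviceAccountName | grep source-s3-check

    # IRSA works by injecting these env vars via the EKS webhook; if they are missing on the
    # connector pod, boto3 inside it has no role credentials, which can surface as Access Denied
    kubectl -n airbyte get pod <source-s3-check-pod-name> -o yaml | grep -E "AWS_ROLE_ARN|AWS_WEB_IDENTITY_TOKEN_FILE"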

    Cesar Santos

    07/07/2023, 3:26 PM
    Hey all! It seems that the Zendesk source connector cannot populate the field custom_field_options even though the field is available in the Airbyte connection. Anyone else suffering from this problem?

    Andrés O. Arredondo

    07/07/2023, 3:53 PM
    Hi, I am trying a fresh docker deployment of Airbyte and I get the following error:
    Copy code
    java.lang.RuntimeException: Cannot upgrade from version 0.35.36-alpha to version 0.50.6 directly. First you must upgrade to version 0.37.0-alpha. After that upgrade is complete, you may upgrade to version 0.50.6.
    I did a
    docker system prune -a
    and tried the instructions on https://docs.airbyte.com/operator-guides/upgrading-airbyte/#overview for 0.32.0-alpha but with 0.37.0-alpha in the checkout, and it keeps showing the same error. I have installed Airbyte with Docker before, but if that's the problem then I don't know where I should delete something. The instructions on the page just say to run
    Copy code
    ./run-ab-platform.sh
    before cloning, so I am not sure what to do to fix this. A thread that appeared in search pointed to the 0.32.0-alpha error and seems to be fixed with the instructions in the link, but it doesn't work for me on this version.
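    The error appears because the airbyte_db Docker volume still holds the 0.35.36-alpha configuration data (docker system prune -a does not remove named volumes), so the platform insists on the stepped upgrade. A hedged sketch of that sequence, assuming a git clone of the airbyte repo and the Docker Compose setup described in the upgrade guide:
    Copy code
    # 1. Boot the intermediate version once against the existing volumes so it runs its migrations
    git checkout v0.37.0-alpha
    docker compose up -d        # wait until the webapp/server come up cleanly
    docker compose down

    # 2. Then start the current release again
    git checkout master
    ./run-ab-platform.sh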

    James Salmon

    07/07/2023, 3:54 PM
    Hi all, I am doing a replication from Redshift (Prod) to Redshift (Staging). However, when I do this, all of the datetime-with-timezone fields turn into varchars. How do I stop this from happening? I am running the latest versions of the source and destination Redshift connectors. Can anyone help?

    Eliott Bigiaoui

    07/07/2023, 5:14 PM
    Hi! I have set up a Postgres - Snowflake connection on Airbyte Cloud, and when I try to sync a few streams (10 tables) for the first time, the job runs for hours and seems to be stuck in an infinite loop with the following logs:
    Copy code
    2023-07-07 17:07:32 destination > INFO i.a.i.d.b.BufferManager(printQueueInfo):94 QUEUE INFO
      Global Mem Manager -- max: 1.2 GB, allocated: 66.45 MB (66.44576072692871 MB), % used: 0.05407369854181614
      Queue name: Table1, num records: 0, num bytes: 0 bytes
      Queue name: Table2, num records: 0, num bytes: 0 bytes
      Queue name: Table3, num records: 0, num bytes: 0 bytes
      Queue name: Table4, num records: 0, num bytes: 0 bytes
      Queue name: Table5, num records: 0, num bytes: 0 bytes
      Queue name: Table6, num records: 0, num bytes: 0 bytes
      Queue name: Table7, num records: 0, num bytes: 0 bytes
    2023-07-07 17:07:32 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:32 destination > INFO i.a.i.d.FlushWorkers(printWorkerInfo):134 WORKER INFO
      Pool queue size: 0, Active threads: 0
    2023-07-07 17:07:33 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:34 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:35 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:36 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:37 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:38 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:39 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:40 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:41 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:42 destination > INFO i.a.i.d.b.BufferManager(printQueueInfo):94 QUEUE INFO
      Global Mem Manager -- max: 1.2 GB, allocated: 66.45 MB (66.44576072692871 MB), % used: 0.05407369854181614
      Queue name: Table1, num records: 0, num bytes: 0 bytes
      Queue name: Table2, num records: 0, num bytes: 0 bytes
      Queue name: Table3, num records: 0, num bytes: 0 bytes
      Queue name: Table4, num records: 0, num bytes: 0 bytes
      Queue name: Table5, num records: 0, num bytes: 0 bytes
      Queue name: Table6, num records: 0, num bytes: 0 bytes
      Queue name: Table7, num records: 0, num bytes: 0 bytes
    2023-07-07 17:07:42 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:42 destination > INFO i.a.i.d.FlushWorkers(printWorkerInfo):134 WORKER INFO
      Pool queue size: 0, Active threads: 0
    2023-07-07 17:07:43 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    2023-07-07 17:07:44 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
    Other details:
    • the tables are created in Snowflake, but only some of the raw tables are populated (not the normalized ones)
    • I have tried to reset the streams; same problem after resetting
    • the sync worked yesterday when I had only one table
    Any idea? Thanks in advance!

    Semyon Komissarov

    07/07/2023, 6:51 PM
    I use an Amplitude connection and a BigQuery destination. After a successful sync I only see the raw table in BQ, and not the normalized table.

    Octavia Squidington III

    07/07/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 At 1pm PDT click here to join us on Zoom!

    Zack Parker

    07/07/2023, 9:33 PM
    I'm having trouble setting up Google Analytics 4 as a source. I've followed the instructions in the docs, but I'm getting an error saying that access was denied to the property. I can see that the service account is set up in my project and has been added as a viewer to the property, and I've enabled the Google Analytics API and Google Analytics Reporting API for the project as well.

    Zack Parker

    07/07/2023, 10:15 PM
    I'm using Airbyte Cloud. I just ran my first sync, and the system reports success, but I don't see any data in the destination database. I'm syncing from Google Analytics 4 to Postgres
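    A quick way to tell whether the sync wrote anything at all is to look for the Airbyte raw tables in Postgres (a hedged sketch; the schema and table names vary by destination version, hence the broad grep, and the connection string is a placeholder):
    Copy code
    # List every table and look for Airbyte-created ones (raw and final)
    psql "host=<host> dbname=<db> user=<user>" -c '\dt *.*' | grep -i airbyte

    # If a raw table exists, check whether it actually received rows (names are placeholders)
    psql "host=<host> dbname=<db> user=<user>" -c 'SELECT count(*) FROM <schema>."_airbyte_raw_<stream>";'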

    Rodrigo Mont'Alegre

    07/08/2023, 7:34 AM
    Hi, I have a MySQL connector that has been successfully running for weeks. Starting Thursday (06.07.2023) morning, Airbyte failed to connect to the data source. The provider of the MySQL database claims nothing has changed, and I am able to successfully connect using MySQL Workbench. Airbyte is running on AWS and is version 0.40.23. I would appreciate any insight. Here's part of the log:
    2023-07-08 07:04:31 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
    errors: $.auth_type: must be a constant value OAuth2.0, $.auth_type: does not have a value in the enumeration [OAuth2.0], $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
    2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
    errors: $.auth_type: must be a constant value Key Pair Authentication, $.auth_type: does not have a value in the enumeration [Key Pair Authentication], $.private_key: is missing but it is required
    2023-07-08 07:04:31 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
    errors: $.password: object found, string expected
    2023-07-08 07:04:31 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/63142/0/logs.log
    2023-07-08 07:04:31 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.23
    2023-07-08 07:04:31 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):179 - Attempt 0 to save workflow id for cancellation
    2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
    2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mysql:2.0.11 exists...
    2023-07-08 07:04:43 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mysql:2.0.11 was found locally.
    2023-07-08 07:04:44 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mysql-check-63142-0-izimd with resources io.airbyte.config.ResourceRequirements@525765e7[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    2023-07-08 07:04:44 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/63142/0 --log-driver none --name source-mysql-check-63142-0-izimd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-mysql:2.0.11 -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=0.40.23 -e WORKER_JOB_ID=63142 airbyte/source-mysql:2.0.11 check --config source_config.json
    2023-07-08 07:04:44 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
    2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.s.m.MySqlSource(main):407 starting source: class io.airbyte.integrations.source.mysql.MySqlSource
    2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
    2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):109 Command: CHECK
    2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):110 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-07-08 07:05:31 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-07-08 07:05:31 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-07-08 07:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
    2023-07-08 07:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
    2023-07-08 07:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
    2023-07-08 07:06:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
    2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
    2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):186 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.s.m.MySqlSource(main):409 completed source: class io.airbyte.integrations.source.mysql.MySqlSource
    2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-07-08 07:06:36 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-07-08 07:06:36 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server. ActivityId = cc93e74a-db81-37bc-a6b6-3895b739fa1f, ActivityType = RunWithJobOutput, WorkflowId=connection_manager_c68a63fd-ec58-45a3-b8ee-8d787ddf52e7, WorkflowType=ConnectionManagerWorkflow, RunId=61607809-4637-4161-b0f7-ef83a37e2d9d
    io.grpc.StatusRuntimeException: NOT_FOUND: invalid activityID or activity already timed out or invoking workflow is completed
    at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.50.2.jar:1.50.2]
    at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.50.2.jar:1.50.2]
    at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.50.2.jar:1.50.2]
    at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.respondActivityTaskCompleted(WorkflowServiceGrpc.java:3840) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.lambda$sendReply$0(ActivityWorker.java:303) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.lambda$retry$0(GrpcRetryer.java:52) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:67) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:60) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.retry(GrpcRetryer.java:50) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.sendReply(ActivityWorker.java:298) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:252) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2023-07-08 07:07:02 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/63142/0/logs.log
    2023-07-08 07:07:02 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.23
    2023-07-08 07:07:02 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):179 - Attempt 0 to save workflow id for cancellation
    2023-07-08 07:07:02 INFO i.a.c.i.LineGobbler(voidCall):114 -
    97616a89_8d1e_4a08_b894_b025230d2a39_logs_63142_txt.txt

    Raghav Mittal

    07/09/2023, 7:55 AM
    I have been trying to connect Redshift as a destination and Notion as a source, but it's showing an error and I am not able to establish a connection. I am getting the error below: Message: HikariPool-1 - Connection is not available, request timed out after 60001ms.
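    "HikariPool-1 - Connection is not available" usually means the JDBC connection to Redshift never gets established (security group, VPC, or the cluster not being reachable from where Airbyte runs) rather than a credentials problem. A hedged connectivity check (hostname is a placeholder; the docker exec variant assumes a Docker deployment with the default airbyte-worker container name and bash available in the image):
    Copy code
    # From the machine running Airbyte (default Redshift port is 5439)
    nc -vz -w 5 <cluster>.<id>.<region>.redshift.amazonaws.com 5439

    # The same check from inside the worker container, which is what actually opens the JDBC connection
    docker exec -it airbyte-worker bash -c '(echo > /dev/tcp/<redshift-host>/5439) && echo open'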

    Akilesh V

    07/09/2023, 1:55 PM
    Hello all, the connection is failing with the following error:
    Failed to detect if there is a schema change

    Nazif Ishrak

    07/10/2023, 1:05 AM
    Is there support for multiple cursor fields?

    Ekansh Verma

    07/10/2023, 9:28 AM
    Hi team! This is regarding Airbyte deployment via Helm. I want to supply a few secrets referenced from the values.yaml file. How can I use these secrets by just providing their names and the other necessary permissions?
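    The usual pattern is to create the Kubernetes Secret outside of values.yaml and have the chart reference it by name; exactly which values accept an existing secret (or an extraEnv entry with a secretKeyRef) depends on the chart version, so inspect the chart first. A hedged sketch (the secret and key names are placeholders):
    Copy code
    # 1. Create the secret out-of-band
    kubectl -n airbyte create secret generic airbyte-extra-secrets \
      --from-literal=DATABASE_PASSWORD='<value>'

    # 2. Find which values in your chart version accept an existing secret name
    helm show values airbyte/airbyte | grep -n -i -E "secretName|existingSecret|extraEnv"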

    Gerrit van Zyl

    07/10/2023, 11:27 AM
    Hi all - I'm setting up Airbyte to stream data from Postgres to my StarRocks instance - it creates the table on the StarRocks side but then just hangs. These are the last lines in the logs - any help would be appreciated.
    Copy code
    2023-07-10 11:12:14 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
    2023-07-10 11:12:14 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):255 executing on success close procedure.
    2023-07-10 11:12:14 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing agreement: 24684 records (26 MB)
    2023-07-10 11:12:14 destination > INFO i.a.i.d.s.DefaultStreamLoader(send):109 Stream loading, label : airbyte__airbyte_tmp_hav_agreement_4afc7597-60c1-41dc-957f-6731d5322f6a1688987534832, database : test, table : _airbyte_tmp_hav_agreement, request : PUT <http://abd.ser.net:8040/api/test/_airbyte_tmp_hav_agreement/_stream_load> HTTP/1.1
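    The last line shows the destination issuing a Stream Load PUT to the BE HTTP port (8040) and then waiting, which often points at a reachability or redirect problem between the container running the destination and the StarRocks FE/BE HTTP ports rather than at the data itself. A hedged first check (the host is taken from the log line above; 8030 is the default FE HTTP port, and Stream Load requests are typically redirected from FE to BE, so both usually need to be reachable from the worker):
    Copy code
    nc -vz -w 5 abd.ser.net 8030   # FE HTTP port
    nc -vz -w 5 abd.ser.net 8040   # BE HTTP port seen in the log above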

    Carolina Buckler

    07/10/2023, 1:56 PM
    Getting this error on all connections
    Copy code
    Last attempt:
    0 Bytes|no records|no records|Job id: 16197|27m 48s
    Failure Origin: airbyte_platform, Message: Something went wrong within the airbyte platform
    and within the logs
    Copy code
    Failure reason: scheduledEventId=58, startedEventId=59, activityType='RunWithJobOutput', activityId='94c73d73-e37d-316f-9a7a-ef6ac57efd11', identity='', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED

    Steven Wang

    07/10/2023, 4:42 PM
    Started getting this error with our Posthog to Snowflake connection a few days ago (previously was working fine):
    Copy code
    ValueError: time data '2023-07-08T13:00:28+00:00' does not match format '%Y-%m-%dT%H:%M:%S.%f%z'
    We are using Airbyte Cloud. Anyone else run into this?