# ask-community-for-troubleshooting

    Nipuna Prashan

    05/17/2023, 12:01 AM
    Hi Team, I am trying to copy MSSQL data to a Snowflake database using Airbyte. My connection is fine, but after the sync is done the source data has not been copied as-is to the Snowflake destination. My settings: Transformation: Raw data (JSON). The source MSSQL data has a column called ContentHash containing values like 0x47326255F46785C89D1F7DA and 0x916D001C7A7B06F50B9DAA. After the sync, I see the following data in my Snowflake tables for the ContentHash column: G2bU�g�ȝ}��L�)�X�W{ G2bU�g�ȝ}��L�)�X�W{ I believe this is because of the default encoding of the COPY INTO command in the Airbyte Snowflake connector. https://github.com/airbytehq/airbyte/blob/v0.44.4/airbyte-integrations/connectors/[…]estination/snowflake/SnowflakeInternalStagingSqlOperations.java Is there a way to fix this?
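    The garbled values are consistent with the VARBINARY bytes being written out as raw text: a minimal sketch showing why 0x47326255... renders as "G2bU...", plus a hex-string helper as one possible source-side workaround (the helper is hypothetical illustration, not connector code):

```python
# Why 0x47326255... shows up as "G2bU...": the binary is emitted as raw bytes,
# which the destination then interprets as text.
raw = bytes.fromhex("47326255")        # first 4 bytes of the sample hash
print(raw.decode("latin-1"))           # -> G2bU, the prefix seen in Snowflake

def to_hex_literal(b: bytes) -> str:
    """Render binary as the 0x... text MSSQL displays (lossless, ASCII-safe)."""
    return "0x" + b.hex().upper()

print(to_hex_literal(raw))             # -> 0x47326255
```

    Casting the binary column to a hex string before it leaves the source avoids relying on the destination's character encoding entirely.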

    Marcus Vicentini

    05/17/2023, 2:14 AM
    Hi everyone, I'm facing a problem with the SFTP Bulk connector. I am trying to read a CSV file encoded in UTF-16, and every time I trigger the connection I get the following error: ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):164 - 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte Is there any way to change the connector settings to read the CSV file in UTF-16? Thanks
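    The 0xff at position 0 is consistent with a UTF-16 byte-order mark being fed to a UTF-8 decoder; a minimal sketch of the failure and the explicit-encoding fix:

```python
# A UTF-16 file starts with a BOM, whose bytes are never valid UTF-8.
data = "col1,col2\n1,2\n".encode("utf-16")

try:
    data.decode("utf-8")
except UnicodeDecodeError as exc:
    print(exc)  # reports an invalid start byte at position 0, as in the log

text = data.decode("utf-16")  # a BOM-aware decode succeeds
print(text.splitlines()[0])   # -> col1,col2
```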

    DR

    05/17/2023, 8:16 AM
    Any idea why the AppStore source tables are listed during the connection setup?

    Budiono Santoso

    05/17/2023, 9:07 AM
    Hello everyone. I'm trying a DynamoDB source connection and get an error: 2023-05-17 090634 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/be2a28b8-a202-4f77-a2fa-63599056ae8a/0/logs.log. How do I fix this error?

    Yusuf Mirkar

    05/17/2023, 9:50 AM
    Hello everyone, new here. I really love the postgres<>postgres connection for replication and will soon use it in production. One question before that: what is the meaning of non-breaking schema changes?

    Yusuf Mirkar

    05/17/2023, 11:06 AM
    Hello everyone. Suppose I am using incremental deduped+history sync and a breaking schema change appeared on the source, so I deleted the existing incremental connection and created a new connection with full refresh overwrite. From the next sync onwards I again want incremental deduped+history. What should I do: edit the full-refresh connection back to incremental, or create a new incremental connection?

    Alban Dumouilla

    05/17/2023, 12:24 PM
    Hi Airbyte team, I'm trying to connect to a Hubspot source using my customers' OAuth data. I created the source through the API using my Hubspot client_id, client_secret, and the client's refresh_token, and from what I understand I need to create an OAuth override in my workspace. Here's my request to create it:
    curl --request PUT \
      --url https://api.airbyte.com/v1/workspaces/[WP-id]/oauthCredentials \
      --header 'authorization: Bearer [API_KEY]' \
      --header 'content-type: application/json' \
      --data '
    {
      "configuration": {
        "redirect_uri": "http://api-insights.pyko.co/auth/hubspot/callback"
      },
      "name": "[SOURCE_NAME]",
      "actorType": "source"
    }'
    But it always gives me the same error:
    {
      "type": "https://reference.airbyte.com/reference/errors",
      "title": "value-not-found",
      "status": 400,
      "detail": "Submitted value could not be found: [SOURCE_NAME]"
    }
    To be precise, for privacy reasons I replaced my real source name with [SOURCE_NAME]; the name I'm actually using is an existing source in my workspace. I must be doing something wrong, but what? The documentation on this part is very unclear. Thanks a lot!

    Yusuf Mirkar

    05/17/2023, 12:44 PM
    Hi, when I added a new column to the source during replication, I got this message after clicking the "Refresh source schema" button: "Due to changes in the stream configuration, we recommend a data reset. A reset will delete data in the destination of the affected streams and then re-sync that data. Skipping the reset is discouraged and might lead to unexpected behavior." Does it mean that it will do a full refresh for every new column? I am using incremental deduped + history mode.

    Yusuf Mirkar

    05/17/2023, 1:06 PM
    Hi all, even in incremental deduped + history, 3 tables are being created: _scd, raw, and final. 3 tables are created in cdc, but only 2 should be created in incremental deduped + history. Reference link: https://airbytehq.slack.com/archives/C01AHCD885S/p1684318581542639?thread_ts=1684317481.420539&cid=C01AHCD885S

    Yusuf Mirkar

    05/17/2023, 1:37 PM
    Hi all, how much load will Airbyte put on the source DB while a full refresh - overwrite sync is running?

    Pascal Moreau

    05/17/2023, 3:26 PM
    Hello all 🙂 I am setting up the Postgres CDC connection and I have 2 issues: • On the Airbyte side: when the connection does not include all the tables from the source, it finds 0 changes even when there are some (cf. screenshot); when all tables are replicating, it works well. • On the Postgres side, the WAL does not seem to be deleted: in the replication slot, the confirmed_flush_lsn value changes but the restart_lsn does not. In my understanding the restart_lsn should update when a sync is done, shouldn't it? Thanks in advance 🙂 Airbyte version: 0.40.22. Postgres connector: 1.0.25
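    For checking slot lag, the LSN strings from pg_replication_slots can be compared numerically; a small sketch (the LSN values below are made up for illustration):

```python
def lsn_to_int(lsn: str) -> int:
    """Convert a Postgres LSN such as '0/16B6C50' into a comparable integer.

    An LSN is 'hi/lo' in hex, where hi is the upper 32 bits of a 64-bit
    WAL position.
    """
    hi, lo = lsn.split("/")
    return (int(hi, 16) << 32) | int(lo, 16)

# Made-up example values: restart_lsn trailing confirmed_flush_lsn
restart_lsn = "0/16B6C50"
confirmed_flush_lsn = "0/16C0000"
lag = lsn_to_int(confirmed_flush_lsn) - lsn_to_int(restart_lsn)
print(lag)  # bytes of WAL still pinned by the slot
```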

    Ivan Zhabin

    05/17/2023, 3:57 PM
    I have a question about the Salesforce connector: is there any way to download objects that have been deleted?

    Gabriel Levine

    05/17/2023, 4:04 PM
    Can I change the timeout for schema discovery in the webapp? I'm running Kubernetes via Helm.

    Abdeljalil

    05/17/2023, 4:44 PM
    Dear team, I'm having connection issues with the file connector; you'll find the logs below. I couldn't find any directions or similar issues. Can you help, please?
    2023-05-17 16:36:50 ERROR i.a.c.i.LineGobbler(voidCall):149 - docker: Error response from daemon: exec: "docker-init": executable file not found in $PATH.
    2023-05-17 16:36:50 WARN i.a.w.g.DefaultCheckConnectionWorker(run):108 - Check connection job subprocess finished with exit code 127
    2023-05-17 16:36:50 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):125 - Unexpected error while checking connection: 
    io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
    	at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:267) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:118) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:43) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.44.4.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2023-05-17 16:36:50 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
    io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:127) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:43) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.44.4.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
    	at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:267) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:118) ~[io.airbyte-airbyte-commons-worker-0.44.4.jar:?]
    	... 3 more
    2023-05-17 16:36:50 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-05-17 16:36:50 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK -----

    Bruno Alano

    05/17/2023, 8:41 PM
    [Facebook Marketing Connector with Errors]
    Hello, I've tried both Airbyte Cloud and self-hosted, and I keep getting an error from the Facebook Marketing/Ads connector. Alternatives such as the Singer tap-facebook and Meltano seem to work with the same access token. Is there a workaround for this?
    2023-05-17 20:32:41 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):450 - failures: [ {
      "failureOrigin" : "source",
      "failureType" : "system_error",
      "internalMessage" : "\n\n  Message: Call was not successful\n  Method:  GET\n  Path:    <https://graph.facebook.com/v16.0/act_766503135008104/>\n  Params:  {'fields': 'account_id,account_status,age,amount_spent,balance,business,business_city,business_country_code,business_name,business_state,business_street,business_street2,business_zip,can_create_brand_lift_study,capabilities,created_time,currency,disable_reason,end_advertiser,end_advertiser_name,extended_credit_invoice_group,failed_delivery_checks,fb_entity,funding_source,funding_source_details,has_advertiser_opted_in_odax,has_migrated_permissions,id,io_number,is_attribution_spec_system_default,is_direct_deals_enabled,is_in_3ds_authorization_enabled_market,is_notifications_enabled,is_personal,is_prepay_account,is_tax_id_required,line_numbers,media_agency,min_campaign_group_spend_cap,min_daily_budget,name,offsite_pixels_tos_accepted,owner,partner,rf_spec,spend_cap,tax_id,tax_id_status,tax_id_type,timezone_id,timezone_name,timezone_offset_hours_utc,tos_accepted,user_tasks,user_tos_accepted'}\n\n  Status:  400\n  Response:\n    {\n      \"error\": {\n        \"message\": \"Unsupported request - method type: get\",\n        \"type\": \"GraphMethodException\",\n        \"code\": 100,\n        \"fbtrace_id\": \"ATGSAEKKmxLP9juhP_MIaIu\"\n      }\n    }\n",
      "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 1,
        "from_trace_message" : true,
        "connector_command" : "read"
      },
      "stacktrace" : "Traceback (most recent call last):\n  File \"/airbyte/integration_code/main.py\", line 13, in <module>\n    launch(source, sys.argv[1:])\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 156, in launch\n    for message in source_entrypoint.run(parsed_args):\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 98, in run\n    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 132, in read\n    yield from self.source.read(self.logger, config, catalog, state)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 139, in read\n    raise e\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 120, in read\n    yield from self._read_stream(\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 194, in _read_stream\n    for record in record_iterator:\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 324, in _read_full_refresh\n    for record_data_or_message in record_data_or_messages:\n  File \"/airbyte/integration_code/source_facebook_marketing/streams/base_streams.py\", line 113, in read_records\n    for record in loaded_records_iter:\n  File \"/airbyte/integration_code/source_facebook_marketing/streams/base_streams.py\", line 109, in <genexpr>\n    loaded_records_iter = (record.api_get(fields=self.fields, pending=self.use_batch) for record in records_iter)\n  File \"/usr/local/lib/python3.9/site-packages/facebook_business/adobjects/adaccount.py\", line 259, in api_get\n    return request.execute()\n  File \"/usr/local/lib/python3.9/site-packages/facebook_business/api.py\", line 682, in execute\n    response = self._api.call(\n  File 
\"/usr/local/lib/python3.9/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/airbyte/integration_code/source_facebook_marketing/api.py\", line 152, in call\n    response = super().call(method, path, params, headers, files, url_override, api_version)\n  File \"/usr/local/lib/python3.9/site-packages/facebook_business/api.py\", line 350, in call\n    raise fb_response.error()\nfacebook_business.exceptions.FacebookRequestError: \n\n  Message: Call was not successful\n  Method:  GET\n  Path:    <https://graph.facebook.com/v16.0/act_766503135008104/>\n  Params:  {'fields': 'account_id,account_status,age,amount_spent,balance,business,business_city,business_country_code,business_name,business_state,business_street,business_street2,business_zip,can_create_brand_lift_study,capabilities,created_time,currency,disable_reason,end_advertiser,end_advertiser_name,extended_credit_invoice_group,failed_delivery_checks,fb_entity,funding_source,funding_source_details,has_advertiser_opted_in_odax,has_migrated_permissions,id,io_number,is_attribution_spec_system_default,is_direct_deals_enabled,is_in_3ds_authorization_enabled_market,is_notifications_enabled,is_personal,is_prepay_account,is_tax_id_required,line_numbers,media_agency,min_campaign_group_spend_cap,min_daily_budget,name,offsite_pixels_tos_accepted,owner,partner,rf_spec,spend_cap,tax_id,tax_id_status,tax_id_type,timezone_id,timezone_name,timezone_offset_hours_utc,tos_accepted,user_tasks,user_tos_accepted'}\n\n  Status:  400\n  Response:\n    {\n      \"error\": {\n        \"message\": \"Unsupported request - method type: get\",\n        \"type\": \"GraphMethodException\",\n        \"code\": 100,\n        \"fbtrace_id\": \"ATGSAEKKmxLP9juhP_MIaIu\"\n      }\n    }\n\n",
      "timestamp" : 1684355539962
    },

    Leslie Strauss

    05/17/2023, 11:05 PM
    Hi there, my team is seeing intermittent connection errors to both our source (an RDS instance) and destination (Snowflake). For each, the connection tests succeed on about half of the tries. We are running Airbyte on Kubernetes (EKS). Here is the worker stack trace we are seeing:
    Log4j2Appender says: Unexpected error while checking connection:
    2023-05-17 22:45:26 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):125 - Unexpected error while checking connection:
    io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
    	at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:267) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:118) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:43) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.44.2.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Log4j2Appender says: Completing future exceptionally...
    Log4j2Appender says:
    2023-05-17 22:45:26 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
    io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:127) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:43) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.44.2.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
    2023-05-17 22:45:26 WARN i.t.i.s.WorkflowExecuteRunnable(throwAndFailWorkflowExecution):134 - Workflow execution failure WorkflowId='7b3b6006-d2c1-4b3d-af4c-db1872bf8af9', RunId=fd19da30-f939-42ca-b82b-f4bdc34fc8cd, WorkflowType='CheckConnectionWorkflow'
    io.temporal.failure.ActivityFailure: scheduledEventId=6, startedEventId=7, activityType='RunWithJobOutput', activityId='eecf8568-b1c7-3856-bb15-69f81625bb5c', identity='1@airbyte-staging-worker-6d58f9c954-wfwvw', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
    	at java.lang.Thread.getStackTrace(Thread.java:2550) ~[?:?]
    	at io.temporal.internal.sync.ActivityStubBase.execute(ActivityStubBase.java:49) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.ActivityInvocationHandler.lambda$getActivityFunc$0(ActivityInvocationHandler.java:78) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.ActivityInvocationHandlerBase.invoke(ActivityInvocationHandlerBase.java:60) ~[temporal-sdk-1.17.0.jar:?]
    	at jdk.proxy2.$Proxy89.runWithJobOutput(Unknown Source) ~[?:?]
    	at io.airbyte.workers.temporal.check.connection.CheckConnectionWorkflowImpl.run(CheckConnectionWorkflowImpl.java:54) ~[io.airbyte-airbyte-workers-0.44.2.jar:?]
    	at CheckConnectionWorkflowImplProxy.run$accessor$ywRkvQZf(Unknown Source) ~[?:?]
    	at CheckConnectionWorkflowImplProxy$auxiliary$i7FPOmWz.call(Unknown Source) ~[?:?]
    	at io.airbyte.workers.temporal.support.TemporalActivityStubInterceptor.execute(TemporalActivityStubInterceptor.java:79) ~[io.airbyte-airbyte-workers-0.44.2.jar:?]
    	at CheckConnectionWorkflowImplProxy.run(Unknown Source) ~[?:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
    	at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation$RootWorkflowInboundCallsInterceptor.execute(POJOWorkflowImplementationFactory.java:302) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.POJOWorkflowImplementationFactory$POJOWorkflowImplementation.execute(POJOWorkflowImplementationFactory.java:277) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowExecuteRunnable.run(WorkflowExecuteRunnable.java:71) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.SyncWorkflow.lambda$start$0(SyncWorkflow.java:116) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.CancellationScopeImpl.run(CancellationScopeImpl.java:102) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.sync.WorkflowThreadImpl$RunnableWrapper.run(WorkflowThreadImpl.java:106) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.worker.ActiveThreadReportingExecutor.lambda$submit$0(ActiveThreadReportingExecutor.java:53) ~[temporal-sdk-1.17.0.jar:?]
    	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:577) ~[?:?]
    	at java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    	at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:267) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:118) ~[io.airbyte-airbyte-commons-worker-0.44.2.jar:?]
    	... 3 more
    Do you have any sense as to what the issue may be?

    Ankit Kumar

    05/18/2023, 4:20 AM
    Hi team, I need some info about CursorPagination: https://docs.airbyte.com/connector-development/config-based/understanding-the-yaml-file/pagination/#cursor-paginator-in-path

    Nathan Gold

    05/18/2023, 2:52 PM
    Hello, I am experiencing an issue with Airbyte not correctly copying data from my organization's Postgres DB to BigQuery. Not all of the records I expect to be copied are being copied over (originally we were missing 1 record, and after our automated sync ran again we are now missing more than 2,000 records). We had thought the sync would solve our issue, but it appears not to have. In addition, before we ran this sync, we noticed that older records previously written to BigQuery by Airbyte were being altered in a way we could not understand; these were not records that should have been updated at all. Any assistance here would be greatly appreciated. @Divine

    Joel Olazagasti

    05/18/2023, 2:52 PM
    Is it possible (without breaking things) to set up a full-refresh sync that overwrites the base tables (but not the SCD tables) of a source that's incremental-deduped? We want to refresh our Salesforce sync once a week, since the calculated fields start to drift.

    Yusuf Mirkar

    05/18/2023, 3:02 PM
    How do I delete older sync logs to save space?
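    As one possible approach for a Docker deployment, old job log files can be pruned from the workspace volume; a sketch under the assumption that logs live under a path like /tmp/workspace (adjust the path and retention for your setup):

```python
import os
import time
from pathlib import Path

def prune_logs(root: str, max_age_days: int = 30) -> int:
    """Delete *.log files under root older than max_age_days; return count removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for path in Path(root).rglob("*.log"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed += 1
    return removed

# Example (hypothetical path used by Docker deployments):
# prune_logs("/tmp/workspace", max_age_days=14)
```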

    Kala Aditya

    05/18/2023, 3:02 PM
    Hello, I am new to Airbyte and trying to build a custom source for an AWS service. I followed the Python Source method to create it. When syncing to Local CSV (or any other destination), a CSV file is written with only the _airbyte___data, _airbyte_id, and _airbyte_emitted_at columns, but no data, and the sync finishes successfully without any errors. The source is able to get the data from the API, but it seems unable to pass it to the destination. Here is the log:
    2023-05-18 14:51:07 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/6/0/logs.log
    2023-05-18 14:51:07 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.17
    2023-05-18 14:51:08 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
    2023-05-18 14:51:08 INFO i.a.c.EnvConfigs(getEnvOrDefault):1079 - Using default value for environment variable METRIC_CLIENT: ''
    2023-05-18 14:51:08 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
    2023-05-18 14:51:08 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
    2023-05-18 14:51:08 INFO i.a.w.g.DefaultReplicationWorker(run):125 - start sync worker. job id: 6 attempt id: 0
    2023-05-18 14:51:08 INFO i.a.w.g.DefaultReplicationWorker(run):141 - configured sync modes: {null.EVA and Equity  EVA by Industry=full_refresh - overwrite}
    2023-05-18 14:51:08 INFO i.a.w.i.DefaultAirbyteDestination(start):65 - Running destination...
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-csv:0.2.10 exists...
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-csv:0.2.10 was found locally.
    2023-05-18 14:51:08 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-csv-write-6-0-cneip with resources io.airbyte.config.ResourceRequirements@6f08b143[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    2023-05-18 14:51:08 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name destination-csv-write-6-0-cneip --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-csv:0.2.10 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.17 -e WORKER_JOB_ID=6 airbyte/destination-csv:0.2.10 write --config destination_config.json --catalog destination_catalog.json
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-aws-dataexchange-api:dev exists...
    2023-05-18 14:51:08 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-aws-dataexchange-api-read-6-0-znwgl with resources io.airbyte.config.ResourceRequirements@7ee6c8e6[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    2023-05-18 14:51:08 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-aws-dataexchange-api:dev was found locally.
    2023-05-18 14:51:08 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name source-aws-dataexchange-api-read-6-0-znwgl --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-aws-dataexchange-api:dev -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.17 -e WORKER_JOB_ID=6 airbyte/source-aws-dataexchange-api:dev read --config source_config.json --catalog source_catalog.json
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):372 - Source has no more messages, closing connection.
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):381 - Total records read: 0 (0 bytes)
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(run):190 - One of source or destination thread complete. Waiting on the other.
    2023-05-18 14:51:11 destination > 2023-05-18 14:51:11 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
    2023-05-18 14:51:11 destination > 2023-05-18 14:51:11 INFO i.a.i.d.c.CsvDestination$CsvConsumer(close):179 - finalizing consumer.
    2023-05-18 14:51:11 destination > 2023-05-18 14:51:11 INFO i.a.i.d.c.CsvDestination$CsvConsumer(close):195 - File output: /local/test/_airbyte_raw_EVA_and_Equity_EVA_by_Industry.csv
    2023-05-18 14:51:11 destination > 2023-05-18 14:51:11 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.csv.CsvDestination
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(run):192 - Source and destination threads complete.
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(run):297 - Source did not output any state messages
    2023-05-18 14:51:11 WARN i.a.w.g.DefaultReplicationWorker(run):308 - State capture: No state retained.
    2023-05-18 14:51:08 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):454 - Destination output thread started.
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
        "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
        "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
        "replicationStartTime" : 1684421468080,
        "replicationEndTime" : 1684421471222,
        "sourceReadStartTime" : 1684421468130,
        "sourceReadEndTime" : 1684421471088,
        "destinationWriteStartTime" : 1684421468224,
        "destinationWriteEndTime" : 1684421471222
      },
      "streamStats" : [ ]
    }
    2023-05-18 14:51:11 INFO i.a.w.g.DefaultReplicationWorker(run):317 - failures: [ ]
    2023-05-18 14:51:11 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
    2023-05-18 14:51:11 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):176 - sync summary:
    Any help would be appreciated
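    The log's "Total records read: 0 (0 bytes)" line suggests the source emitted no record messages at all; a common cause in Python sources is building records without yielding them. A minimal, hypothetical sketch of the expected generator pattern (not the actual connector code):

```python
from typing import Iterable, Mapping

def read_records(api_rows: Iterable[Mapping]) -> Iterable[Mapping]:
    """Records must be yielded one by one; returning a list from a non-generator
    path, or forgetting the yield, produces a successful-but-empty sync."""
    for row in api_rows:
        yield dict(row)

rows = [{"industry": "Tech", "eva": 1.2}]
print(list(read_records(rows)))  # -> [{'industry': 'Tech', 'eva': 1.2}]
```

    If the generator itself is fine, the next thing to check is that the stream name in the configured catalog matches the name the source emits, since mismatched names can also lead to records being dropped silently.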

    Yusuf Mirkar

    05/18/2023, 3:06 PM
    How do I limit a sync from using too much RAM?

    Sunny Hashmi

    05/18/2023, 4:18 PM
    👋 Hi all, we have an exciting announcement to share! Next week's Daily Airbyte Office Hours will feature Deep Dive Sessions hosted by the one and only @Marcos Marx. During the deep-dive sessions, Marcos will explain how Airbyte works, delving into one component per session and explaining its function. If you're curious or want to learn more about Airbyte, these sessions will be truly valuable to you. For the first week we're diving into the airbyte-bootloader and airbyte-db services. The presentation will be 20 min, and we'll dedicate the remaining 25 min to questions about the daily topic or general Q&A. Check out the schedule below 👇 Reminders and updates will be posted in #C045VK5AF54
    🔥 Deep Dive Sessions: airbyte-bootloader • Monday May 22 - 1pm PDT (zoom link) • Tuesday May 23 - 16:00 CEST / 10am EDT (zoom link)
    🔥 Deep Dive Sessions: airbyte-db + Airbyte Database Internals • Wednesday May 24 - 1pm PDT (zoom link) • Thursday May 25 - 16:00 CEST / 10am EDT (zoom link)
    🔥 Open Q&A • Friday May 26 - 1pm PDT (zoom link)
    Hope to see you there!

    Raphael Pacheco

    05/18/2023, 5:11 PM
    Hello! I need to replicate a Postgres instance into BigQuery, and I know this is possible with Airbyte. However, within this Postgres instance we have several databases, named after the identification codes of a company's branches. From time to time the number of databases may increase as new branches appear. How can I make the Airbyte integration dynamic, picking up these new databases, without needing to create new connections every time a branch is created?

    Yusuf Mirkar

    05/18/2023, 5:26 PM
    What value should JOB_MAIN_CONTAINER_MEMORY_REQUEST have if I want to request a minimum of 2 GB of RAM?
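    On Kubernetes deployments this variable takes a resource-quantity string, so 2 GiB would typically be written as "2Gi" (an assumption based on the Kubernetes quantity convention). A sketch of how such strings map to bytes:

```python
# Kubernetes-style quantity suffixes: binary (Ki/Mi/Gi) and decimal (K/M/G).
# Binary suffixes are checked first so "Gi" is not mistaken for "G".
SUFFIXES = {"Ki": 2**10, "Mi": 2**20, "Gi": 2**30, "K": 10**3, "M": 10**6, "G": 10**9}

def quantity_to_bytes(q: str) -> int:
    """Parse a quantity string like '2Gi' or '512M' into a byte count."""
    for suffix, mult in SUFFIXES.items():
        if q.endswith(suffix):
            return int(q[: -len(suffix)]) * mult
    return int(q)  # a bare number is plain bytes

print(quantity_to_bytes("2Gi"))  # -> 2147483648, i.e. 2 GiB
```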

    Yusuf Mirkar

    05/18/2023, 7:31 PM
    If I have 40 tables, when I start a sync will it sync all of them at once, or how many at the same time?

    Slackbot

    05/18/2023, 7:35 PM
    This message was deleted.

    Andre Santos

    05/18/2023, 7:42 PM
    Hi folks, I'm testing the NetSuite connector here. I found that the connection is extracting only 1000 records every 30 minutes... Do you know if this is expected, or whether it indicates an issue with the API or the configuration?

    Andre Santos

    05/18/2023, 7:43 PM
    The connection is taking too long to run, and after 9 hours the connection fails due to a timeout. So the first attempt extracts, say, 12000 records but fails; then the next attempt starts all over, losing all the progress achieved. In the end, the connection never extracts all the invoices... 😞

    Yusuf Mirkar

    05/18/2023, 7:55 PM
    what is
    set new fetch size
    in sync logs