# ask-community-for-troubleshooting
  • Carolina Buckler
    06/08/2023, 6:47 PM
    Just upgraded to v0.50.0 to test the new schema propagation options, but I don’t see the new options available in my connections. Is this only available for certain sources?
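    One way to check what a connection currently has configured, assuming a local Docker deployment with default basic auth, is the internal Config API's web_backend/connections/get endpoint. Treat the nonBreakingChangesPreference field and its values as an assumption based on the 0.50.x release notes rather than a confirmed contract:

    import requests

    resp = requests.post(
        "http://localhost:8000/api/v1/web_backend/connections/get",
        json={"connectionId": "<connection-uuid>", "withRefreshedCatalog": False},
        auth=("airbyte", "password"),  # default credentials of a local install
    )
    resp.raise_for_status()
    # Assumed field name; the propagation options surfaced in the 0.50.x line.
    print(resp.json().get("nonBreakingChangesPreference"))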
  • Naren Kadiri
    06/08/2023, 7:41 PM
    Hi everyone, I'm using the SQL Server connector to sync about 1 billion records, and after some time I get a connection-closed error. My strong guess is a connection timeout, since the query takes a long time to execute. I would like to configure this execution/connection timeout from Airbyte. Can someone please guide me on how to make this work?
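    A hedged sketch of one approach: Airbyte's JDBC sources generally expose a "JDBC URL Params" field (jdbc_url_params), and Microsoft's JDBC driver accepts timeout settings there; whether your source-mssql version honors them is an assumption. Per the driver docs, socketTimeout is in milliseconds and queryTimeout in seconds, with 0 / -1 meaning no limit:

    # Sketch of a source config fragment; host/database are placeholders, and
    # passing driver timeouts through jdbc_url_params is an assumption about
    # this connector version. The ';' separator follows SQL Server URL syntax.
    source_config = {
        "host": "mssql.example.com",
        "port": 1433,
        "database": "warehouse",
        "jdbc_url_params": "socketTimeout=0;queryTimeout=-1;loginTimeout=300",
    }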
  • Alexander Ettingshausen
    06/08/2023, 8:13 PM
    Hi there, I have a question regarding the Amazon Ads connector. Does the StartDate parameter resolve to the start of the day (00:00 UTC, i.e. YYYY-MM-DDT00:00:00Z), or does it resolve to my configured StartDate combined with the current sync time?
  • Jamshid Hashimi
    06/08/2023, 10:56 PM
    The MySQL to Typesense connection was working fine, but I've been getting this error for the past couple of hours. MySQL and Typesense both work fine and I can connect to them through other means. The logs:
    2023-06-08 22:25:41 replication-orchestrator > failures: [ {
      "failureOrigin" : "destination",
      "failureType" : "system_error",
      "internalMessage" : "('Connection aborted.', timeout('The write operation timed out'))",
      "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
      "metadata" : {
        "attemptNumber" : 2,
        "jobId" : 2509946,
        "from_trace_message" : true,
        "connector_command" : "write"
      },
      "stacktrace" : "Traceback (most recent call last):\n  File \"/usr/local/lib/python3.9/site-packages/requests/adapters.py\", line 489, in send\n    resp = conn.urlopen(\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py\", line 787, in urlopen\n    retries = retries.increment(\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py\", line 550, in increment\n    raise six.reraise(type(error), error, _stacktrace)\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py\", line 769, in reraise\n    raise value.with_traceback(tb)\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py\", line 703, in urlopen\n    httplib_response = self._make_request(\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py\", line 398, in _make_request\n    conn.request(method, url, **httplib_request_kw)\n  File \"/usr/local/lib/python3.9/site-packages/urllib3/connection.py\", line 239, in request\n    super(HTTPConnection, self).request(method, url, body=body, headers=headers)\n  File \"/usr/local/lib/python3.9/http/client.py\", line 1285, in request\n    self._send_request(method, url, body, headers, encode_chunked)\n  File \"/usr/local/lib/python3.9/http/client.py\", line 1331, in _send_request\n    self.endheaders(body, encode_chunked=encode_chunked)\n  File \"/usr/local/lib/python3.9/http/client.py\", line 1280, in endheaders\n    self._send_output(message_body, encode_chunked=encode_chunked)\n  File \"/usr/local/lib/python3.9/http/client.py\", line 1079, in _send_output\n    self.send(chunk)\n  File \"/usr/local/lib/python3.9/http/client.py\", line 1001, in send\n    self.sock.sendall(data)\n  File \"/usr/local/lib/python3.9/ssl.py\", line 1204, in sendall\n    v = self.send(byte_view[count:])\n  File \"/usr/local/lib/python3.9/ssl.py\", line 1173, in send\n    return self._sslobj.write(data)\nurllib3.exceptions.ProtocolError: ('Connection aborted.', timeout('The write operation timed out'))\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/airbyte/integration_code/main.py\", line 11, in <module>\n    DestinationTypesense().run(sys.argv[1:])\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 119, in run\n    for message in output_messages:\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 113, in run_cmd\n    yield from self._run_write(config=config, configured_catalog_path=parsed_args.catalog, input_stream=wrapped_stdin)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 49, in _run_write\n    yield from self.write(config=config, configured_catalog=catalog, input_messages=input_messages)\n  File \"/airbyte/integration_code/destination_typesense/destination.py\", line 50, in write\n    writer.flush()\n  File \"/airbyte/integration_code/destination_typesense/writer.py\", line 34, in flush\n    self.client.collections[self.steam_name].documents.import_(self.write_buffer)\n  File \"/usr/local/lib/python3.9/site-packages/typesense/documents.py\", line 60, in import_\n    api_response = <http://self.api_call.post|self.api_call.post>(self._endpoint_path('import'), docs_import, params, as_json=False)\n  File \"/usr/local/lib/python3.9/site-packages/typesense/api_call.py\", line 141, in post\n    return self.make_request(<http://requests.post|requests.post>, endpoint, as_json,\n  File 
\"/usr/local/lib/python3.9/site-packages/typesense/api_call.py\", line 127, in make_request\n    raise last_exception\n  File \"/usr/local/lib/python3.9/site-packages/typesense/api_call.py\", line 101, in make_request\n    r = fn(url, headers={ApiCall.API_KEY_HEADER_NAME: self.config.api_key}, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/requests/api.py\", line 115, in post\n    return request(\"post\", url, data=data, json=json, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/requests/api.py\", line 59, in request\n    return session.request(method=method, url=url, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/requests/sessions.py\", line 587, in request\n    resp = self.send(prep, **send_kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/requests/sessions.py\", line 701, in send\n    r = adapter.send(request, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/requests/adapters.py\", line 547, in send\n    raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', timeout('The write operation timed out'))\n",
      "timestamp" : 1686263140893
    }, {
      "failureOrigin" : "destination",
      "internalMessage" : "Destination process exited with non-zero exit code 1",
      "externalMessage" : "Something went wrong within the destination connector",
      "metadata" : {
        "attemptNumber" : 2,
        "jobId" : 2509946,
  • Martin Jung
    06/09/2023, 2:43 AM
    Hey there, I'm using the new column selection feature with Airbyte v0.50.0 and I'm trying to import my configs using Octavia CLI (v0.44.4). I'm getting the following error:
    airbyte_api_client.exceptions.ApiTypeError: Invalid type for variable '0'. Required value type is SelectedFieldInfo and passed type was dict at ['selected_fields'][0]
    Is this because the CLI version doesn't match the Airbyte version? If so, is there a release of Octavia CLI coming soon? I'm using the Docker image for octavia-cli.
  • Chidambara Ganapathy
    06/09/2023, 5:14 AM
    Hi team, I am getting the error "Normalisation failed during dbt run" while connecting QuickBooks to Snowflake. Why is this happening? Any suggestions? Thanks!
  • Gary K
    06/09/2023, 6:20 AM
    I'm looking ahead for hills/holes in the road, and wondering whether syncing decimal data has caveats: i.e., a source database decimal column with precision P and scale S is converted to JSON (losing P and S?), then written to the destination database (Snowflake) as which data type?
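    A minimal sketch of the concern, assuming values travel as JSON numbers: JSON itself has no decimal type, so anything funneled through a 64-bit float keeps only about 15-17 significant digits, while serializing as a string preserves the value exactly at the cost of a text type downstream.

    import json
    from decimal import Decimal

    # A value wider than float64 precision, e.g. from a NUMERIC(28, 18) column.
    value = Decimal("1234567890.123456789012345678")

    as_float = json.dumps(float(value))  # digits beyond double precision are lost
    as_string = json.dumps(str(value))   # exact, but lands as TEXT unless cast

    print(as_float)
    print(as_string)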
  • Rishav Sinha
    06/09/2023, 8:37 AM
    Hi team, I am not able to connect to Elasticsearch on an internal endpoint (it is accessible through a VPN).
  • Chidambara Ganapathy
    06/09/2023, 8:52 AM
    Hi team, when will the QuickBooks source beta version be released? The refresh token issue still persists. Thanks!
  • Marc Fiani
    06/09/2023, 9:51 AM
    In the HubSpot connector, I have noticed that the engagement -> deals associations are not updated using incremental sync. Has anyone faced this issue, and how have you solved it? 🐛
  • Gaëtan Podevijn
    06/09/2023, 1:54 PM
    Hi. I upgraded to Airbyte 0.50.1 because I’m interested in the schema evolution propagation feature. It is advertised as follows:
    The Airbyte platform relies on the existing Airbyte protocol primitives to implement schema propagation: the same DiscoverSchema operation that is being run when a user sets up a new connection is also being run automatically before sync. The platform then compares the newly fetched schema with the one that is currently stored from replication.
    So I would expect that before each sync is triggered, Airbyte checks for schema changes and propagates them (if the option is selected, of course). I tried it with the version of destination-databricks that supports schema evolution and with a Postgres source configured with CDC. However, it seems that Airbyte does not check for schema changes before the sync, or I did something wrong. Are there any logs I should check in order to verify that schema change detection runs before a sync is triggered? Thanks!
  • Victor Babichev
    06/09/2023, 2:13 PM
    Hi all, I'm trying to connect Redshift as a source database. The UI says "All tests passed", but when I set up the source I get this error: "Discovering schema failed. Something went wrong in the connector. See the logs for more details. Internal message: java.lang.NullPointerException: null value in entry: isNullable=null. Failure type: system_error". I double-checked my settings: I tried the Redshift superuser, used the correct schema in the settings, and also tried without a schema. What could be wrong?
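    A hedged diagnostic sketch: the NullPointerException suggests a column whose nullability metadata comes back null during discovery (external tables and late-binding views are common suspects). Querying the same metadata the connector scans may narrow it down; psycopg2 speaks to Redshift, and the filter below is an assumption about where the null originates:

    import psycopg2

    # Placeholder connection details.
    conn = psycopg2.connect(
        host="<cluster>.redshift.amazonaws.com", port=5439,
        dbname="dev", user="<user>", password="<password>",
    )
    with conn.cursor() as cur:
        # Look for columns with missing nullability metadata.
        cur.execute("""
            SELECT table_schema, table_name, column_name
            FROM information_schema.columns
            WHERE is_nullable IS NULL
        """)
        print(cur.fetchall())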
  • Jose Viera
    06/09/2023, 4:38 PM
    "Select which columns you want to sync for the streams of your source connector." What Airbyte version is this?
  • Slackbot
    06/09/2023, 5:48 PM
    This message was deleted.
  • Slackbot
    06/09/2023, 5:51 PM
    This message was deleted.
  • Octavia Squidington III
    06/09/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 At 1pm PDT click here to join us on Zoom!
  • Matheus Barbosa
    06/09/2023, 10:14 PM
    When will you offer support for ClickHouse? We are having lots of problems with that connector, and of all the GA destinations (only 3), ClickHouse is the only open-source alternative.
  • Matheus Barbosa
    06/09/2023, 10:22 PM
    I'm always having this problem when syncing Google Ads to ClickHouse:
    22:20:30.400832 [error] [MainThread]:    Code: 190. DB::Exception: Elements 'ad_group.excluded_parent_asset_field_types' and 'ad_group.targeting_s__g.target_restrictions' of Nested data structure 'ad_group' (Array columns) have different array sizes. (SIZES_OF_ARRAYS_DOESNT_MATCH)
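    For reference, error 190 is ClickHouse enforcing that sibling arrays of a Nested structure (here the ad_group.* columns) have equal lengths when they are unnested together. A hedged reproduction using clickhouse-driver against a local server:

    from clickhouse_driver import Client

    client = Client("localhost")  # assumes a local ClickHouse server

    # Works: both arrays have two elements.
    print(client.execute(
        "SELECT a, b FROM (SELECT [1, 2] AS a, [10, 20] AS b) ARRAY JOIN a, b"
    ))

    # Raises DB::Exception code 190 (SIZES_OF_ARRAYS_DOESNT_MATCH):
    # the arrays being unnested together differ in length.
    client.execute(
        "SELECT a, b FROM (SELECT [1, 2] AS a, [10] AS b) ARRAY JOIN a, b"
    )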
  • kigland
    06/09/2023, 11:29 PM
    Hello! If I choose "Propagate column changes only", does Airbyte automatically detect the new column, reset the stream, and reload it? Or does it just add the column, with only future data populated?
  • kigland
    06/10/2023, 12:04 AM
    Hello! After updating Airbyte to 0.50.1, I set a failure Slack notification on the connection, but it doesn't send the Slack notification. Please check whether this is a bug.
  • Vikas Bansal
    06/10/2023, 1:01 PM
    Hey, I have a question about this Airbyte API if anyone can help: https://reference.airbyte.com/reference/initiateoauth. What is "name" in the request params? I've tried different values but it doesn't work.
  • Wisnu Jinawi
    06/12/2023, 12:17 AM
    How do I get a refresh token in GitLab? Help me, everyone!
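    GitLab uses the standard OAuth2 refresh flow on POST /oauth/token; a minimal sketch with placeholder credentials from your GitLab application settings:

    import requests

    resp = requests.post(
        "https://gitlab.com/oauth/token",
        data={
            "grant_type": "refresh_token",
            "refresh_token": "<current-refresh-token>",
            "client_id": "<application-id>",
            "client_secret": "<application-secret>",
        },
    )
    resp.raise_for_status()
    print(resp.json())  # contains a new access_token and refresh_token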
  • Krutik Pathak
    06/12/2023, 5:32 AM
    Hello! I am trying to set up Salesforce as a source connection in Airbyte Open Source. I am using a developer account without a read-only user, and I followed this walkthrough for setup. I am getting the error "The REST API is not enabled for this organization." It seems like a very common error, and a resolution is mentioned in the walkthrough. I went to the System Administrator profile (as I did not create a read-only user) and saw the "API Enabled" checkbox already checked, but I still get the same error. I don't have Salesforce experience; I am just exploring Airbyte and Salesforce connectivity. Please advise if I am missing something to find the root cause of this issue. Thanks in advance!
  • Jan Vermeulen
    06/12/2023, 7:43 AM
    Hi guys, is there a way to configure an Airbyte connection to continue loading other streams if a particular stream errors? It seems the entire connection fails if one stream does.
  • Josefin Winberg
    06/12/2023, 8:36 AM
    Hi! I have an issue with the App Store connector: it is not able to sync our Subscriber, Subscriber Events, and Subscription reports, just the Sales reports. Is anyone else experiencing this issue and knows how to solve it? I have set up the access with the "Finance" role, which should be enough?
  • Faris
    06/12/2023, 9:26 AM
    Hi team! I have an issue with the Elasticsearch connector. I am trying to read data from Elasticsearch into S3. My Airbyte is deployed locally for this evaluation phase. The error seems to be about authorization, which is still somewhat unclear to me. Here is the error message:
    ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - unknown exception while pinging elasticsearch server
    Stack Trace: ElasticsearchStatusException[Elasticsearch exception [type=security_exception, reason=unable to authenticate with provided credentials and anonymous access is not allowed for this request]]
    	at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:176)
    	at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1900)
    	at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:1877)
    	at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1634)
    	at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1606)
    	at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1573)
    	at org.elasticsearch.client.RestHighLevelClient.info(RestHighLevelClient.java:774)
    	at io.airbyte.integrations.source.elasticsearch.ElasticsearchConnection.checkConnection(ElasticsearchConnection.java:101)
    	at io.airbyte.integrations.source.elasticsearch.ElasticsearchSource.check(ElasticsearchSource.java:51)
    	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:125)
    	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:100)
    	at io.airbyte.integrations.source.elasticsearch.ElasticsearchSource.main(ElasticsearchSource.java:34)
    	Suppressed: org.elasticsearch.client.ResponseException: method [GET], host [https://vsm-dev-cpu-optimized.es.eu-central-1.aws.cloud.es.io], URI [/], status line [HTTP/1.1 401 Unauthorized]
    {"error":{"root_cause":[{"type":"security_exception","reason":"unable to authenticate with provided credentials and anonymous access is not allowed for this request","additional_unsuccessful_credentials":"API key: invalid credentials","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","Bearer realm=\"security\"","ApiKey"]}}],"type":"security_exception","reason":"unable to authenticate with provided credentials and anonymous access is not allowed for this request","additional_unsuccessful_credentials":"API key: invalid credentials","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","Bearer realm=\"security\"","ApiKey"]}},"status":401}
    		at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326)
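    The 401 body above ("API key: invalid credentials") suggests verifying the credentials against the same endpoint outside Airbyte first. A hedged sketch: Elasticsearch expects Authorization: ApiKey base64(id:api_key) for API keys, or basic auth for username/password; the key id/secret below are placeholders and the host is taken from the error above.

    import base64
    import requests

    # Encode "<key-id>:<key-secret>" exactly as the ApiKey scheme requires.
    token = base64.b64encode(b"<key-id>:<key-secret>").decode()
    resp = requests.get(
        "https://vsm-dev-cpu-optimized.es.eu-central-1.aws.cloud.es.io",
        headers={"Authorization": f"ApiKey {token}"},
    )
    print(resp.status_code, resp.text[:200])  # 200 here means the key is valid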
  • George Myrianthous
    06/12/2023, 10:34 AM
    Hi team! I have set up an Airbyte (v0.44.5, deployed on Kubernetes) connection with a Mixpanel source (v0.1.34) and a BigQuery destination (v1.4.1). This morning, the connection failed due to a breaking schema change (the field "category" was removed). I wanted to ask: how can we disable/ignore breaking schema changes? In this case, I would expect the connection to keep syncing using the old schema unless I take action on it. And given that I haven't updated the connector version, I wouldn't expect to see any changes at all. Refreshing the source schema and reloading all the data from the very beginning every time this happens is not an option for us, due to the extremely high volume of data we ingest from Mixpanel. Can someone shed some light on this? 🙏
  • Juan Carbon
    06/12/2023, 2:01 PM
    Hi everyone, I am starting to play around with Airbyte and checking the connection with S3. Is there any way to partition by a date column in the data you are extracting, instead of by the date of extraction?
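    Airbyte's S3 path variables reflect sync time rather than record dates, so partitioning by a data column generally means a post-processing step. A hedged sketch with pandas + s3fs and hypothetical bucket paths and column names:

    import pandas as pd

    # Read what Airbyte landed, then rewrite it partitioned by a record-date
    # column ("order_date" is a placeholder).
    df = pd.read_parquet("s3://my-bucket/airbyte/orders/")
    for day, part in df.groupby(pd.to_datetime(df["order_date"]).dt.date):
        part.to_parquet(f"s3://my-bucket/partitioned/orders/dt={day}/data.parquet")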
  • Octavia Squidington III
    06/12/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, click here to join us on Zoom!
  • mangole
    06/12/2023, 9:08 PM
    Hey team, for the S3 destination connector: is it possible to create a new folder for each sync?
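    A hedged pointer: destination-s3 documents an "S3 Path Format" option with substitution variables, and including a per-sync value such as ${EPOCH} should yield a distinct prefix per sync; exact variable support depends on your connector version.

    # Sketch of a path-format value for the S3 destination's settings.
    s3_path_format = "${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}_${EPOCH}_"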