# ask-ai
  • Fabrizio Spini

    12/04/2025, 8:36 AM
    @kapa.ai is it possible to retrieve all the connection states from the internal Postgres?
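    For reference, on self-managed OSS the connection records live in Airbyte's internal config database. A minimal sketch, assuming a recent OSS schema where the `connection` table carries a `status` column (host, credentials and database name below are placeholders; if you mean the per-sync state blobs instead, those live in the `state` table):

    ```python
    # Hedged sketch: list every connection and its status from Airbyte's internal
    # Postgres. Table/column names match recent OSS releases, but verify your schema.
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="airbyte", user="airbyte", password="***")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, name, status FROM connection")
        for connection_id, name, status in cur.fetchall():
            print(connection_id, name, status)  # status: active / inactive / deprecated
    ```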
  • Renu Fulmali

    12/04/2025, 9:57 AM
    @kapa.ai I am getting this error while trying to connect to the iex/sfmc endpoints:
    Could not connect to the server with provided configuration. com.jcraft.jsch.JSchAlgoNegoFailException: Algorithm negotiation fail: algorithmName="server_host_key" jschProposal="ssh-ed25519,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,rsa-sha2-512,rsa-sha2-256" serverProposal="ssh-rsa"
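    The serverProposal in that error shows the server only offers the legacy ssh-rsa host key algorithm, while the connector's JSch client proposes only newer ones, so negotiation can never succeed. A quick way to confirm from any machine, using paramiko rather than the connector's JSch stack (host/port are placeholders):

    ```python
    # Diagnostic sketch: connect and report which host key type the server presents.
    import paramiko

    transport = paramiko.Transport(("sftp.example.com", 22))  # placeholder endpoint
    transport.start_client()  # raises SSHException if no algorithm overlaps
    print(transport.get_remote_server_key().get_name())  # expect 'ssh-rsa' here
    transport.close()
    ```

    The durable fix is usually enabling rsa-sha2 or a modern host key on the server; re-enabling ssh-rsa on the client side is a stopgap.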
  • Ahmed Ebrahim

    12/04/2025, 10:18 AM
    What is the latest Airbyte Helm chart v2 version?
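    One way to look this up locally, assuming the chart-v2 repository alias and URL from Airbyte's docs (adjust if your setup differs):

    ```python
    # Sketch: query Helm for the available chart v2 versions via subprocess.
    import subprocess

    subprocess.run(["helm", "repo", "add", "airbyte-v2", "https://airbytehq.github.io/charts"], check=True)
    subprocess.run(["helm", "repo", "update"], check=True)
    subprocess.run(["helm", "search", "repo", "airbyte-v2/airbyte", "--versions"], check=True)
    ```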
  • A S Yamini

    12/04/2025, 1:52 PM
    Configuration check failed
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 121, in _check_parse_record
        record = next(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/file_based/file_types/csv_parser.py", line 246, in parse_records
        for row in data_generator:
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/file_based/file_types/csv_parser.py", line 68, in read_data
        with stream_reader.open_file(file, file_read_mode, config_format.encoding, logger) as fp:
      File "/airbyte/integration_code/source_sftp_bulk/stream_reader.py", line 92, in open_file
        remote_file = self.sftp_client.sftp_connection.open(file.uri, mode=mode.value)
      File "/usr/local/lib/python3.11/site-packages/paramiko/sftp_client.py", line 372, in open
        t, msg = self._request(CMD_OPEN, filename, imode, attrblock)
      File "/usr/local/lib/python3.11/site-packages/paramiko/sftp_client.py", line 857, in _request
        return self._read_response(num)
      File "/usr/local/lib/python3.11/site-packages/paramiko/sftp_client.py", line 909, in _read_response
        self._convert_status(msg)
      File "/usr/local/lib/python3.11/site-packages/paramiko/sftp_client.py", line 940, in _convert_status
        raise IOError(errno.EACCES, text)
    PermissionError: [Errno 13] Permission denied

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 81, in check_availability_and_parsability
        self._check_parse_record(stream, file, logger)
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 136, in _check_parse_record
        raise CheckAvailabilityError(
    airbyte_cdk.sources.file_based.exceptions.CheckAvailabilityError: Error opening file. Please check the credentials provided in the config and verify that they provide permission to read files. Contact Support if you need assistance. stream=Sheet1-ChartOfAccounts file=//swapfile
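    The stream's glob evidently matched `//swapfile`, which the SFTP user cannot read; narrowing the glob so it only matches the intended files is the usual fix. A small sketch to reproduce the permission check outside Airbyte (host, credentials and path are placeholders):

    ```python
    # Sketch: verify the configured SFTP user can open the files the glob matches.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("sftp.example.com", username="user", password="***")  # placeholders
    sftp = client.open_sftp()
    try:
        with sftp.open("//swapfile") as fp:  # the path from the error above
            fp.read(64)
    except PermissionError as exc:  # paramiko surfaces EACCES as errno 13
        print("not readable:", exc)
    ```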
  • Mogileeswar

    12/04/2025, 5:00 PM
    The Airbyte server pod is giving a 404 (object not found) for the root route /. It should serve the same as the webapp pod, right, on the latest Helm chart v2?
  • Mogileeswar

    12/04/2025, 5:07 PM
    The Airbyte server pod is giving a 404 (object not found) for the root route /; it should serve the same as the webapp pod, right, on the latest Helm chart v2? I have the latest ALB ingress, which routes the root path (and the connector builder) to the server. I also left webapp enabled at its default value (false), but the webapp is still getting a pod.
  • Rafael Felipe

    12/04/2025, 6:45 PM
    How do I fill out the connection string for MongoDB? I have this info below:
    mongodb://gl_reader:gFHAC8myXt7c4e96xsLuEk@10.25.20.243:27017 | mongodb://<user>:<pass>@<list of hosts>/?replicaSet=ConnectDevReplica&authSource=admin&readPreference=secondaryPreferred
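    For comparison, the pieces above assembled into one well-formed URI look like this (the password is a placeholder; host, replica set and options come from the message):

    ```python
    # Sketch: the assembled MongoDB URI, verified with pymongo's ping command.
    from pymongo import MongoClient

    uri = (
        "mongodb://gl_reader:<password>@10.25.20.243:27017/"
        "?replicaSet=ConnectDevReplica&authSource=admin&readPreference=secondaryPreferred"
    )
    client = MongoClient(uri)
    print(client.admin.command("ping"))  # {'ok': 1.0} if the connection works
    ```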
  • Zach Schmid

    12/04/2025, 7:24 PM
    I have 3 different API endpoints I need to query for a particular record. Any incremental update on endpoint 1 should trigger a query to endpoint 2, and any update to endpoint 2 should trigger a query to endpoint 3. What is the best way to handle this in Airbyte? Should this be a transformation inside a single stream, or should these be handled as individual streams? If we go the substream/dependent-stream route, is there a way to prevent the substreams from resyncing all data every time, and only make requests based on the parent stream's updated records?
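    Outside Airbyte, the pattern being described is a cursor-driven fan-out; a rough sketch (URLs, params and field names are invented) that shows why keying child requests off the parent's changed records avoids full resyncs:

    ```python
    # Sketch of chained incremental reads: only records changed in endpoint 1 since
    # the saved cursor trigger lookups in endpoint 2, and only those hit endpoint 3.
    import requests

    BASE = "https://api.example.com"  # placeholder

    def changed_since(path: str, cursor: str) -> list[dict]:
        resp = requests.get(f"{BASE}/{path}", params={"updated_since": cursor})
        resp.raise_for_status()
        return resp.json()["records"]

    cursor = "2025-12-01T00:00:00Z"  # persisted from the previous sync
    for rec1 in changed_since("endpoint1", cursor):
        for rec2 in changed_since(f"endpoint2/{rec1['id']}", cursor):
            rec3 = requests.get(f"{BASE}/endpoint3/{rec2['id']}").json()
            print(rec3)  # emit downstream
    ```

    In Airbyte terms this maps to incremental parent streams with substreams sliced off the parent's records; whether the children re-fetch everything depends on the connector supporting incremental parent slices.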
  • Tarush Khatri

    12/05/2025, 12:01 AM
    @kapa.ai why can't I see the requests for errors in the current version of Airbyte Cloud? How can I check what calls Airbyte is making? This is when I'm creating a new custom API connection.
  • Sean Stach

    12/05/2025, 3:00 AM
    Getting an error like this:
    WARNING Encountered an issue deploying Airbyte:
      Pod: airbyte-db-{an id}
      Reason: BackOff
      Message: Back-off restarting failed container airbyte-db-container in pod airbyte-db-0_airbyte-abctl(an id)
      Count: 923
      Logs:
        chown: /var/lib/postgresql/data/pgdata: Operation not permitted
        chmod: /var/lib/postgresql/data/pgdata: Operation not permitted
        PostgreSQL Database directory appears to contain a database; Skipping initialization
        date UTC [16] FATAL: data directory "/var/lib/postgresql/data/pgdata" has wrong ownership
        date UTC [16] HINT: The server must be started by the user that owns the data directory.
  • Gongkui Peng

    12/05/2025, 3:46 AM
    @kapa.ai what's the minimal Airbyte version required to use destination-snowflake 4.x?
  • H1ROME

    12/05/2025, 4:46 AM
    @kapa.ai I understand that it's currently difficult to operate Airbyte correctly in a proxy environment. If I want to build Airbyte in a zone that cannot directly access the internet, what are the steps I would need to follow for deployment? Are there any alternative solutions, such as setting up a local repository by building our own server as a mirror server, and having Airbyte reference that?
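    Not official procedure, but the usual air-gapped approach is exactly what's described: mirror every required image into a registry the cluster can reach, then point the chart's image settings at that mirror. A hypothetical sketch (registry host and image list are placeholders; the real list can be extracted from the chart with `helm template`):

    ```python
    # Sketch: copy the images an Airbyte release needs into a private registry.
    import subprocess

    PRIVATE_REGISTRY = "registry.internal.example.com"  # placeholder
    IMAGES = ["airbyte/server:1.5.0", "airbyte/worker:1.5.0"]  # illustrative, not complete

    for image in IMAGES:
        subprocess.run(["docker", "pull", image], check=True)   # run on a connected host
        target = f"{PRIVATE_REGISTRY}/{image}"
        subprocess.run(["docker", "tag", image, target], check=True)
        subprocess.run(["docker", "push", target], check=True)  # push into the mirror
    ```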
  • Sean Stach

    12/05/2025, 4:50 AM
    Dec 5, 2025, 10:32 AM | 3.52 MB | 6,044 records extracted | no records loaded | Job id: 6322 | 5h 17m 12s
    I have this S3 source that takes 5 hours yet loads no files. I think it's re-reading files it's already seen?
  • A S Yamini

    12/05/2025, 5:48 AM
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Converted and uploaded sheet 'EquityChanges' from /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to /home/dtmsbi_test/BK_techouse/source_csv/Expanded_Financial_Reporting_Dataset_Client5.7_EquityChanges.csv  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Converted and uploaded sheet 'ForexTransactions' from /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to /home/dtmsbi_test/BK_techouse/source_csv/Expanded_Financial_Reporting_Dataset_Client5.7_ForexTransactions.csv  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Converted and uploaded sheet 'BalanceSheet' from /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to /home/dtmsbi_test/BK_techouse/source_csv/Expanded_Financial_Reporting_Dataset_Client5.7_BalanceSheet.csv  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Converted and uploaded sheet 'ProfitAndLoss' from /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to /home/dtmsbi_test/BK_techouse/source_csv/Expanded_Financial_Reporting_Dataset_Client5.7_ProfitAndLoss.csv  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Converted and uploaded sheet 'NewCashFlow' from /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to /home/dtmsbi_test/BK_techouse/source_csv/Expanded_Financial_Reporting_Dataset_Client5.7_NewCashFlow.csv  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error inserting file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error updating file processing record: current transaction is aborted, commands ignored until end of transaction block  source=task.stdout
    [2025-12-05 11:15:19] INFO - Processed file details: [{'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_ChartOfAccounts.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_GeneralLedger.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_JournalEntries.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_JournalEntryLines.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_CashFlow.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_EquityChanges.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_ForexTransactions.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_BalanceSheet.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_ProfitAndLoss.csv'}, {'processedFileId': None, 'WorkbookID': 1, 'parentFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7.xlsx', 'processedCsvFileName': 'Expanded_Financial_Reporting_Dataset_Client5.7_NewCashFlow.csv'}]  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error storing processed file details: [Errno 2] No such file or directory: '/home/test-machine/airflow/dags/processed_file_details.json'  source=task.stdout
    [2025-12-05 11:15:19] INFO - Error moving file /home/dtmsbi_test/BK_techouse/source/Expanded_Financial_Reporting_Dataset_Client5.7.xlsx to archive: Failure  source=task.stdout
    [2025-12-05 11:15:19] INFO - [chan 0] sftp session closed.  source=paramiko.transport.sftp loc=sftp.py:169
    [2025-12-05 11:15:19] INFO - Done. Returned value was: None  source=airflow.task.operators.airflow.providers.standard.operators.python.PythonOperator loc=python.py:215
    [2025-12-05 11:15:19] INFO - Task instance in success state  source=task.stdout
    [2025-12-05 11:15:19] INFO - Previous state of the Task instance: TaskInstanceState.RUNNING  source=task.stdout
    [2025-12-05 11:15:19] INFO - Task operator: <Task(PythonOperator): file_processing_workflow>  source=task.stdout
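    Side note on the repeated "current transaction is aborted" lines: that is Postgres refusing further commands after an earlier statement in the same transaction failed; the session needs a rollback before anything else will run. A minimal sketch of the fix in the script doing those inserts (connection string and table are placeholders):

    ```python
    # Sketch: roll back after a failed statement so later inserts/updates can run.
    import psycopg2

    conn = psycopg2.connect("dbname=meta host=localhost user=etl")  # placeholder
    cur = conn.cursor()
    try:
        cur.execute("INSERT INTO file_processing (file_name) VALUES (%s)", ("report.csv",))
        conn.commit()
    except psycopg2.Error:
        conn.rollback()  # clears the aborted-transaction state for subsequent commands
    ```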
  • Dmytro Shamenko

    12/05/2025, 8:57 AM
    Hey @kapa.ai, which services are safe to run on spot instances? My replication jobs running on spot instances received SIGTERM (exit code 143) and died.
  • Renu Fulmali

    12/05/2025, 11:38 AM
    @kapa.ai Algorithm negotiation fail: algorithmName="server_host_key" jschProposal="ssh-ed25519,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,rsa-sha2-512,rsa-sha2-256" serverProposal="ssh-rsa"
    Can you please check if this is something we can do? Enable ssh-rsa in the connector's JVM (quickest if you manage Airbyte)
  • Pablo Martin Calvo

    12/05/2025, 2:01 PM
    @kapa.ai It looks like I can only upsert contacts, hctimelog and product with HubSpot's data activation. Shouldn't I be able to upsert tickets or deals?
  • Pablo Martin Calvo

    12/05/2025, 2:20 PM
    @kapa.ai When trying to send data to contacts in HubSpot, I'm getting an error
    Warning from destination: Invalid response with status code 409 while starting ingestion: {"status":"error","message":"Contact already exists. Existing ID: id_1","correlationId":"****","category":"CONFLICT"}
    This happens for HubSpot contacts that have more than one email, when both emails are sent as separate records for upserting. Has anybody else hit this problem? Is a fix in progress?
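    A workaround sketch while waiting on an answer: collapse outgoing records that resolve to the same existing contact before upserting, so a secondary email doesn't produce a second record that 409s (field names below are illustrative, not the connector's schema):

    ```python
    # Sketch: keep one outgoing record per HubSpot contact id to avoid CONFLICT responses.
    records = [
        {"email": "a@example.com", "contact_id": "id_1"},
        {"email": "a.alias@example.com", "contact_id": "id_1"},  # secondary email, same contact
    ]
    deduped: dict[str, dict] = {}
    for rec in records:
        deduped.setdefault(rec["contact_id"], rec)  # first record per contact wins
    print(list(deduped.values()))
    ```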
  • Lucas Segers

    12/05/2025, 2:42 PM
    @kapa.ai what is this error when trying to set up a new BigQuery data source with a GCS bucket?
    2025-12-05 14:41:31,315 [io-executor-thread-6] WARN i.a.c.e.EntitlementServiceImpl(hasEnterpriseConnectorEntitlements-Y0BDzM4):185 - Connector entitlement not available. actorDefinitionId=d2542966-8cc8-4899-9b74-413a7d9bb28e organizationId=OrganizationId(value=00000000-0000-0000-0000-000000000000)
  • Alexander Ettingshausen

    12/05/2025, 3:52 PM
    @kapa.ai I am interested in a Firebird source connector to transfer data to BigQuery and found this page https://airbyte.com/integrations/firebird. In the Airbyte UI I can't find any Firebird source connector. Could you support me with that?
  • Parry Chen

    12/05/2025, 4:42 PM
    Is there a Sage Intacct connector? I can't find it in the UI or on GitHub.
  • Júlia Lemes

    12/05/2025, 4:45 PM
    @kapa.ai I have a pipeline in Airbyte that has one stream with sync mode Full Refresh | Overwrite. What does this sync mode do behind the scenes? What commands does it run in Redshift, especially if I have the drop-cascade option enabled on the Redshift destination connector?
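    Not Airbyte's exact SQL, but the general full-refresh-overwrite shape in a warehouse is "load fresh data, then swap it in atomically"; a rough sketch under that assumption (schema/table names are placeholders, and the CASCADE mirrors what a drop-cascade option implies for dependent views):

    ```python
    # Illustrative overwrite pattern only; check your sync logs for the statements
    # your connector version actually issues.
    import psycopg2

    OVERWRITE_SQL = """
    BEGIN;
    DROP TABLE IF EXISTS analytics.users CASCADE;  -- CASCADE also removes dependent views
    ALTER TABLE analytics.users_airbyte_tmp RENAME TO users;
    COMMIT;
    """
    with psycopg2.connect("dbname=dev host=redshift.example.com user=awsuser") as conn:
        conn.autocommit = True  # let the script-managed BEGIN/COMMIT control the txn
        conn.cursor().execute(OVERWRITE_SQL)
    ```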
  • Juan Fabrega

    12/05/2025, 7:21 PM
    @kapa.ai Is there a way to set an alert for pipelines stalling or taking longer than usual to run?
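    As far as I know the built-in notifications cover failures/successes rather than slow-running syncs, so one common approach is to poll the API for job durations yourself. A hedged sketch against the public API (token, connection id and the 2-hour threshold are placeholders):

    ```python
    # Sketch: flag syncs that have been running longer than a threshold.
    import datetime
    import requests

    resp = requests.get(
        "https://api.airbyte.com/v1/jobs",
        params={"connectionId": "<connection-id>", "status": "running"},
        headers={"Authorization": "Bearer <token>"},
    )
    for job in resp.json().get("data", []):
        started = datetime.datetime.fromisoformat(job["startTime"].replace("Z", "+00:00"))
        age = datetime.datetime.now(datetime.timezone.utc) - started
        if age > datetime.timedelta(hours=2):
            print(f"job {job['jobId']} running for {age}")  # forward to Slack/PagerDuty/etc.
    ```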
  • Mateo Graciano

    12/05/2025, 8:18 PM
    @kapa.ai what can i do if my disk has no free space left?
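    The first step is usually finding what is eating the disk; on Docker-based installs the images and volumes under /var/lib/docker are the usual culprit. A tiny sketch (paths are examples):

    ```python
    # Sketch: report free space on the mount points that matter before pruning anything.
    import shutil

    for path in ("/", "/var/lib/docker"):
        try:
            total, used, free = shutil.disk_usage(path)
            print(f"{path}: {free / 1e9:.1f} GB free of {total / 1e9:.1f} GB")
        except FileNotFoundError:
            pass  # path not present on this machine
    ```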