# ask-community-for-troubleshooting
  • caesee

    05/28/2023, 1:55 PM
    Hi everyone, I retried connecting to my local MinIO S3 and still get this error: 16:29 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):116 - Exception attempting to access the S3 bucket: Stack Trace: com.amazonaws.services.s3.model.AmazonS3Exception: 'minio-offcial'. (Service: Amazon S3; Status Code: 400; Error Code: AuthorizationHeaderMalformed; Request ID: 1763510A19105904; S3 Extended Request ID: e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855; Proxy: null), S3 Extended Request ID: e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
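An aside on this error: AuthorizationHeaderMalformed from S3-compatible servers such as MinIO usually means the region string embedded in the SigV4 Authorization header doesn't match what the server expects; the quoted value 'minio-offcial' suggests a bucket or alias name ended up in the region field. A minimal stdlib sketch of how the region lands in that header (illustrative names, not Airbyte's code):

```python
# AWS Signature V4 embeds the region in the Authorization header's
# "credential scope"; a wrong value there is exactly what S3/MinIO
# rejects as AuthorizationHeaderMalformed.
from datetime import datetime, timezone

def credential_scope(access_key: str, region: str) -> str:
    """Build the SigV4 credential-scope string used in the Authorization header."""
    date = datetime.now(timezone.utc).strftime("%Y%m%d")
    return f"{access_key}/{date}/{region}/s3/aws4_request"

# A proper region yields a well-formed scope...
ok = credential_scope("AKIAEXAMPLE", "us-east-1")
# ...while a bucket/alias name in the region slot produces the kind of
# value MinIO echoes back in the error above.
bad = credential_scope("AKIAEXAMPLE", "minio-offcial")
```

Checking the connector's region (and endpoint) fields against the MinIO server's configured region is a reasonable first step.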
  • Lenin Mishra

    05/29/2023, 7:11 AM
    Hey folks, I believe there is an issue with the Quickbooks connector. One of the fields for the payments table in the Quickbooks connector is expecting a bigint, but the API is providing a float. Can the Airbyte team prioritise this? Or can we create a PR fixing the data type in the JSON files? Here is the error log for your reference.
    2023-05-28 12:10:34 INFO i.a.w.g.DefaultNormalizationWorker(run):97 - Normalization executed in 1 minute 41 seconds for job 6366.
    2023-05-28 12:10:34 INFO i.a.w.g.DefaultNormalizationWorker(run):109 - Normalization summary: io.airbyte.config.NormalizationSummary@3ebf738b[startTime=1685275733084,endTime=1685275834419,failures=[io.airbyte.config.FailureReason@1bd36d8f[failureOrigin=normalization,failureType=system_error,internalMessage=invalid input syntax for type bigint: "2016649.0",externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@7b630e39[additionalProperties={attemptNumber=2, jobId=6366, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    18 of 337 ERROR creating incremental model _airbyte_quickbooks_4037.payments_stg........................................ [ERROR in 0.83s]
    Database Error in model payments_stg (models/generated/airbyte_incremental/quickbooks_4037/payments_stg.sql)
      invalid input syntax for type bigint: "2016649.0"
    Database Error in model payments_stg (models/generated/airbyte_incremental/quickbooks_4037/payments_stg.sql)
      invalid input syntax for type bigint: "2016649.0",retryable=<null>,timestamp=1685275834418,additionalProperties={}], io.airbyte.config.FailureReason@194851f8[failureOrigin=normalization,failureType=system_error,internalMessage=invalid input syntax for type bigint: "2016649.0",externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@5c2e4a71[additionalProperties={attemptNumber=2, jobId=6366, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    18 of 337 ERROR creating incremental model _airbyte_quickbooks_4037.payments_stg........................................ [ERROR in 0.83s]
    Database Error in model payments_stg (models/generated/airbyte_incremental/quickbooks_4037/payments_stg.sql)
      invalid input syntax for type bigint: "2016649.0"
    Database Error in model payments_stg (models/generated/airbyte_incremental/quickbooks_4037/payments_stg.sql)
      invalid input syntax for type bigint: "2016649.0",retryable=<null>,timestamp=1685275834418,additionalProperties={}]],additionalProperties={}]
    2023-05-28 12:10:34 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-05-28 12:10:34 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    2023-05-28 12:10:34 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END DEFAULT NORMALIZATION -----
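The normalization failure above is reproducible outside dbt: a strict bigint cast, like Python's int(), rejects a float-formatted string such as "2016649.0" even though the value is integral. A small sketch (Python standing in for the Postgres cast):

```python
value = "2016649.0"

# Mirrors `'2016649.0'::bigint` in Postgres: a direct cast of a
# float-formatted string fails.
try:
    int(value)
    direct_cast_ok = True
except ValueError:
    direct_cast_ok = False  # the "invalid input syntax for type bigint" case

# Going through a float first succeeds (roughly `cast(cast(x as float8) as bigint)`
# in SQL), which is the kind of fix a type change in the connector's JSON
# schema files would effect.
via_float = int(float(value))
```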
  • Ran Silberman

    05/29/2023, 7:17 AM
    Hello, I have an issue with setting up the Snowflake connector. This is similar to issue https://github.com/airbytehq/airbyte/issues/3393 (Add a feature to support Looker self-hosted instance domain), which was resolved, but it is related to Snowflake. We access Snowflake behind Satori, which means we provide a different domain name, not xxx.snowflakecomputing.com. I cannot add such a URL because a regex validation does not let me enter my domain name. Can you help in resolving this?
  • Octavia Squidington III

    05/29/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, join us on Zoom!
  • Robert McCarter

    05/29/2023, 9:18 PM
    I can't find an existing solution:
    Certificate for <mycompanyid.us-east-1.aws.snowflakecomputing.com> doesn't match any of the subject alternative names: [*.prod3.us-west-2.snowflakecomputing.com, *.us-west-2.snowflakecomputing.com, *.global.snowflakecomputing.com, *.snowflakecomputing.com, *.prod3.us-west-2.aws.snowflakecomputing.com].
    On v0.42.0 it almost looks like only us-west-2 is supported, but this can't be right. Any help would be greatly appreciated.
  • Ricardo Andrés Ibarra Bolívar

    05/30/2023, 1:06 AM
    Hi guys! Is it possible to split the data into partitions when the destination is an S3 bucket?
  • Biondi Septian S

    05/30/2023, 2:32 AM
    Hello all, I created a connection from PostgreSQL to Typesense, replicating two streams (my 2 tables in PostgreSQL) into Typesense collections. I created the 2 destination collections in Typesense, with names already in line, and created the collection schemas before the sync using the pre-defined schema method. When I ran the sync, for some reason the replication only filled 1 of the 2 collections. But when I reset and synced again manually, replicating the streams one by one in separate sessions, the collections were filled/replicated normally. Does anyone know about this issue? I'd appreciate your help. Thank you.
  • Harshit Khandelwal

    05/30/2023, 7:28 AM
    Hello all. I have created a connection between two MongoDB VMs and am trying to replicate data from the source MongoDB to the destination. However, after the sync, the documents in the destination are created with an airbyte_raw prefix, which I don't want. Can someone help me with this? Thanks.
  • Tushar Mudgal

    05/30/2023, 9:33 AM
    Hello everyone, I am facing an issue while replicating a Postgres database to S3. It fails sometimes, and the data scanned from Postgres to S3 is not stable. I am attaching images of my sync process. • If it fails in Attempt 1, it tries to fetch all the data again; the expected behaviour is to resume where it left off. • Airbyte version: 0.40.30 • Server config: 2 cores, 8 GB • Daily data scanned does not exceed 500 MB.
  • Rajat Tripathi

    05/30/2023, 11:45 AM
    Hi everyone! Great to see you all 🙂 Currently, I am trying to establish a connection between Microsoft SQL Server and BigQuery. With the 'Normalized tabular data' transformation enabled, I get the error message "exec /airbyte/entrypoint.sh failed: Exec format error", which suggests there might be an issue with the entrypoint script of the Airbyte worker. Has anyone faced this issue before?
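"Exec format error" on a container entrypoint typically means the image was built for a different CPU architecture than the host (for example an amd64-only image on an arm64 machine), rather than a bug in the script itself. A hedged stdlib check of the host side (the image side would need `docker image inspect`):

```python
import platform

# The host's CPU architecture; compare against the architecture the
# Airbyte images were built for (e.g. 'x86_64'/'amd64' vs 'arm64'/'aarch64').
host_arch = platform.machine()
print(f"host architecture: {host_arch}")
```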
  • Emrullah Ergun

    05/30/2023, 12:23 PM
    Hi everyone 😄 I am a new user of Airbyte and I am running into an issue with my first connector: I am trying to get data from MSSQL to a BigQuery table by first loading it into GCS. The sync fails with:
    Could not find image: airbyte/normalization:0.4.3
    I followed the whole installation guide, so it seems I missed a package...
  • Dale Bradman

    05/30/2023, 1:22 PM
    Hello 👋 - what's the best way to subscribe to changes in specific connectors? I.e. when the DockerImageTag gets bumped and we want to assess whether to upgrade our version as well.
  • Paulo José Ianes Bernardo Filho

    05/30/2023, 4:55 PM
    Hello guys! Does anyone have a policy template for AWS data lake ingestion?
  • Jake Kagan

    05/30/2023, 5:38 PM
    Hey, I'm trying to connect to Pardot (Salesforce) and for some reason I get an SSL error:
    airbyte-worker                    | 2023-05-30 17:34:48 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):317 - Backing off _make_request(...) for 120.0s (requests.exceptions.SSLError: HTTPSConnectionPool(host='login.salesforce.com', port=443): Max retries exceeded with url: /services/oauth2/token (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)'))))
    airbyte-worker                    | 2023-05-30 17:34:48 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):317 - Caught retryable error 'HTTPSConnectionPool(host='login.salesforce.com', port=443): Max retries exceeded with url: /services/oauth2/token (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))' after 4 tries. Waiting 120 seconds then retrying...
    Wondering if this is something on my server's end, or if I need to talk to a Salesforce dev? I have Airbyte running in an EC2 instance over HTTP; I can set up a self-signed certificate, but that won't be any good when someone else checks...
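On the SSL error above: "unable to get local issuer certificate" means the Python process inside the connector cannot build a trust chain to a root CA, usually a missing or stale CA bundle, or a proxy that re-signs TLS traffic; it is unrelated to the Airbyte UI itself being served over HTTP. A stdlib way to see which trust store a Python process is actually using (a diagnostic sketch, not a fix):

```python
import ssl

# Where this Python looks for trusted CA certificates.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)   # CA bundle file, if any
print("capath:", paths.capath)   # CA directory, if any

# The default client context verifies server certificates; the connector's
# failure happens at this verification step.
ctx = ssl.create_default_context()
print("verify mode:", ctx.verify_mode)
```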
  • Sathish

    05/30/2023, 6:08 PM
    Hello, I am using the latest Airbyte v0.41.0 (running on EC2 Docker) with Salesforce connector 2.0.12 to sync into Snowflake. I'm noticing a few random records that have not synced; I am comparing with data synced via Fivetran. The issue is with high-volume objects such as Accounts, Opportunities, Quotes and QuoteLines. There isn't a specific pattern, and this only happens on a few records. Has anyone run into this?
  • Luis Felipe Oliveira

    05/30/2023, 6:55 PM
    Hello, is it possible to register a webhook or some other means of sending notifications via HTTP to an API application when the status of a connection is updated?
  • Carolina Buckler

    05/30/2023, 8:25 PM
    Trying to set up a connection on Airbyte v0.44.5 with Netsuite and getting the following error:
    INFO i.a.w.g.DefaultCheckConnectionWorker(run):115 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@39bdbe6b[status=failed,message=HTTPError('401 Client Error: Unauthorized for url: https://333333.suitetalk.api.netsuite.com/services/rest/record/v1/contact?limit=1'),additionalProperties={}]
  • Nipuna Prashan

    05/31/2023, 3:51 AM
    How do I delete a custom connector from the UI?
  • Shailesh

    05/31/2023, 6:37 AM
    Hello, everyone. My team and I are looking at Airbyte for some data integration workflows we are building. One of our potential needs is low-latency syncing between two data stores (SLA in minutes). I understand that Airbyte Cloud can sync source to destination as often as every hour. Can the self-hosted application perform more frequent incremental syncs? I've tried using the cron scheduler with a cron expression to run every 10 minutes, but from the UI it doesn't seem that the schedule is being used. Additionally, we were also curious to know: if the source is a stream like Kafka, does a sync period still apply, or can Airbyte manage continuous syncing from Kafka to a database destination?
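One thing worth checking on the cron question: Airbyte's cron option is commonly documented as taking Quartz-style expressions, which have a leading seconds field, so a standard 5-field crontab line may look plausible but not fire as expected. A sketch of the difference (assuming Quartz syntax; verify against your Airbyte version's docs):

```text
# Standard 5-field crontab ("every 10 minutes") -- NOT Quartz:
*/10 * * * *

# Quartz-style 6-field expression for the same schedule:
0 */10 * * * ?
```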
  • Dale Bradman

    05/31/2023, 10:12 AM
    Is there an Airbyte connector that pulls Airbyte metadata / logs into the data warehouse? Or would it require connecting to the configured Airbyte database?
  • Gabriel Martelloti

    05/31/2023, 1:26 PM
    Hey guys. Is there any way for me to configure a connection to automatically change the schema and refresh it once it finds a non-breaking schema change? There are only options to ignore it or disable the connection.
  • Sophie Lohezic

    05/31/2023, 1:44 PM
    Hello! I have no issue with a connector, but I would like some feedback on the Jira source connector. We are considering using it in the next few weeks/months. Is it reliable, even in alpha? Are there known issues/limitations? Is any development planned for it to become a V1 or even GA? Many thanks 🙏
  • Praveenraaj K S

    05/31/2023, 1:56 PM
    Hello guys, we have a Postgres-to-BigQuery connection with CDC enabled. After the initial sync and a few more syncs, data is not propagated correctly to BigQuery; we got a data count mismatch on the destination side. Say the source table has 1700 rows, but the BigQuery table has only 1670 rows. Has anyone run into this?
  • Nohelia Merino

    05/31/2023, 2:20 PM
    Hi everyone, I'm getting a "Sync failed" error with no logs; the only log trace it prints is
    2023-05-31 14:06:33 - Additional Failure Information: message='io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed', type='java.lang.RuntimeException', nonRetryable=false
    This is for a connection using a custom source connector. I also reviewed the server logs, but there are no logs available for the last 3 days at all. It had been working normally until recently. Could someone help me out with this issue? cc: @Brian Castelli @Harshil Dhariwal @kapa.ai
  • Leo Qin

    05/31/2023, 3:09 PM
    @kapa.ai - how do I set up an S3 source for a path that has space characters in it?
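For paths with spaces, one hedged approach is to URL-encode the prefix (space becomes %20) before pasting it into the path field; whether the S3 source expects the raw or the encoded form is worth confirming against the connector docs. A stdlib sketch (the prefix here is hypothetical):

```python
from urllib.parse import quote

# Encode spaces (and other unsafe characters) while keeping '/' separators.
prefix = "my data/2023 exports/"   # hypothetical prefix with spaces
encoded = quote(prefix)            # spaces become %20, slashes are preserved
print(encoded)
```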
  • Jake Kagan

    05/31/2023, 4:42 PM
    Has anyone run into SSL issues? https://github.com/airbytehq/airbyte/issues/26853 I'm trying to figure out how to deal with this.