# ask-community-for-troubleshooting
  • j

    Javier S

    03/14/2023, 8:10 PM
Hi everyone! Is there a way to set the destination collation for SQL Server? My destination server uses Latin1_General_BIN, which is case sensitive. I can see in my error log that Airbyte uses lowercase identifiers by default, which causes the issue.
    Copy code
    [42S02] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Invalid object name 'DESTINATION_DB.information_schema.tables'.
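For context, the collation in play can be checked with T-SQL (a sketch; DESTINATION_DB is the database named in the error above). Under a case-sensitive *_BIN collation, lowercase references like `information_schema.tables` fail to resolve, which matches the error shown:

```sql
-- Check the server and database collations.
SELECT SERVERPROPERTY('Collation') AS server_collation;
SELECT DATABASEPROPERTYEX('DESTINATION_DB', 'Collation') AS db_collation;

-- In a case-sensitive database, the metadata views must be referenced
-- in uppercase: INFORMATION_SCHEMA.TABLES, not information_schema.tables.
SELECT * FROM DESTINATION_DB.INFORMATION_SCHEMA.TABLES;
```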
  • c

    C_Ramirez

    03/14/2023, 8:48 PM
Hi airbyte community, I am learning about Airbyte. I developed a monorepo using Dagster, Airbyte and dbt over docker compose in a GitHub repository. I already load data using Airbyte and run dbt transformations individually using Dagster. I was asked to upload the dbt transformations using the Airbyte UI, from a repository, instead of using a Dagster asset. I have three questions: 1. Is uploading transformations from a separate repo more efficient than the current setup? 2. Is there any way to load the dbt transformations using the Airbyte UI, considering that my repo has more files than just the dbt project directory? 3. Is it possible to configure docker compose to load dbt transformations from the same repo?
  • p

    Praty Sharma

    03/14/2023, 9:00 PM
hey there - I'm trying to use the open source connector for Slack Channels ETL and I'm using the SOURCES_CREATE_API to create the connection. In the connector configuration (syncCatalog), how should I be setting the default values of sourceDefinedCursor, defaultCursorField, and sourceDefinedPrimaryKey? I did not see anything relevant in the docs.
  • m

    Marco de Nijs

    03/14/2023, 9:23 PM
We are going to use Airbyte Cloud, but I'm no IT guy; our MSP needs to know what to set up so we can connect our databases/Excel files etc. to Airbyte. Would that be an SSH tunnel? I see SSH listed as an option for both the TSQL and the file sources in the documentation. Can someone ELI5 what we need to get this going? Both the files and the databases are hosted on a fileserver AFAIK.
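For background on the question above: the SSH tunnel option generally means Airbyte connects to a bastion (jump) host that the MSP exposes, and that host forwards traffic on to the internal database. Conceptually it is the same as a local port forward; the hostnames, ports, and user below are placeholders:

```shell
# Forward local port 1433 through a bastion host to an internal SQL Server.
# Airbyte's SSH tunnel option automates exactly this kind of hop.
ssh -L 1433:internal-db.local:1433 airbyte-user@bastion.example.com
```

The MSP would need to provide the bastion address, an SSH user and key for Airbyte, and firewall access from Airbyte Cloud to that bastion.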
  • j

    Jerri Comeau (Airbyte)

    03/14/2023, 9:49 PM
    Hey everyone! Quick note from CAT about our recent survey. Thanks to everyone who participated, and all of the really great feedback you shared with us. We’ll be announcing the raffle winner tomorrow during Office Hours, but if you can’t make it that’s ok, we’ll also post the results in the #C045VK5AF54 channel. Thank you for being a part of our Community!
  • m

    Marc Fiani

    03/14/2023, 9:52 PM
Hi Team, I am using the Hubspot connector with incremental deduped+history for engagements and companies. The connector does not fetch all the companies that are in the database. Has anyone experienced this? How can I fix it? Many thanks!
  • e

    Elizabeth Rodriguez

    03/14/2023, 10:58 PM
Hello! I am replicating data from MySQL to Snowflake using Airbyte 0.40.2 and CDC with the MySQL binary log. I noticed today that in Snowflake I have more rows than in MySQL in a table that is affected by a cascading delete. Does anyone know if Airbyte is able to handle cascade deletes in tables affected through a primary key relation?
  • m

    Matthew Tovbin

    03/15/2023, 5:02 AM
The link to the Javascript/Typescript CDK from this page appears to be dead... Can anyone help fix the link?
  • b

    Bismay

    03/15/2023, 5:41 AM
Hi, I am using a GKE cluster to deploy Airbyte, but after git clone I can't locate the kube directory. When I run kubectl apply -k kube/overlays/stable I get the error below:
error: must build at directory: not a valid directory: evalsymlink failure on 'kube/overlays/stable' : lstat /home/niveus/airbyte/kube: no such file or directory
  • b

    Bismay

    03/15/2023, 5:41 AM
How do I install a particular version of Airbyte with Helm?
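For reference, pinning a chart version with Helm looks like the following (a sketch; the version number is illustrative and the namespace is a placeholder):

```shell
# Add the Airbyte Helm repository (one-time)
helm repo add airbyte https://airbytehq.github.io/helm-charts
helm repo update

# List the chart versions that are available
helm search repo airbyte/airbyte --versions

# Install a specific chart version (0.44.5 here is just an example)
helm install airbyte airbyte/airbyte --version 0.44.5 \
  --namespace airbyte --create-namespace
```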
  • b

    Bismay

    03/15/2023, 5:42 AM
Can you provide me a link or commands?
  • b

    Bismay

    03/15/2023, 5:42 AM
Commands would be really helpful.
  • b

    bennis Hajar

    03/15/2023, 7:21 AM
♦️Hey, this might be a simple question, but I couldn't access the default airbyte-db Postgres. I have deployed Airbyte on k8s; can you help me please? (I have tried port forwarding and also connecting to the airbyte-db pod, which contains the Postgres DB, but I didn't find any tables.)
  • a

    Aditya Bajaj

    03/15/2023, 7:32 AM
Hello everyone, I am new to Airbyte, so can anyone help me resolve this error? It is quite frequent and is hampering the process of data ingestion. Thanks in advance!
Copy code
    Caused by: java.lang.NullPointerException: Cannot invoke "java.lang.Integer.intValue()" because the return value of "io.airbyte.workers.process.KubePortManagerSingleton.take()" is null
    	at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:98) ~[io.airbyte-airbyte-commons-worker-0.40.29.jar:?]
  • r

    Raj Talukder

    03/15/2023, 7:57 AM
Hi, I'm having an issue with configuring logging to S3 in my helm deployment on EKS. I'm not receiving any error messages; however, the bucket I set is not receiving any logs from Airbyte. I tried disabling Minio and setting the state.storage.type parameter to "S3", but nothing is being written to the bucket, and it causes the worker and server pods to go into a crash loop with the error:
Copy code
Micronaut(handleStartupException):338 - Error starting Micronaut server: null java.lang.IllegalArgumentException: null
For further context, I am using version 0.44.5 of the helm chart. For reference, here is my current configuration:
    Copy code
    state:
      ## state.storage.type Determines which state storage will be utilized.  One of "MINIO", "S3" or "GCS"
      storage:
        type: "MINIO"
    logs:
      ##  logs.accessKey.password Logs Access Key
      ##  logs.accessKey.existingSecret
      ##  logs.accessKey.existingSecretKey
      accessKey:
        password: "access_key"
        existingSecret: ""
        existingSecretKey: ""
      ##  logs.secretKey.password Logs Secret Key
      ##  logs.secretKey.existingSecret
      ##  logs.secretKey.existingSecretKey
      secretKey:
        password: "secret_key"
        existingSecret: ""
        existingSecretKey: ""
    
      ## logs.storage.type Determines which log storage  will be utilized.  One of "MINIO", "S3" or "GCS"
      ##                   Used in conjunction with logs.minio.*, logs.s3.* or logs.gcs.*
      storage:
        type: "S3"
    
      ##  logs.minio.enabled Switch to enable or disable the Minio helm chart
      minio:
        enabled: true
    
      ##  logs.externalMinio.enabled Switch to enable or disable an external Minio instance
      ##  logs.externalMinio.host External Minio Host
      ##  logs.externalMinio.port External Minio Port
      ##  logs.externalMinio.endpoint Fully qualified hostname for s3-compatible storage
      externalMinio:
        enabled: false
        host: localhost
        port: 9000
    
      ##  logs.s3.enabled Switch to enable or disable custom S3 Log location
      ##  logs.s3.bucket Bucket name where logs should be stored
      ##  logs.s3.bucketRegion Region of the bucket (must be empty if using minio)
      s3:
        enabled: true
        bucket: airbyte-logging-test
        bucketRegion: ""
    Any advice or suggestion would be greatly appreciated!
  • b

    Bismay

    03/15/2023, 8:06 AM
I am using an e2-medium with 4 GB RAM; is that sufficient to deploy Airbyte in a staging environment?
  • a

    Anatole Callies

    03/15/2023, 9:49 AM
Hi, despite my Postgres to BigQuery Airbyte sync being successful, I found by chance that there are some discrepancies between my source and my destination. In particular, some JSON columns are null on some rows in my destination while they are filled in the source, and I know they were already filled when the initial sync started. Do you know an efficient way to compare two databases, in particular when one of them is Postgres and the other one BigQuery? I was thinking of doing something like this: run a query on the source and compute a hash of the result, then run the same query on the destination and compare the hashes. But I'm quite sure there must be better solutions.
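The hash-comparison idea above can be sketched in Python: canonicalize each row, sort, and digest, so the same query run against Postgres and BigQuery yields comparable checksums. The in-memory sqlite3 databases here just stand in for the two warehouses:

```python
import hashlib
import sqlite3

def rows_checksum(rows):
    """Order-insensitive checksum of query results.

    Each row is serialized to a canonical string (None -> 'NULL'),
    the serialized rows are sorted, and the whole set is hashed,
    so two databases returning the same rows in any order produce
    the same digest.
    """
    canonical = sorted(
        "|".join("NULL" if v is None else str(v) for v in row)
        for row in rows
    )
    digest = hashlib.sha256()
    for line in canonical:
        digest.update(line.encode("utf-8"))
        digest.update(b"\n")
    return digest.hexdigest()

# Demo: two stand-in databases with identical content hash the same,
# even though the rows were inserted in different orders.
a = sqlite3.connect(":memory:")
a.execute("CREATE TABLE t (id INTEGER, payload TEXT)")
a.executemany("INSERT INTO t VALUES (?, ?)", [(1, "x"), (2, None)])

b = sqlite3.connect(":memory:")
b.execute("CREATE TABLE t (id INTEGER, payload TEXT)")
b.executemany("INSERT INTO t VALUES (?, ?)", [(2, None), (1, "x")])

checksum_a = rows_checksum(a.execute("SELECT id, payload FROM t"))
checksum_b = rows_checksum(b.execute("SELECT id, payload FROM t"))
```

In practice you would run the same SELECT on Postgres and BigQuery and compare the digests; note that type formatting (timestamps, numerics, JSON serialization) must be normalized on both sides first, or identical data will still hash differently.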
  • u

    Umar Hussain

    03/15/2023, 10:53 AM
Hello Airbyte team, I've started using airbyte-os in anger within my company on a POC basis for a few sources, namely MySQL, Zendesk & Salesforce. With regards to Zendesk: what do you recommend for the initial sync if you have petabytes of data, say 100-200? Based on the current 200mb syncs into Snowflake, I assume that this will take over 2-3 weeks 😄
  • a

    Avi Garg

    03/15/2023, 11:35 AM
Hi All, I am using Airbyte 0.42.0 and Octavia 0.41.1. When importing my Airbyte source and destination using Octavia it works fine, but while applying, it creates the source and destination correctly yet is not able to create the connection. I am getting the error below:
  • a

    Avi Garg

    03/15/2023, 11:35 AM
raise get_type_error(input_value, path_to_item, valid_classes,
airbyte_api_client.exceptions.ApiTypeError: Invalid type for variable 'non_breaking_changes_preference'. Required value type is NonBreakingChangesPreference and passed type was str at ['non_breaking_changes_preference']
  • a

    Avi Garg

    03/15/2023, 11:38 AM
Please note that the octavia 0.42.0 image is not available.
  • a

    Avi Garg

    03/15/2023, 12:00 PM
Can someone please help?
  • a

    Ashwin Kumar

    03/15/2023, 12:24 PM
Hi all, any help using S3 for state storage? We do not want to use Postgres; we want to use S3 for storing the state of the syncs. I have tried setting the EnvConfigs for S3_STATE_STORAGE_*, but the statePath in the IntegrationConfig comes back null (even when I am running incremental scans). Thank you
  • a

    Akhil Chouhan

    03/15/2023, 12:36 PM
Hi Airbyte team, as suggested in this doc https://docs.airbyte.com/deploying-airbyte/on-aws-ec2/ we have used a t2.large node with the Amazon Linux AMI in our production environment, but it fails to ingest 5 GiB of data from Postgres to ClickHouse; our EC2 instance crashes and CPU utilization reaches 86 percent. Can someone please help?
  • a

    Akhil Chouhan

    03/15/2023, 1:12 PM
Also tried the Ubuntu AMI, and it still crashes!
  • b

    Bismay

    03/15/2023, 1:20 PM
Getting the below issue while connecting to MongoDB:
• Airbyte version: 0.40.33
• OS Version / Instance: GCP n1-standard-2
• Deployment: Kubernetes engine on GCP
  • b

    Bismay

    03/15/2023, 1:20 PM
2023-03-15 120633 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/1be8c3f6-d22f-49d3-b2c4-65ab46e170bb/0/logs.log
2023-03-15 120633 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.17
2023-03-15 120633 INFO i.a.w.p.KubeProcessFactory(create):100 - Attempting to start pod = e-mongodb-v2-check-1be8c3f6-d22f-49d3-b2c4-65ab46e170bb-0-jjjzh for airbyte/source-mongodb-v2:0.1.19 with resources io.airbyte.config.ResourceRequirements@2c3e0e9e[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-03-15 120633 INFO i.a.w.p.KubeProcessFactory(create):103 - e-mongodb-v2-check-1be8c3f6-d22f-49d3-b2c4-65ab46e170bb-0-jjjzh stdoutLocalPort = 9030
2023-03-15 120633 INFO i.a.w.p.KubeProcessFactory(create):106 - e-mongodb-v2-check-1be8c3f6-d22f-49d3-b2c4-65ab46e170bb-0-jjjzh stderrLocalPort = 9031
2023-03-15 120633 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-03-15 120633 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$9):589 - Creating stdout socket server...
2023-03-15 120633 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2023-03-15 120633 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-03-15 120633 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$10):607 - Creating stderr socket server...
2023-03-15 120633 INFO i.a.w.p.KubePodProcess(<init>):520 - Creating pod e-mongodb-v2-check-1be8c3f6-d22f-49d3-b2c4-65ab46e170bb-0-jjjzh...
2023-03-15 120634 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):309 - Waiting for init container to be ready before copying files...
Has anyone faced this type of issue before?
  • s

    Srikanth Sudhindra

    03/15/2023, 1:39 PM
Hi Team, is there an easy way to select/deselect all streams within a connection? ATM I have to toggle them manually, which is a pain when the number of tables at the source is huge.
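There is no select-all toggle mentioned above, but one workaround people use is flipping the `selected` flag on every stream in the connection's syncCatalog via the Configuration API (fetch the connection, rewrite the catalog, update the connection). A minimal sketch of the catalog rewrite, using a toy catalog; the exact payload shape is an assumption to verify against your Airbyte version:

```python
def set_all_streams(sync_catalog: dict, selected: bool = True) -> dict:
    """Return a copy of a syncCatalog with every stream (de)selected.

    Assumes the catalog shape {"streams": [{"stream": {...},
    "config": {..., "selected": bool}}, ...]} used by the Airbyte
    Configuration API; verify against your deployed version.
    """
    catalog = {"streams": []}
    for entry in sync_catalog.get("streams", []):
        config = dict(entry.get("config", {}))  # copy, don't mutate input
        config["selected"] = selected
        catalog["streams"].append({
            "stream": entry.get("stream", {}),
            "config": config,
        })
    return catalog

# Toy example with two streams, one initially deselected.
example = {
    "streams": [
        {"stream": {"name": "users"},  "config": {"selected": False}},
        {"stream": {"name": "orders"}, "config": {"selected": True}},
    ]
}
all_on = set_all_streams(example, selected=True)
all_off = set_all_streams(example, selected=False)
```

The rewritten catalog would then be sent back in a connection update request; since this bypasses the UI, test it on a throwaway connection first.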
  • i

    Isaias Darci Dieterich

    03/15/2023, 2:00 PM
Hello! I am trying to create a connection destination in Google Cloud Storage (GCS), but I receive the error message below. I am using Airbyte v0.42.0 and created an HMAC key in GCS, but it doesn't work... any solution?
  • i

    Isaias Darci Dieterich

    03/15/2023, 2:01 PM
I'm trying an HMAC key from both a user and a service account, but it's not working.
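For what it's worth, HMAC keys for a service account can be created and inspected with gsutil (a sketch; the service-account address is a placeholder, and the key must end up in the ACTIVE state to be usable):

```shell
# Create an HMAC key pair for a service account
# (prints the access ID and the secret once; store the secret safely)
gsutil hmac create my-sa@my-project.iam.gserviceaccount.com

# List existing HMAC keys to confirm the key exists and is ACTIVE
gsutil hmac list
```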