# ask-community-for-troubleshooting
  • Rocky Appiah

    01/13/2023, 6:47 PM
    Did something break with the updated docker-compose file?
    Copy code
    $ docker-compose up -d
    WARNING: The DEPLOYMENT_MODE variable is not set. Defaulting to a blank string.
    WARNING: The LOG_CONNECTOR_MESSAGES variable is not set. Defaulting to a blank string.
    WARNING: The SHOULD_RUN_NOTIFY_WORKFLOW variable is not set. Defaulting to a blank string.
    WARNING: The SECRET_PERSISTENCE variable is not set. Defaulting to a blank string.
    WARNING: The JOB_ERROR_REPORTING_SENTRY_DSN variable is not set. Defaulting to a blank string.
    WARNING: The APPLY_FIELD_SELECTION variable is not set. Defaulting to a blank string.
    WARNING: The FIELD_SELECTION_WORKSPACES variable is not set. Defaulting to a blank string.
    WARNING: The NEW_SCHEDULER variable is not set. Defaulting to a blank string.
    WARNING: The WORKER_ENVIRONMENT variable is not set. Defaulting to a blank string.
    WARNING: The GITHUB_STORE_BRANCH variable is not set. Defaulting to a blank string.
    WARNING: The REMOTE_CONNECTOR_CATALOG_URL variable is not set. Defaulting to a blank string.
    WARNING: The TEMPORAL_HISTORY_RETENTION_IN_DAYS variable is not set. Defaulting to a blank string.
    WARNING: The UPDATE_DEFINITIONS_CRON_ENABLED variable is not set. Defaulting to a blank string.
    ERROR: The Compose file './docker-compose.yaml' is invalid because:
    services.airbyte-connector-builder-server.depends_on contains an invalid type, it should be an array
    services.airbyte-cron.depends_on contains an invalid type, it should be an array
    services.server.depends_on contains an invalid type, it should be an array
    services.webapp.depends_on contains an invalid type, it should be an array
    services.worker.depends_on contains an invalid type, it should be an array
    services.bootloader.depends_on contains an invalid type, it should be an array
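    This usually means the local docker-compose binary is too old for the mapping form of `depends_on` (the `condition: service_healthy` style) that the updated file uses; older releases only accept the plain array form. A sketch of the usual check and fix, assuming a reasonably recent Docker install (upgrade steps vary by platform):
    Copy code
    # Show which Compose version is in use.
    docker-compose version

    # Compose v2 ships as a Docker CLI plugin and accepts the
    # condition-based depends_on syntax:
    docker compose up -d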
  • Rocky Appiah

    01/13/2023, 6:52 PM
    I see some changes to the docker-compose file; do we need to upgrade/change something to get the latest version working? Changes here
  • Mateusz Kijewski

    01/13/2023, 7:09 PM
    TIL: If you want to store logs on S3 or GCS, don't you dare set `logs.minio.enabled` to `false`
  • Vitória de Barros Carvalho

    01/13/2023, 7:41 PM
    Hello! I really need help. I'm trying to build a table off of data from a Facebook Ads account (I'm building a custom insight to get the data) and I'm having trouble finding these three columns: age, gender, and location. Please, please, please, if anyone knows how I can get this information I'd really appreciate the help.
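    If it helps: in the Facebook Ads Insights API, age, gender, and location come from breakdowns rather than fields, and the Facebook Marketing source's custom insights accept a list of breakdowns. A sketch of the relevant part of a custom insight (the name is arbitrary, and `region` is the closest breakdown to "location"):
    Copy code
    {
      "name": "demographics_insight",
      "fields": ["impressions", "spend"],
      "breakdowns": ["age", "gender", "region"]
    }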
  • Vitória de Barros Carvalho

    01/13/2023, 7:42 PM
    my custom insights look like this right now:
  • Mikhail Masyagin

    01/14/2023, 6:54 AM
    Hey friends! Is it possible to set Tags (Key-Value pairs) in S3 files for S3 destination?
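    If the connector itself can't set tags (worth checking the S3 destination docs), a post-sync pass with boto3 can tag whatever was written. A sketch; the bucket, prefix, and tag values are placeholders:
    Copy code
    import boto3

    s3 = boto3.client("s3")
    bucket = "my-airbyte-bucket"  # placeholder
    prefix = "airbyte/"           # placeholder: the destination's output path

    # Walk the synced objects and apply a tag set to each one.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3.put_object_tagging(
                Bucket=bucket,
                Key=obj["Key"],
                Tagging={"TagSet": [{"Key": "source", "Value": "airbyte"}]},
            )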
  • Rendy B. Junior

    01/14/2023, 6:56 AM
    Hi all, I have a source which connects to an API where an HTTP 403 response is expected (we'll never know unless we check). How do I handle / ignore this specific response code 403? I read the code; one of the ways is to override `HttpStream._send`, copy-paste the content, and add another else block after the should_retry check. Context: Python source connector.
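    For what it's worth, the CDK exposes hooks that avoid copying `_send`: `raise_on_http_errors`, `should_retry`, and `parse_response` are all overridable on `HttpStream`. A rough, untested sketch (stream name, URL, and endpoint are hypothetical) that treats a 403 as "no data" instead of an error:
    Copy code
    from typing import Any, Iterable, Mapping, Optional

    import requests
    from airbyte_cdk.sources.streams.http import HttpStream


    class MyStream(HttpStream):
        url_base = "https://api.example.com/"  # hypothetical
        primary_key = "id"

        # Stop the CDK from raising on non-2xx; note this applies to ALL
        # statuses, so anything unexpected should be handled below.
        raise_on_http_errors = False

        def should_retry(self, response: requests.Response) -> bool:
            if response.status_code == 403:
                return False  # expected: don't retry, don't fail
            return super().should_retry(response)

        def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
            if response.status_code == 403:
                self.logger.info("Skipping expected 403 from %s", response.url)
                return  # yield nothing for this request
            yield from response.json()

        def path(self, **kwargs) -> str:
            return "items"  # hypothetical endpoint

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # no pagination in this sketch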
  • Maksim Miceta

    01/14/2023, 10:00 AM
    I have a problem deploying Airbyte on a MacBook Air M1. Can anyone help? It has spent a whole day doing this:
  • Rocky Appiah

    01/14/2023, 4:39 PM
    Running Airbyte on an AWS EC2 instance (arm64), getting this error during normalization to Snowflake:
    Copy code
    2023-01-14 16:34:30 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/0/normalize --log-driver none --name normalization-snowflake-normalize-1-0-frzai --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.40.28 airbyte/normalization-snowflake:0.2.25 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
    2023-01-14 16:34:33 normalization > [FATAL tini (6)] exec /airbyte/entrypoint.sh failed: Exec format error
    2023-01-14 16:34:33 INFO i.a.w.n.DefaultNormalizationRunner(close):189 - Terminating normalization process...
    2023-01-14 16:34:33 ERROR i.a.w.g.DefaultNormalizationWorker(run):83 - Normalization failed for job 1.
    io.airbyte.workers.exception.WorkerException: Normalization process did not terminate normally (exit code: 1)
    	at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:200) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:81) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:34) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.28.jar:?]
    source, destination, and normalization versions are:
    Copy code
    airbyte/source-postgres            1.0.34    c4194c9f52c9   4 weeks ago    640MB
    airbyte/destination-snowflake      0.4.42    a1e9484f57f2   41 hours ago   1.41GB
    airbyte/normalization-snowflake    0.2.25    7ddea81b7155   5 weeks ago    783MB
    Running Airbyte version 0.40.28
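    "Exec format error" from tini typically means the container image was built for a different CPU architecture than the host, i.e. an amd64-only normalization image on this arm64 instance. A quick way to check, using the image tag from the log above:
    Copy code
    # Print the platform the image was built for (standard docker CLI).
    docker image inspect airbyte/normalization-snowflake:0.2.25 \
      --format '{{.Os}}/{{.Architecture}}'
    # linux/amd64 on an arm64 host would explain the exec format error.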
  • Deepak Kumar

    01/14/2023, 5:16 PM
    Hi, I hope you are also fine. I am a fresher. I got $500 for signing up; will @airbyte take that back from me? Will my bank card details be safe?
  • ihsan islam

    01/14/2023, 9:51 PM
    Hello, I am new here! I signed up on the Airbyte site after researching the Zoho CRM connector for Apache Superset. However, when I log in I do not see the Zoho CRM option as a source data point. Could someone guide me to the correct place? Am I missing something, or is it a marketing error?
  • Jordan Fox

    01/14/2023, 10:53 PM
    Just putting the question out there - is anyone using or working on Databricks SQL as a source connector yet? Is anyone else interested in Databricks SQL as a source?
  • Meir Ifrach

    01/15/2023, 12:30 PM
    Hi, I don't have Kafka as a destination when I try to create a connection with Stripe. This is the first connection I'm trying to run... am I missing something? I see in the docs that you do support Kafka as a destination.
  • Leo Schick

    01/15/2023, 3:26 PM
    Hey guys, I am searching for someone who could develop a source connector for a simple web API. Just need a quote in the first run. Which is the right channel here to post this in?
  • José Lúcio Zancan Júnior

    01/15/2023, 11:25 PM
    Does someone have a guide or a tip about upgrading the Airbyte deployment via Helm? I found only the guide for the K8s deployment via source code (the `kube/overlays/stable` method). Do I just need to change the `version` parameter in my values.yaml and do a helm upgrade? Or is there something else?
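    For reference, the usual Helm flow is just that: bump the version in values.yaml and upgrade the release in place. A sketch, assuming the chart came from Airbyte's Helm repo and the release is named `airbyte` (adjust names and namespace to your install):
    Copy code
    helm repo update
    helm upgrade airbyte airbyte/airbyte \
      --namespace airbyte \
      --values values.yaml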
  • Omar Abdullahi Ahmed

    01/16/2023, 8:57 AM
    Hey everyone, I was wondering if any of you have experience with connecting Zoho Books with Airbyte? Our organization is heavily using Airbyte, and I was wondering if there are any pre-built connectors available or if there is a workaround to get the data into Airbyte. Any insights or suggestions would be greatly appreciated. Thanks!
  • Ishan Anilbhai Koradiya

    01/16/2023, 10:16 AM
    Hi everyone, I am trying to do a manual sync in Airbyte using the open-source API; however, the data doesn't sync. Is there something else we need to do for this?
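    For reference, a manual sync through the open-source config API is usually a single POST; a sketch, where the host/port and the connection UUID are placeholders:
    Copy code
    curl -X POST http://localhost:8000/api/v1/connections/sync \
      -H "Content-Type: application/json" \
      -d '{"connectionId": "00000000-0000-0000-0000-000000000000"}'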
  • Jordan Fox

    01/16/2023, 10:33 AM
    Curious on thoughts: we have a new Oracle production database to sync and I'm deciding how to solve the problem. We can't turn on transaction logs/CDC. The transaction tables are large. They have no create date or last-modified date. They have a production date, but users can go back to any production date and modify rows at any time. There is no primary key (it's an aggregate of 6 keys). Do I modify the Oracle connector to support Incremental Lag Append (pull last X days, append to destination), or do I just suck it up and write a custom Python script? I foresee quite a few other sources, particularly from old Oracle DB applications, being like this. I'd set up Debezium, but it's a vendor-hosted app and we aren't allowed to.
  • Manish Tomar

    01/16/2023, 11:00 AM
    Can the `connectionTimeout` parameter for the Hikari JDBC connection be changed by setting the JDBC URL params in the UI of the Postgres source connector? https://github.com/airbytehq/airbyte/pull/15226 If yes, how can I check whether it successfully overrides the default value `connectionTimeout = 10000ms` or not?
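    One caveat worth noting (hedged, since these are easy to mix up): Hikari's `connectionTimeout` is a pool-level setting, while the JDBC URL params feed the Postgres driver, whose closest URL-level equivalents are `connectTimeout` and `socketTimeout` (in seconds). A hypothetical value for the "JDBC URL Params" field:
    Copy code
    connectTimeout=60&socketTimeout=120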
  • Miguel Ángel Torres Font - Valencia C.F.

    01/16/2023, 1:59 PM
    Hello all! This morning I upgraded the Airbyte version and one of the connections seems to have disappeared. It still appears in the list, but when I click on it, it tells me that an error has occurred. I have replicated the connection and it works perfectly, which means there was no connectivity failure. What I am looking for now is a way to remove this connection from the list, as I can't get rid of it from the GUI, and I would rather not remove it by force at the database level. Any ideas?
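    If the GUI won't let go of it, the config API usually can, which avoids touching the database directly. A sketch; the host/port and the connection UUID are placeholders:
    Copy code
    curl -X POST http://localhost:8000/api/v1/connections/delete \
      -H "Content-Type: application/json" \
      -d '{"connectionId": "00000000-0000-0000-0000-000000000000"}'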
  • Miguel Ángel Torres Font - Valencia C.F.

    01/16/2023, 2:00 PM
    Feel free to contact me if more information is needed
  • Assaf Pinhasi

    01/16/2023, 2:25 PM
    Incremental sync and initial data in the destination table:
    • I have a huge table in BigQuery which I synced using Google tools (one-off).
    • Currently, the maximal value of the column `last_update_dt` is X.
    • Now I want to define an incremental sync for that table, based on the `last_update_dt` column.
    • I want the first sync that runs to start from records where `last_update_dt > X`, i.e. I want the incremental sync to pick up from the current state of the destination table (vs. from scratch).
    Is that possible? [edit] Is it possible to do something like set the connection state in the Postgres db to point to the latest value of the column?
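    On recent versions the config API exposes a state endpoint for exactly this kind of seeding. A sketch, with the caveats that the UUID is a placeholder and the exact `connectionState` schema varies across Airbyte versions, so it's worth confirming against the config API reference first:
    Copy code
    curl -X POST http://localhost:8000/api/v1/state/create_or_update \
      -H "Content-Type: application/json" \
      -d '{
            "connectionId": "00000000-0000-0000-0000-000000000000",
            "connectionState": {
              "stateType": "legacy",
              "state": {"last_update_dt": "X"}
            }
          }'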
  • Lucas Gonthier

    01/16/2023, 2:54 PM
    Hi team, I have a use case where I have a connection for each of our customers. I would like to know if it's okay to create a workspace for each one of them (we only use the API, not the UI, unless we need to debug). In that case there will be 1 connection and 1 workspace per customer. Is that scalable? Are there any problems with doing this? Could all the connections run at the same time?
  • Vitória de Barros Carvalho

    01/16/2023, 6:01 PM
    Hi team! I need help finding three columns of custom insights on the Facebook Marketing source. They are: website leads, on-Facebook leads, and website checkouts initiated. Does anyone have any idea how I could find those?
  • Philip Johnson

    01/16/2023, 6:59 PM
    Hey all! Getting the following error on a custom low-code connector when trying to add it as a source: `ISO8601Error("Unable to parse duration string '1d'")`, which is strange. Previously, in the tutorial, this was the value shown as an example, but now it looks like it's `P1D` for both step and cursor granularity. If I try to use those locally, I get a different error: `Something went wrong in the connector. See the logs for more details...line 190, in _parse_timedelta\n assert parts is not None\nAssertionError\n`. Can anyone provide a bit of guidance on what the step and cursor_granularity should actually be here? Coming from the stream slicer section in the YAML file:
    Copy code
    stream_slicer:
          type: "DatetimeStreamSlicer"
          start_datetime:
            datetime: "{{ config['start_date'] }}"
            datetime_format: "%Y-%m-%d"
          end_datetime:
            datetime: "{{ now_utc() }}"
            datetime_format: "%Y-%m-%d %H:%M:%S.%f+00:00"
          step: "1d"
          datetime_format: "%Y-%m-%d"
          cursor_field: "{{ options['stream_cursor_field'] }}"
          cursor_granularity: "day"
        $options:
          name: "austin_permits"
          primary_key: "permit_number"
          path: "/3syk-w9eu.json?$where=issue_date>='{{config['start_date'] or 'latest'}}'"
          cursor_field: "issue_date"
  • Apoorva

    01/16/2023, 8:18 PM
    Hi
  • Apoorva

    01/16/2023, 8:18 PM
    I am trying to deploy Airbyte on an EC2 instance using Docker
  • Apoorva

    01/16/2023, 8:18 PM
    but when I run docker-compose up
  • Apoorva

    01/16/2023, 8:19 PM
    Airbyte started installing in a Docker container
  • Apoorva

    01/16/2023, 8:19 PM
    the issue is that the deployment is still taking place and airbyte-temporal is still running/installing on Docker