# ask-community-for-troubleshooting
  • v

    Vicky Kwan

    04/21/2023, 6:05 PM
Hi team, I'm experimenting with the Snowflake object tagging feature. My team would be interested in applying the decentralized approach (under Tag Benefits there's a section on Centralized or Decentralized Management). In our model, we would set up a tag_admin role for creating tags and allow Airbyte's service role to apply them. I'm imagining Airbyte will be able to apply various tags based on the connectors. How do we achieve this when Airbyte creates a new Snowflake schema, so that the tags we want get applied to that schema? Any tips will be appreciated!
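For reference, a minimal sketch of the SQL this split usually comes down to. All names here are hypothetical (a governance.tags schema holding the tag objects, an AIRBYTE_ROLE service role, an analytics.hubspot schema — none of these come from the thread): tag_admin creates the tags, and the service role only receives APPLY rights.

```python
# Sketch: generate the Snowflake statements for decentralized tag management.
# tag_admin owns and creates the tags; the Airbyte service role can only
# APPLY them, so it can tag the schemas it creates without being able to
# create or alter tag definitions.

def grant_apply_sql(tag_fqn: str, role: str) -> str:
    """Run as tag_admin: let `role` apply (but not create/alter) the tag."""
    return f"GRANT APPLY ON TAG {tag_fqn} TO ROLE {role};"

def set_schema_tag_sql(schema_fqn: str, tag_fqn: str, value: str) -> str:
    """Run as the service role once the schema exists."""
    return f"ALTER SCHEMA {schema_fqn} SET TAG {tag_fqn} = '{value}';"

print(grant_apply_sql("governance.tags.source_system", "AIRBYTE_ROLE"))
print(set_schema_tag_sql("analytics.hubspot", "governance.tags.source_system", "hubspot"))
```

As far as I know, Airbyte itself won't run extra DDL after creating a schema, so the ALTER would have to come from a scheduled task or an external hook that watches for new schemas.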
  • e

    Ella Rohm-Ensing (Airbyte)

    04/21/2023, 6:16 PM
    @kapa.ai do I need a developer token to use the google ads connector in airbyte cloud?
  • s

    Slackbot

    04/21/2023, 6:30 PM
    This message was deleted.
  • s

    Srikanth Sudhindra

    04/21/2023, 6:34 PM
Hi all, I am seeing the following error when I hit "refresh source schema" under Connections -> Replication tab:
Internal Server Error: Cannot invoke "io.airbyte.api.model.generated.AirbyteCatalog.getStreams()" because "discovered" is null
Airbyte OSS is on EKS, deployed using Helm chart 0.45.11. Has anyone seen this issue before?
  • a

    Anjaneyulu K

    04/21/2023, 6:56 PM
'Show terse objects in "DB".__AIRBYTE__SCHEMA' is running continuously from the Airbyte snowflake-destination sync, but __AIRBYTE__SCHEMA does not exist. Could anyone help me fix this?
  • c

    Christopher Wu

    04/21/2023, 7:11 PM
    Assuming a k8s Airbyte deployment, are logs in Minio rotated automatically after a certain lifespan / size, and can that be configured easily?
  • j

    James Liebler

    04/21/2023, 7:23 PM
Hey everyone, is it possible to know the size of a sync before the job is initiated? This would be using the APIs directly.
  • r

    Ravi

    04/21/2023, 10:10 PM
Hi all, I’m trying to set up an S3 source to write to Redshift and I get this error. Is there a way to turn on debug logging to see what is really happening?
2023-04-21 21:54:21 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):313 - Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/source_files_abstract/formats/csv_parser.py", line 33, in inner
        return fn(self, file, file_info)
      File "/airbyte/integration_code/source_s3/source_files_abstract/formats/csv_parser.py", line 188, in get_inferred_schema
        schema_dict = self._get_schema_dict(file, infer_schema_process)
      File "/airbyte/integration_code/source_s3/source_files_abstract/formats/csv_parser.py", line 196, in _get_schema_dict
        return run_in_external_process(
      File "/airbyte/integration_code/source_s3/utils.py", line 40, in run_in_external_process
        raise potential_error
    pyarrow.lib.ArrowInvalid: CSV parse error: Empty CSV file or block: cannot infer number of columns
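A note on the traceback above: the last line is the real cause — pyarrow cannot infer a CSV schema from a zero-byte object. One workaround is to keep empty keys out of the source's path_pattern, or to clean them up before syncing. A rough offline sketch of the check, using plain dicts shaped like boto3's list_objects_v2 "Contents" entries (the bucket layout here is made up):

```python
# Sketch: find zero-byte objects that would break pyarrow's CSV schema
# inference ("Empty CSV file or block"). In practice the listing pages come
# from boto3's list_objects_v2; plain dicts are used here so it runs offline.

def empty_keys(contents: list[dict]) -> list[str]:
    """Return the keys of all zero-byte objects in a listing page."""
    return [obj["Key"] for obj in contents if obj.get("Size", 0) == 0]

sample_page = [
    {"Key": "exports/2023-04-21.csv", "Size": 1024},
    {"Key": "exports/_SUCCESS", "Size": 0},
]
print(empty_keys(sample_page))
```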
  • u

    UUBOY scy

    04/22/2023, 10:32 AM
After I modify airbyte-integrations/bases/base-normalization/normalization/transform_catalog/stream_processor.py, how can I build a normalization image to test my changes?
  • s

    Slackbot

    04/22/2023, 2:09 PM
    This message was deleted.
  • u

    UUBOY scy

    04/22/2023, 5:15 PM
Hi all. I’d like to replace the default normalization image with my own, but it cannot be enabled. I modified destination_definitions.yaml and ran ./gradlew :airbyte-config:init:processResources to regenerate oss_catalog.json, then restarted Airbyte via Docker Compose on my local machine. Does anyone have an idea?
  • u

    UUBOY scy

    04/23/2023, 4:09 AM
Hi all, I’d like to test a custom normalization image, so I replaced the default normalization image with my own, but it cannot be enabled. I modified destination_definitions.yaml and ran ./gradlew :airbyte-config:init:processResources to regenerate oss_catalog.json, then restarted Airbyte via Docker Compose on GCP Compute Engine (I also tried restarting the Airbyte instance), but the default airbyte/normalization image is still used. Does anyone have an idea?
  • r

    Rutger Weemhoff

    04/23/2023, 7:07 AM
Hi all! I built a custom connector using the low-code connector builder for my REST API source. This source provides a "lastupdatetimestamp" in the Europe/Amsterdam timezone but without a timezone offset, so it looks like a UTC timestamp. When using incremental sync, Airbyte stores the latest received "lastupdatetimestamp". To Airbyte this looks like a timestamp 2 hours in the future, so when I trigger a new manual sync, Airbyte decides not to do anything, because the current time seems earlier than the stored state. I cannot change the way the source API provides timestamps. Any ideas to fix this timezone mismatch?
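One workaround for the mismatch described above, if a transformation step can be added (or the cursor adjusted in a custom connector), is to reinterpret the naive Amsterdam timestamp as proper UTC before it is compared against state. A standard-library sketch (the function and field names are made up for illustration):

```python
# Sketch: treat a naive Europe/Amsterdam timestamp as local time and emit a
# real UTC ISO string, so incremental-sync state comparisons line up.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+; may need the tzdata package on some systems

def amsterdam_naive_to_utc(raw: str) -> str:
    naive = datetime.fromisoformat(raw)
    local = naive.replace(tzinfo=ZoneInfo("Europe/Amsterdam"))
    return local.astimezone(timezone.utc).isoformat()

# In April, Amsterdam is on CEST (UTC+2), so 07:07 local is 05:07 UTC.
print(amsterdam_naive_to_utc("2023-04-23T07:07:00"))
```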
  • y

    yuan sun

    04/23/2023, 9:03 AM
Hi all! I added a lot of sources during testing, but now I want to delete them. What should I do?
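The Config API can do this in bulk: list the sources in a workspace with POST /api/v1/sources/list, then POST /api/v1/sources/delete for each one. A sketch of the request shapes (base URL and IDs would be your own; the payloads are built as pure functions so they can be checked without a running instance):

```python
# Sketch: payloads for bulk-deleting sources through Airbyte's Config API.
# Send them with e.g. requests.post(f"{base_url}{path}", json=payload).

def list_sources_request(workspace_id: str) -> tuple[str, dict]:
    return "/api/v1/sources/list", {"workspaceId": workspace_id}

def delete_source_request(source_id: str) -> tuple[str, dict]:
    return "/api/v1/sources/delete", {"sourceId": source_id}

path, payload = delete_source_request("11111111-2222-3333-4444-555555555555")
print(path, payload)
```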
  • r

    Rytis Zolubas

    04/23/2023, 11:03 AM
Hello, I have 10 workspaces with the same connections. How can I effectively manage changes (e.g. schema/replication) across all workspaces? What would be the best practice? For example, I have a Facebook -> Snowflake connection and I want to add an additional stream to all Facebook -> Snowflake connections in 10 different workspaces. Thanks!
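There is no built-in cross-workspace propagation, so the usual approach is to script the change against the API (or keep connection definitions in version control and apply them per workspace). A sketch of the selection step, using simplified dicts shaped roughly like the API's syncCatalog (the stream names are made up; real catalogs come from the connections list/update endpoints):

```python
# Sketch: enable one extra stream in a connection catalog, to be applied in a
# loop over every workspace. Simplified dicts stand in for the real catalog.

def enable_stream(catalog: dict, stream_name: str) -> dict:
    """Mark the named stream as selected for sync, if present."""
    for entry in catalog["streams"]:
        if entry["stream"]["name"] == stream_name:
            entry["config"]["selected"] = True
    return catalog

catalog = {"streams": [
    {"stream": {"name": "ads_insights"}, "config": {"selected": False}},
    {"stream": {"name": "campaigns"}, "config": {"selected": True}},
]}
enable_stream(catalog, "ads_insights")
print(catalog)
```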
  • r

    Ryan Taylor

    04/23/2023, 2:47 PM
Sync is failing and the ERROR in the logs is "2023-04-23 14:38:40 ERROR i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):216 - Sync output exceeds the max temporal message size of 2097152, actual is 2537771." What does this mean and how can I resolve it?
  • u

    UUBOY scy

    04/24/2023, 4:58 AM
Hi all, I am using Docker Compose to run Airbyte on GCE. How can I configure a global timezone for every container, including newly created ones?
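One low-tech option is to set the TZ environment variable per service in docker-compose.yaml (most Debian/Alpine-based images respect it). Note this is a sketch under that assumption, and containers that Airbyte spawns for jobs will not automatically inherit it:

```yaml
# Sketch (docker-compose.yaml): set the timezone per service via TZ.
services:
  server:
    environment:
      - TZ=Asia/Taipei   # example zone; substitute your own
  worker:
    environment:
      - TZ=Asia/Taipei
```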
  • k

    King Ho

    04/24/2023, 6:54 AM
Hi Airbyte-ers! We are hitting general normalization errors pushing data from HubSpot into BigQuery. Our raw tables are fine, but as soon as normalization builds the BigQuery tables, some rows throw errors, usually on struct fields. Meanwhile the exact same pipeline on Airbyte Cloud completes with no errors. Is there some resource limitation or similar that we need to set up for the workers on our OSS version to get the pipeline to complete, or for a more stable experience?
  • a

    Asutosh Nayak

    04/24/2023, 8:06 AM
When I send a request to the Airbyte API to create a source, I get the error below. HTTPConnectionPool(host='http://airbyte.xyz.com', port=80): Max retries exceeded with url: /api/v1/sources/create (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7561511400>: Failed to establish a new connection: [Errno -2] Name or service not known'))
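A hint on the error above: the hostname it reports ('http://airbyte.xyz.com') still contains the scheme, which is why DNS resolution fails — a full URL was passed where a bare host was expected (urllib3's HTTPConnectionPool takes a hostname, not a URL). A small demonstration of the difference; with plain requests, the whole URL goes in one piece:

```python
# Sketch: "http://airbyte.xyz.com" is a URL, not a hostname. urlparse shows
# the host part that connection pools actually need.
from urllib.parse import urlparse

base = "http://airbyte.xyz.com"
parsed = urlparse(base)
print(parsed.scheme, parsed.hostname)  # scheme and bare host, separately

# With requests, pass the full URL and let it split out the host:
#   requests.post(f"{base}/api/v1/sources/create", json=payload)
```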
  • m

    Massy Bourennani

    04/24/2023, 8:32 AM
👋 I’m looking at the different metrics Airbyte provides and I have a question about some of them:
  • m

    multazim deshmukh

    04/24/2023, 11:55 AM
Hi, does Airbyte support data replication to Redshift from MS SQL Server using Change Tracking?
  • m

    Mehmet Berk Souksu

    04/24/2023, 1:32 PM
Hello, I am using Airbyte to load data from Postgres to Postgres. It loads everything without any problem, but I see a lot of messages like:
INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
errors: $: null found, string expected
Is there any way to limit these messages or show them only once?
  • r

    Rafael Rossini

    04/24/2023, 1:48 PM
Can I start a job via the API? My Airbyte is self-hosted on an EC2 Linux instance.
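Yes — the Config API exposes POST /api/v1/connections/sync, which takes the connectionId of the connection to run. A sketch (host, credentials, and ID are placeholders; the payload builder is kept pure so the shape can be checked offline):

```python
# Sketch: trigger a sync on a self-hosted instance via the Config API.
# The network call itself is shown as a comment so this runs offline.

def sync_request(connection_id: str) -> tuple[str, dict]:
    return "/api/v1/connections/sync", {"connectionId": connection_id}

path, payload = sync_request("00000000-0000-0000-0000-000000000000")
print(path, payload)

# import requests
# requests.post(f"http://<your-ec2-host>:8000{path}", json=payload,
#               auth=("airbyte", "password"))  # default OSS basic auth, if unchanged
```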
  • c

    Corey Tucker

    04/24/2023, 1:51 PM
Hey, I'm a complete novice with data warehouses etc. I'm looking to pull together all of my sales data from BigCommerce, Amazon, my POS system (called Vend) and Shopify under one umbrella, and to be able to report on it in Power BI.
  • s

    Slackbot

    04/24/2023, 1:52 PM
    This message was deleted.
  • m

    Mehmet Berk Souksu

    04/24/2023, 2:06 PM
Hello, another question related to the sync between two Postgres databases: a JSONB column in the source database is reflected as a string in the destination database. Is there a solution to this other than custom transformations?
  • s

    Sean Zicari

    04/24/2023, 3:25 PM
    Good morning. I just updated to Airbyte 0.44.0 using the Helm chart and disabled the connector-builder-server. The webapp process is in a crash loop backoff because connector-builder-server is unavailable. Is there additional configuration needed so the webapp doesn’t look for the connector-builder-server? The only configuration I see is this:
    connector-builder-server:
        url: /connector-builder-api
  • d

    Dandi Qiao

    04/24/2023, 4:07 PM
Hi, I’m wondering when the feature to select only certain columns in a table will be released for open source Airbyte? I’ve seen it in the Airbyte Cloud version.
  • m

    Mathieu Lamiot

    04/24/2023, 6:22 PM
Hello folks 👋 I’m starting to experiment with Airbyte x GitHub today. I set up a GitHub -> PostgreSQL connection successfully: I see issues, PRs, etc. in my database. However, my projects are not retrieved. Looking at the documentation, it seems the connector outputs the Projects stream, but nothing related to ProjectV2, which is becoming the standard for GitHub projects. Am I missing something, or is this feature not supported by Airbyte? Thank you for your help!
  • j

    Jon VerLee

    04/24/2023, 8:30 PM
Hi all! Perhaps a rookie question, but I’m trying to figure out what endpoint the API lives at when I’m running Airbyte locally. For example, the cloud-hosted endpoint for sources is: https://api.airbyte.com/v1/sources However, when I run the same thing locally: http://api.localhost:8000/v1/sources I’m simply redirected to the UI. What am I missing?
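Locally there is no api. subdomain — the Config API is served under the /api path of the same host, e.g. http://localhost:8000/api/v1/sources/list, and most OSS endpoints are POST, so a GET in the browser falls through to the UI. A sketch (the workspace ID is a placeholder, and the basic-auth pair assumes the OSS defaults are unchanged):

```python
# Sketch: building Config API URLs for a local docker-compose Airbyte.
# The POST itself is commented out so this runs offline.

def local_api(path: str, host: str = "http://localhost:8000") -> str:
    return f"{host}/api/v1/{path.lstrip('/')}"

print(local_api("sources/list"))

# import requests
# requests.post(local_api("sources/list"),
#               json={"workspaceId": "<your-workspace-id>"},
#               auth=("airbyte", "password"))  # default OSS credentials, if unchanged
```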