# ask-community-for-troubleshooting

    Felipe Castro

    09/20/2022, 9:02 PM
Hello! Does anyone know why a connection (MySQL 0.6.13 -> S3 0.3.15) on Airbyte 0.40.7 is logging lines like this?
    2022-09-20 20:42:41 *ERROR* c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2021-06-22T18:51:25.000000
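For context, the validator is rejecting the value because a JSON-schema `date-time` is expected to carry a UTC offset, and MySQL `DATETIME` values are emitted without one. A minimal Python illustration of the difference (purely to show what the log means, not a fix):

```python
from datetime import datetime, timezone

# The value the schema validator complains about: a local timestamp with no
# UTC offset, so it is not a valid RFC 3339 date-time.
naive = datetime.fromisoformat("2021-06-22T18:51:25.000000")
print(naive.tzinfo)  # None -> "No timezone information"

# An offset-aware equivalent that would pass a date-time format check.
aware = naive.replace(tzinfo=timezone.utc)
print(aware.isoformat())  # 2021-06-22T18:51:25+00:00
```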

    Gemini Keith

    09/21/2022, 2:12 AM
Hello, smart guys~ Has anyone ever seen the following issue when adding a new source?
    2022-09-21 02:10:20 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-21 02:10:20 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
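For context, this is a warning from the JSON-schema validator: the stream's schema contains a keyword named `order` that is not part of JSON Schema, so the validator skips it. It is generally harmless. A minimal sketch (as a Python dict) of a schema that would trigger it, with illustrative field names:

```python
# A stream schema containing a non-standard "order" keyword; the validator
# warns "Unknown keyword order" but ignores it during validation.
stream_schema = {
    "type": "object",
    "properties": {
        "id": {"type": "integer", "order": 0},   # "order" is not a JSON Schema keyword
        "name": {"type": "string", "order": 1},
    },
}
```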

    Derek Herincx

    09/21/2022, 4:37 AM
Hi everyone, I’m a newbie to Airbyte (building a simple use-case since we’re looking to move away from Stitch) and I’m running into the following issue: I’ve defined a new connector and was able to successfully build out the Docker image, create the new connector locally, and establish a connection. When I try to ingest data for my “segment” stream with Basic Normalization, I get the following error (I know that the connector works because when I try to ingest with the normalization setting set to Raw data (JSON), I am able to successfully retrieve data):
    2022-09-21 04:30:31 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 36, in run
    2022-09-21 04:30:31 normalization >     self.process_catalog()
    2022-09-21 04:30:31 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 64, in process_catalog
    2022-09-21 04:30:31 normalization >     processor.process(catalog_file=catalog_file, json_column_name=json_col, default_schema=schema)
    2022-09-21 04:30:31 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/catalog_processor.py", line 55, in process
    2022-09-21 04:30:31 normalization >     stream_processors = self.build_stream_processor(
    2022-09-21 04:30:31 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/catalog_processor.py", line 138, in build_stream_processor
    2022-09-21 04:30:31 normalization >     properties = get_field(get_field(stream_config, "json_schema", message), "properties", message)
    2022-09-21 04:30:31 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/catalog_processor.py", line 230, in get_field
    2022-09-21 04:30:31 normalization >     raise KeyError(message)
    2022-09-21 04:30:31 normalization > KeyError: "'json_schema'.'properties' are not defined for stream segments"
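For context, the normalization step reads the configured catalog and expects every stream to declare a `json_schema` with a `properties` object; if the `segments` stream's schema is empty or missing, `get_field` raises exactly this `KeyError`. A minimal sketch of the shape it expects for one stream entry (field names follow the Airbyte catalog format; the columns are illustrative):

```python
# Minimal shape the normalization step expects for each stream in the
# configured catalog: "json_schema" must contain a "properties" object.
stream_entry = {
    "stream": {
        "name": "segments",
        "json_schema": {
            "type": "object",
            "properties": {                 # missing/empty here -> the KeyError above
                "id": {"type": "string"},
                "created_at": {"type": "string", "format": "date-time"},
            },
        },
        "supported_sync_modes": ["full_refresh"],
    },
    "sync_mode": "full_refresh",
    "destination_sync_mode": "overwrite",
}
```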

    BASTIN JERRY

    09/21/2022, 11:02 AM
Hey guys, I am developing an HTTP source connector. Is incremental sync possible for an HTTP source?
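For reference, the Python CDK does support incremental sync for HTTP sources: declare a `cursor_field` and return updated state from the stream. A minimal sketch, with an entirely made-up endpoint and field names:

```python
from typing import Any, Iterable, Mapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class Orders(HttpStream):
    """Illustrative incremental HTTP stream; endpoint and field names are made up."""

    url_base = "https://api.example.com/v1/"
    primary_key = "id"
    cursor_field = "updated_at"  # marks the stream as supporting incremental sync

    def path(self, **kwargs) -> str:
        return "orders"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        return None  # single page, for brevity

    def request_params(self, stream_state: Mapping[str, Any], **kwargs) -> Mapping[str, Any]:
        # Only ask the API for records newer than the saved cursor value.
        since = (stream_state or {}).get(self.cursor_field)
        return {"updated_since": since} if since else {}

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
        yield from response.json()["data"]

    def get_updated_state(self, current_stream_state, latest_record) -> Mapping[str, Any]:
        # Keep the max cursor value seen so far as the stream state.
        latest = latest_record.get(self.cursor_field, "")
        current = (current_stream_state or {}).get(self.cursor_field, "")
        return {self.cursor_field: max(latest, current)}
```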

    Dudu Vaanunu

    09/21/2022, 11:40 AM
Hi all, has anyone managed to create a PagerDuty incident by using the Notification webhook option in Airbyte (not with Slack)? If so, I would love to know how you’ve created it. Thanks!
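For reference, Airbyte's webhook notification posts a Slack-style JSON payload, so it can't be pointed at the PagerDuty Events API directly. One possible workaround (a sketch, assuming a small relay service you host yourself; the route and payload mapping are illustrative) is to translate the webhook into a PagerDuty Events API v2 event:

```python
# Hypothetical relay: receive Airbyte's webhook notification and forward it
# to the PagerDuty Events API v2.
import os

import requests
from flask import Flask, request

app = Flask(__name__)
PAGERDUTY_ROUTING_KEY = os.environ["PAGERDUTY_ROUTING_KEY"]


@app.route("/airbyte-webhook", methods=["POST"])
def airbyte_webhook():
    body = request.get_json(force=True) or {}
    # Airbyte's failure notification carries a Slack-style "text" field
    # (check the actual payload your version sends).
    summary = body.get("text", "Airbyte sync failure")
    event = {
        "routing_key": PAGERDUTY_ROUTING_KEY,
        "event_action": "trigger",
        "payload": {"summary": summary, "source": "airbyte", "severity": "error"},
    }
    resp = requests.post("https://events.pagerduty.com/v2/enqueue", json=event, timeout=10)
    return {"status": resp.status_code}


if __name__ == "__main__":
    app.run(port=8080)
```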

    Opeyemi Daniel

    09/21/2022, 11:50 AM
Hello team, can someone explain how to handle arrays in Airbyte? I'm trying to migrate data from MongoDB.

    Ricardo de Deijn

    09/21/2022, 2:57 PM
Hello, I’m getting an error message saying "'TokenAuthenticator' object is not callable". I get this error after calling the next page on an endpoint. The first page gets synced perfectly fine, with a working TokenAuthenticator. What could be the problem here?
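For context, this error typically shows up when the HTTP session ends up with an authenticator that is not callable: the deprecated `airbyte_cdk.sources.streams.http.auth.TokenAuthenticator` only exposes `get_auth_header()`, while the `requests_native_auth` variant subclasses `requests.auth.AuthBase` and is callable, which is what the session expects. A minimal sketch of the usual wiring (the stream name is illustrative):

```python
# The requests-native authenticator is callable (it subclasses requests.auth.AuthBase);
# pass the *instance* to your stream rather than the class itself or the older
# airbyte_cdk...http.auth.TokenAuthenticator.
from airbyte_cdk.sources.streams.http.requests_native_auth import TokenAuthenticator

auth = TokenAuthenticator(token="<api token>")  # token value is a placeholder

# e.g. MyStream(authenticator=auth), where MyStream is your HttpStream subclass.
```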

    Zaza Javakhishvili

    09/21/2022, 8:04 PM
Hi, could someone please help solve this issue: https://github.com/airbytehq/airbyte/issues/16841

    Dimitris Bachtsevanis

    09/21/2022, 8:19 PM
Hi, I have a couple of connections that are stuck, and I can't do anything with them. I can't even stop the operation or delete the connection.

    Robert Put

    09/22/2022, 1:15 AM
Hello! I'm trying out Airbyte to sync Postgres data to Snowflake. It works for full syncs of the data, but I can't seem to find the incremental options. Postgres 13.7 -> Snowflake, with Postgres source 1.0.10, Snowflake destination 0.4.36, Airbyte 0.40.7. All the tables have updated_at fields to use as a cursor, but there is no option to select them. I was previously processing the same data through Stitch, so I'd think it would be possible.

    Srinidhi krishnamurthy

    09/22/2022, 4:57 AM
Hello Team, I was setting up Datadog for monitoring Airbyte, and we are facing an issue similar to the one highlighted in this discussion: https://discuss.airbyte.io/t/setup-datadog-monitoring/1270/6
Issue:
• Worker metrics are not available in the Datadog dashboard when Datadog runs as a Host Agent.
• Worker metrics are available in the Datadog dashboard when Datadog runs as a Docker Agent.
As per the discussion, it seems the worker's network needs to be changed to 'host'. Questions:
• Would there be any additional changes needed when changing a container's network to host, given that all containers communicate over a bridge network and run on a single EC2 instance?
• We would like to have Datadog running as a Host Agent because of some advantages.

    Akshay Baura

    09/22/2022, 6:42 AM
Hello team, is S3 authentication via an EC2 instance profile only allowed for the S3 destination? Asking since I don't see it mentioned for the S3 source in the documentation.

    Yousef Hosny

    09/22/2022, 6:54 AM
Hi all, I have a question regarding setting up S3 as a destination. Currently I want to load the data source to different folders in S3. Is there an efficient way to do so, or do I have to create a separate S3 destination for each of the (output) S3 folders and create individual connections? Please advise.
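For reference, a single S3 destination can usually fan streams out to different prefixes through its bucket path and path-format placeholders, rather than one destination per folder. A hedged sketch of the relevant settings (field names as in the S3 destination spec as I understand it; double-check them against your connector version):

```python
# Illustrative S3 destination settings; one destination can write each stream
# under its own prefix via the bucket path plus path-format placeholders.
s3_destination_config = {
    "s3_bucket_name": "my-data-lake",        # placeholder bucket
    "s3_bucket_region": "eu-west-1",
    "s3_bucket_path": "raw/airbyte",         # common prefix for all streams
    "s3_path_format": "${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}_${EPOCH}_",
    "format": {"format_type": "Parquet"},
}
```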

    Shashank Tiwari

    09/22/2022, 7:22 AM
Hi all, I wanted to know: how do you find out the operationIds needed to create a connection using the API?
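For reference, operationIds come from the operations endpoints of the Config API: create (or list) the operations first and pass their ids to `connections/create`. A hedged sketch against a self-hosted instance; the host, UUIDs, and workspace id below are placeholders:

```python
# Sketch against the self-hosted Airbyte Config API.
import requests

API = "http://localhost:8000/api/v1"

# 1. Create a basic-normalization operation and keep its id.
op = requests.post(f"{API}/operations/create", json={
    "workspaceId": "<workspace-uuid>",
    "name": "basic-normalization",
    "operatorConfiguration": {"operatorType": "normalization",
                              "normalization": {"option": "basic"}},
}).json()
operation_id = op["operationId"]

# 2. Reference it when creating the connection.
requests.post(f"{API}/connections/create", json={
    "sourceId": "<source-uuid>",
    "destinationId": "<destination-uuid>",
    "status": "active",
    "operationIds": [operation_id],
    "syncCatalog": {"streams": []},   # fill in from /sources/discover_schema
})
```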

    Océane Fontaine

    09/22/2022, 10:18 AM
Hi, I have the same problem, anyone else?

    Robert Put

    09/22/2022, 12:51 PM
For self-hosted Airbyte, would there be a reason a Postgres source shows no option for incremental sync? I see the part of the docs that says you have to manually pick the cursor for Postgres, and also the supported types, but it only shows options for full refresh... I have this working through Stitch currently and would like to migrate.

    Ricardo de Deijn

    09/22/2022, 3:14 PM
Hello, does someone know how I can delete a custom source in the UI? I'm running the UI in a local Docker container.

    kiran

    09/22/2022, 3:58 PM
I cannot seem to delete a connection (in the UI). Every time I go through the Delete Connection section, it doesn’t actually disappear. It reappears after refreshing. I deleted the source and destination so I don’t know why the connection won’t go away. Has anyone else had this problem?
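For reference, when the UI refuses to let a connection go, deleting it through the Config API sometimes works. A minimal sketch; the host and connection id are placeholders:

```python
# Delete a connection via the Config API of a self-hosted instance.
import requests

resp = requests.post(
    "http://localhost:8000/api/v1/connections/delete",
    json={"connectionId": "<connection-uuid>"},
)
print(resp.status_code)  # expect a 2xx on success
```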

    Amos Gutman

    09/22/2022, 4:52 PM
Does anyone have some examples of how to set up the JDBC images?

    Davis Ford

    09/22/2022, 5:32 PM
Can anyone tell me how Airbyte deals with deleted rows? Does it just ignore them on replication?

    Zawar Khan

    09/22/2022, 6:01 PM
Hi team, can anybody reply to my query? https://discuss.airbyte.io/t/incremental-sync-acceptance-test-failure/2668

    Dimitris Bachtsevanis

    09/22/2022, 7:16 PM
Is this your first time deploying Airbyte: No
OS Version / Instance: Rocky Linux
Memory / Disk: 64 GB / 4 TB SSD
Deployment: Docker
Airbyte Version: 0.40.8
Description: Some sync tasks are stuck and it's impossible to stop the process or delete the connection.

    Daniel Le'Sage

    09/23/2022, 12:04 AM
Has anyone figured out how to host Airbyte with a reverse proxy to make the webapp available on the default port? It is currently exposed through port 8000, but I don’t like having to access my.domain.com:8000.

    zafar mahmood

    09/23/2022, 6:17 AM
Hi, can anyone share how I can connect Airbyte (on EC2) with an external RDS database, as per this documentation: https://docs.airbyte.com/operator-guides/configuring-airbyte-db/ ? I have followed the instructions of changing the host name and other parameters in the .env file, and also removed the db service from the docker-compose file, but Airbyte is not running.

    Hiep Minh Pham

    09/23/2022, 9:16 AM
Hi all, does anybody have experience with using the Google Search Console connector? I am having a hard time setting up the connection. How could we get the refresh token from the Service Account credentials?
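For context, a Service Account doesn't use a refresh token at all: it authenticates with its JSON key, while a refresh token only comes out of an OAuth flow with your own client id and secret. If you do want the OAuth route, a minimal sketch using google-auth-oauthlib (the client_secret.json file and scope shown are the usual Search Console ones):

```python
# Obtain an OAuth refresh token for the Search Console scope using your own
# OAuth client credentials; client_secret.json is the file downloaded from the
# Google Cloud console. Not needed when authenticating with a service account key.
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)
creds = flow.run_local_server(port=0, access_type="offline", prompt="consent")
print(creds.refresh_token)  # paste this into the connector's OAuth configuration
```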

    Mané Rom

    09/23/2022, 4:14 PM
Good afternoon. I would like to discuss with an Airbyte team member whether our use case with Airbyte falls under the MIT-licensed layer. Thank you.

    Thomas C

    09/23/2022, 5:22 PM
I noticed some alpha connectors are not yet available on the managed Cloud offering (e.g. S3/Kafka), but they are available on self-hosted. Is there a way to know in advance which alpha connectors will be made available on Cloud, and when?

    Jonathan Crawford

    09/23/2022, 6:41 PM
Hey gang! I’ve been working on an initial import for 2 weeks and haven’t been able to get it to complete. The process dies at the very end of the DBT normalization step, while the Postgres target database is doing hours of vacuuming; it eventually results in a timeout. Here’s a screenshot of the output. The table it’s attempting to normalize is 68 GB at the source. Any guidance or experience here? Thanks in advance!

    Davis Ford

    09/23/2022, 7:20 PM
Hi, when I turned on Incremental Sync + Deduped + History on a table, I set the cursor field to be updated_at. When I manually ran a sync, I now see an identical row for each primary key. The updated_at is the same for both, and all the other fields are the same, except the _AIRBYTE_UNIQUE_ID column, where one contains an id and the other is NULL. This was unexpected, as I expected it to dedupe. For the first sync the mode was Full Refresh + Overwrite, then I switched it to Incremental Sync + Dedupe + History, but I reset the connection, so it should have cleared the table. Any ideas on what I'm missing here?

    Bassem Ben Jemaa

    09/25/2022, 8:39 PM
Hi gurus, I started building some stuff with Airbyte. The first thing I've done is to configure incremental sync for one table from Salesforce (called Account) into Redshift. The result is 3 tables in Redshift instead of one.