# ask-community-for-troubleshooting

    Tiger Sun

    02/24/2023, 12:49 AM
hi all! I've been working on deploying Airbyte into our k8s cluster via Helm for the last couple of hours, but I keep running into Temporal networking issues with an error starting the scanner. Has anyone seen this before? I'm using version 0.43.29 — maybe there's another stable version of Airbyte that people are using that might be recommended?
    {
      "level": "fatal",
      "ts": "2023-02-24T00:23:43.647Z",
      "msg": "error starting scanner",
      "service": "worker",
      "error": "context deadline exceeded",
      "logging-call-at": "service.go:233",
      "stacktrace": "<http://go.temporal.io/server/common/log.(*zapLogger).Fatal\n\t/temporal/common/log/zap_logger.go:150\ngo.temporal.io/server/service/worker.(*Service).startScanner\n\t/temporal/service/worker/service.go:233\ngo.temporal.io/server/service/worker.(*Service).Start\n\t/temporal/service/worker/service.go:153\ngo.temporal.io/server/service/worker.ServiceLifetimeHooks.func1.1\n\t/temporal/service/worker/fx.go:80|go.temporal.io/server/common/log.(*zapLogger).Fatal\n\t/temporal/common/log/zap_logger.go:150\ngo.temporal.io/server/service/worker.(*Service).startScanner\n\t/temporal/service/worker/service.go:233\ngo.temporal.io/server/service/worker.(*Service).Start\n\t/temporal/service/worker/service.go:153\ngo.temporal.io/server/service/worker.ServiceLifetimeHooks.func1.1\n\t/temporal/service/worker/fx.go:80>"
    }

    xi-chen.qi

    02/24/2023, 1:33 AM
Hi team. I want to debug Docker containers locally, but I don't know how to set it up. If there is a detailed document explaining it, that might clear up my confusion.

    Chen Lin

    02/24/2023, 2:22 AM
Hi guys, I'm using the S3 connector to sync data. The doc says the output file can be compressed, and I would like the connector to output gzip data. How would I turn the compression on? TIA
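(A minimal sketch of where that setting typically lives, shown here as the destination's output-format block in Python dict form; the exact key names format_type and compression_type are assumptions to verify against the S3 destination spec/docs for your version.)
# Hypothetical excerpt of an S3 destination configuration; key names are
# assumptions, not confirmed against the connector spec for your version.
s3_output_format = {
    "format": {
        "format_type": "CSV",
        "compression": {"compression_type": "GZIP"},  # vs. "No Compression"
    }
}
print(s3_output_format)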

    Hai Huynh

    02/24/2023, 4:02 AM
Hi everyone! I have one issue. I am using Postgres with incremental dedup mode and normalization, and now I want to clean up data. My Postgres database contains 4 tables: the stg, _airbyte_raw_, scd, and deduped tables. My question is: which of these tables can be cleaned up? My reference document: https://docs.airbyte.com/understanding-airbyte/connections/incremental-deduped-history#overview Thanks!

    Hamid Shariati

    02/24/2023, 9:19 AM
My co-workers are trying to use Laravel commands to migrate from the old system, only because of DTOs. How can I convince them that using Airbyte is better and less time consuming? And how can I build my own Docker image transformer to merge several tables into one table in the new system (both of them Postgres), or split one table into three tables in the new system?

    Erik Alfthan

    02/24/2023, 9:54 AM
Hello Community! First time poster - bear with me if I am unclear. I have set up Airbyte with Helm on AKS. Unfortunately, my IT department forced me behind a firewall and things became much harder. My current issue is that I cannot set up connectors and I do not know why (I expect either routing or firewalls are misconfigured somewhere). The symptoms are the same for both a source connector and a target connector:
    2023-02-24 09:32:49 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/41a0d0ed-961c-42d7-9130-6b533d235309/0/logs.log
    2023-02-24 09:32:49 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.32
    2023-02-24 09:32:49 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
    2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 - 
    2023-02-24 09:32:49 INFO i.a.w.p.KubeProcessFactory(create):98 - Attempting to start pod = source-mssql-check-41a0d0ed-961c-42d7-9130-6b533d235309-0-aaori for airbyte/source-mssql:0.4.28 with resources io.airbyte.config.ResourceRequirements@5ca586f[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] and allowedHosts null
    2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
    2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 -
EDIT: Adding some details on my setup: I'm using the Helm chart, version 0.43.29, with an external Azure flexible Postgres, and enabled TLS on the Temporal server (setting the extraEnvs SQL_TLS and SQL_TLS_ENABLED).

    Shraddha Borkar

    02/24/2023, 10:55 AM
    Hello Team, I am trying the OpenWeather source. Is there any way to get data for more than one geo point? Thanks!

    Julien F

    02/24/2023, 1:31 PM
Hi everyone, my connection (with a custom source, Firestore, and a BigQuery destination) keeps disabling itself daily at random. The last sync passes, though. It seems to be linked to this log line, which is the last line of the last sync before the connection is stopped:
2023-02-24 09:50:15 INFO i.a.c.t.StreamResetRecordsHelper(deleteStreamResetRecordsForJob):50 - deleteStreamResetRecordsForJob was called for job 2706 with config type sync. Returning, as config type is not resetConnection.
    I looked at docker logs but couldn’t find anything. Does someone have an idea of what is happening? Everything works well the rest of the time

    An R

    02/24/2023, 4:35 PM
    Hi, our worker is crashing due to
    ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    has anybody experienced this?

    Robert Put

    02/24/2023, 5:41 PM
Just re-linking since it's been a few days and I still have no ideas: https://airbytehq.slack.com/archives/C021JANJ6TY/p1677003163724079

    Krzysztof

    02/24/2023, 6:08 PM
    Hi guys

    Igor Safonov

    02/24/2023, 6:34 PM
Hi, I am using the Python airbyte_cdk to write my own connector and am having some difficulties with the stream_slices method. Is there a way to compute the next slice depending on the state after the previous slice? My problem: the source HTTP GET API supports the parameters start_datetime, end_datetime, and max_rows (the last one has a maximum value of 1_000_000). I need to download 2023-02-20...2023-02-22, so the slices are [2023-02-20, 2023-02-21, 2023-02-22]. But if there are more than max_rows entries to fetch in one date (e.g. 1_200_000), then 200_000 entries get lost, because the CDK moves on to the next slice, which was computed in advance, and there is no way to know the number of rows beforehand.
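One possible workaround, sketched in plain Python rather than the exact airbyte_cdk signatures (the fetch helper, the ascending sort order, and the 'datetime' field are assumptions about this hypothetical API): keep paginating inside a single date slice and restart from the last timestamp seen whenever a page comes back full, which is roughly the job next_page_token does for an HttpStream.
from datetime import datetime
from typing import Iterator, List, Mapping

MAX_ROWS = 1_000_000  # API limit mentioned above


def fetch(start: datetime, end: datetime, max_rows: int) -> List[Mapping]:
    """Hypothetical API call: up to max_rows records for [start, end),
    each carrying a 'datetime' field, sorted ascending (an assumption)."""
    return []


def read_window(start: datetime, end: datetime) -> Iterator[Mapping]:
    """Drain one date window: if a page comes back full, more rows likely
    remain, so re-request starting at the last timestamp seen instead of
    moving on and losing everything past the max_rows cutoff."""
    cursor = start
    while True:
        page = fetch(cursor, end, MAX_ROWS)
        yield from page
        if len(page) < MAX_ROWS:
            return  # window exhausted
        # Records sharing this final timestamp may be re-read on the next
        # request and should be deduplicated downstream.
        cursor = page[-1]["datetime"]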

    Gunnar Lykins

    02/24/2023, 6:45 PM
Hi there - we are currently in the process of deploying our Helm charts with version 0.44.1, and are running into an error regarding STRICT_COMPARISON_NORMALIZATION_TAG. It appears that this PR resolves the issue; when is it planned to be merged into master? Thanks! 🙂 cc: @Shashank Singh

    Konstantin Lackner

    02/24/2023, 7:00 PM
Good afternoon! While using the WooCommerce Source connector, I run into the following error when syncing only the products stream. Can someone help? Very much appreciated!
    2023-02-24 18:57:24 normalization > Traceback (most recent call last):
    2023-02-24 18:57:24 normalization >   File "/usr/local/bin/transform-catalog", line 8, in <module>
    2023-02-24 18:57:24 normalization >     sys.exit(main())
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 104, in main
    2023-02-24 18:57:24 normalization >     TransformCatalog().run(args)
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 36, in run
    2023-02-24 18:57:24 normalization >     self.process_catalog()
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 64, in process_catalog
    2023-02-24 18:57:24 normalization >     processor.process(catalog_file=catalog_file, json_column_name=json_col, default_schema=schema)
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/catalog_processor.py", line 64, in process
    2023-02-24 18:57:24 normalization >     stream_processor.collect_table_names()
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/stream_processor.py", line 227, in collect_table_names
    2023-02-24 18:57:24 normalization >     for child in self.find_children_streams(self.from_table, column_names):
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/stream_processor.py", line 361, in find_children_streams
    2023-02-24 18:57:24 normalization >     elif is_combining_node(properties[field]):
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/utils.py", line 120, in is_combining_node
    2023-02-24 18:57:24 normalization >     if data_type.ONE_OF_VAR_NAME in properties and any(
    2023-02-24 18:57:24 normalization >   File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/utils.py", line 121, in <genexpr>
    2023-02-24 18:57:24 normalization >     data_type.WELL_KNOWN_TYPE_VAR_NAME in option[data_type.REF_TYPE_VAR_NAME] for option in properties[data_type.ONE_OF_VAR_NAME]
    2023-02-24 18:57:24 normalization > KeyError: '$ref'

    Konstantin Lackner

    02/24/2023, 7:08 PM
*Need help with debugging the GA4 Connector* About the Google Analytics 4 (GA4) connector: I'm receiving the data fine in BigQuery only for the tables related to active_users - so the daily, weekly and four-weekly ones. However, all other tables are empty! Can someone point me in a direction on how to debug this? Would be greatly appreciated.
1. Airbyte version: 0.40.32
2. GA4 Connector version: 0.1.1
3. Find the logs attached in the thread

    Shashank Singh

    02/24/2023, 7:09 PM
Heya Team Airbyte, thanks for open-sourcing this software and building a great community around it. We are trying to build a POC on EKS + Helm for our org, which would lead to a bigger presence of Airbyte in our org. We can't seem to catch a break with a working Helm chart, hitting everything from the Micronaut error to STRICT_COMPARISON_NORMALIZATION_TAG. Our ask: we were wondering if there is a stable/sample Helm chart + values.yml we can try.

    Mayank V

    02/24/2023, 7:32 PM
Hi Team, currently the source schema is inferred during the sync. Is there a way to enforce a schema for a source? Is it possible to do from the UI? How does a user generally enforce the source schema?

    Michael Taylor

    02/24/2023, 8:13 PM
I am trying to load S3 data with compressed CSV files into Redshift using Airbyte. I get a less-than-helpful
Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
when I run it, and if I dig through hundreds of lines of log files I see this message:
    pyarrow.lib.ArrowInvalid: In CSV column #6: CSV conversion error to int64: invalid value '1,830'
Is this related to the normalizer? Is there an option I can set to either leave things as strings or handle the commas in the numbers? I can't redo thousands of CSV file formats prior to loading. I hope I am missing something simple here.
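(For context, a minimal sketch of the underlying pyarrow behaviour, standalone rather than inside Airbyte; the column name "amount" and the sample data are made up, and whether/where the S3 source exposes these reader options should be checked against the connector docs.)
import io
import pyarrow as pa
from pyarrow import csv

data = b'id,amount\n1,"1,830"\n'

# With the column declared as int64, the embedded thousands separator makes
# the conversion fail with an error like the one in the logs above
# (CSV conversion error to int64: invalid value '1,830').
try:
    csv.read_csv(
        io.BytesIO(data),
        convert_options=csv.ConvertOptions(column_types={"amount": pa.int64()}),
    )
except pa.ArrowInvalid as err:
    print(err)

# Reading the same column as a string succeeds; the commas can then be
# stripped and the values cast downstream.
table = csv.read_csv(
    io.BytesIO(data),
    convert_options=csv.ConvertOptions(column_types={"amount": pa.string()}),
)
print(table.column("amount"))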

    Mike B

    02/24/2023, 8:44 PM
    Is there any documentation on what the enum states for streams (global | stream | legacy) mean? I have an incremental connection that I'd like to force to use a specific cursor value, and found the POST state/create_or_update endpoint, but am unsure of which value I should use for the stateType field.
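(A hedged sketch of calling that endpoint from Python; only state/create_or_update and stateType come from the message above, so the URL prefix, payload shape, cursor field, and auth are assumptions to verify against the Airbyte API reference for your version.)
import requests

payload = {
    "connectionId": "<connection-uuid>",
    "connectionState": {
        "stateType": "stream",  # one of global | stream | legacy
        "connectionId": "<connection-uuid>",
        "streamState": [
            {
                "streamDescriptor": {"name": "my_stream"},
                "streamState": {"cursor": "2023-02-20T00:00:00Z"},  # hypothetical cursor field
            }
        ],
    },
}
resp = requests.post(
    "http://localhost:8000/api/v1/state/create_or_update",  # assumed URL prefix
    json=payload,
    auth=("airbyte", "password"),  # adjust or drop depending on your deployment
)
resp.raise_for_status()
print(resp.json())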

    Bruno Azevedo

    02/24/2023, 11:20 PM
Hi everyone, I created a source connector that pulls data from an API and it has two incremental streams. The issue is that when I run it with a stream_state, it seems like Airbyte first runs the stream_slices method as full-refresh but does not ingest any data, and after that it runs the correct stream_slices method with the correct date. I put a `logger.info(slices)` inside the connector and here are the results:

    Balasubramanian T K

    02/25/2023, 8:32 AM
Hello, there is a weird error like "docker compose not found! please install docker compose!" (printed with ANSI color codes) after running ./run-ab-platform.sh in the base directory. Does anybody know the fix?

    Balasubramanian T K

    02/25/2023, 8:34 AM
Tried:
sudo curl -L "https://github.com/docker/compose/releases/download/v2.16.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo mv /usr/local/bin/docker-compose /usr/bin/docker-compose
sudo chmod +x /usr/bin/docker-compose

    Balasubramanian T K

    02/25/2023, 9:02 AM
Update: fix found. The docker-compose.yaml pulled from the repo was empty (I don't know why), so I copied the definition from the repo.

    Minhaj Pasha

    02/25/2023, 2:16 PM
Hello - unable to install Airbyte on GCP. I followed the instructions given at this link: https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine

    Minhaj Pasha

    02/25/2023, 2:16 PM
It's failing at this step:
mkdir airbyte && cd airbyte
curl -sOO https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/{.env,flags.yml,docker-compose.yaml}
docker compose up -d

    Srikanth Sudhindra

    02/25/2023, 3:25 PM
Hi All, I have installed Airbyte on K8s using Helm. My question is: how can I add additional entries to the configmap, given that it is auto-generated during installation in this case?

    Avi Sagal

    02/26/2023, 3:44 PM
Hi All, I'm starting to use the Klaviyo to Postgres connector and I can see that even with normalized data there's still nested data. We're thinking about adding a transformer layer to extract the relevant data. Does using a custom transformer disable the default normalization? Did anyone else encounter this situation and have a better approach? Thanks :)

    i.am

    02/26/2023, 7:45 PM
    Hi y’all! I was reading the documentation on the Google ads, Bing ads, Facebook ads connectors and it seems like they are only available to bring data from those networks to my warehouse. Is there a way to send it in reverse — from warehouse to ad network? One of my clients is currently using highlevel for this, but I would like to find a solution that I can bring with me to other clients

    Cheryl Z

    02/27/2023, 8:31 AM
Hi all, I was reading the docs for the Amazon Seller Partner connector, and here are some questions:
1. Under 'Supported Streams', the first table, 'FBA Inventory Reports', links to ASC's 'Manage FBA Inventory report'. These are two different tables; I'm wondering whether the link is mistaken or the table name has been changed by Airbyte. The same issue applies to the second table, 'FBA Orders Reports', which links to another report ('Removal order detail report').
2. Several tables' links point to Amazon's Selling Partner API report reference page, which does not provide clear info about the tables' field names. Is there any other way to find this?
3. Based on the changelog, the connector has had no updates since last year. Will there be any further iterations?

    Alexander Schmidt

    02/27/2023, 9:31 AM
Hey, just a quick question: can I pull a specific connector version? E.g. the current Amazon Seller Partner connector is version 0.2.32, but I want to work with version 0.2.29.