# ask-community-for-troubleshooting

    Kfir

    11/19/2022, 6:00 PM
Is there a way to trigger an OSS docker build & push for airbyte/source-google-workspace-admin-reports? It wasn't built for arm64 like the other sources were: https://hub.docker.com/r/airbyte/source-google-workspace-admin-reports/tags

    Rytis Zolubas

    11/19/2022, 6:01 PM
I got this Temporal error, does anyone know what it is about? Thanks!
    Temporal error.txt

    Venkat Dasari

    11/20/2022, 5:50 AM
Does anyone want to play with the Airbyte REST API for creating sources, destinations, and connectors? I wrote a small article on how to do this with Postman. Let me know if you need more info. https://medium.com/@dasari2828/using-airbyte-with-rest-api-2e65c180ee80
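The article above uses Postman; for anyone scripting the same call instead, here is a minimal stdlib-only Python sketch against the OSS Configuration API. The base URL and both UUIDs are placeholders you would replace with your own values, and the payload shape should be checked against the API docs for your Airbyte version.

```python
import json
import urllib.request

AIRBYTE_URL = "http://localhost:8000/api/v1"  # assumption: default OSS install


def build_source_payload(workspace_id, definition_id, name, config):
    """Assemble the request body for POST /sources/create."""
    return {
        "workspaceId": workspace_id,
        "sourceDefinitionId": definition_id,
        "name": name,
        "connectionConfiguration": config,
    }


def create_source(payload):
    """POST the payload to the Configuration API and return the parsed response."""
    req = urllib.request.Request(
        f"{AIRBYTE_URL}/sources/create",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_source_payload(
        "00000000-0000-0000-0000-000000000000",  # hypothetical workspace id
        "00000000-0000-0000-0000-000000000000",  # hypothetical source definition id
        "my-postgres",
        {"host": "db.example.com", "port": 5432},  # illustrative connector config
    )
    print(create_source(payload))
```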

    Rytis Zolubas

    11/20/2022, 8:51 AM
Hello! Is it possible to pass any metadata to the POST /v1/connections/sync API endpoint? If not, it would be a great addition and I could create an issue on GitHub. The use case: I run a sync job and then create a task token. The task token is activated once the job has finished. The problem is that sometimes the job finishes very quickly (before the task token is created). This is why I would like to add metadata with the task token to the job call. Then I would have a trigger in the postgres database complete the task token once status is not in ('running', 'incomplete'). Thanks! P.S. This would be solved if it were possible to have a simple workflow inside Airbyte: job1, job2, job3 finish -> run dbt Cloud. Edit: just figured out that the Airbyte webhook now includes the job ID, which helps as well.
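Since /connections/sync returns a job object, one workaround is to poll that job until it leaves the running/incomplete states before firing the task token. A sketch, not an official pattern — the endpoint paths match the OSS Configuration API, but the base URL is a placeholder and the commented-out `complete_task_token` is a hypothetical stand-in for your own callback:

```python
import json
import time
import urllib.request

# Assumption: treat these job states as non-terminal.
RUNNING_STATES = {"running", "incomplete", "pending"}


def is_terminal(status: str) -> bool:
    """True once the job can no longer change, i.e. it is safe to fire the token."""
    return status.lower() not in RUNNING_STATES


def post(url: str, body: dict) -> dict:
    """POST a JSON body and return the parsed JSON response."""
    req = urllib.request.Request(
        url, data=json.dumps(body).encode(), headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def sync_and_wait(api: str, connection_id: str, poll_seconds: int = 10) -> str:
    """Trigger a sync, then poll /jobs/get until the job reaches a terminal state."""
    job = post(f"{api}/connections/sync", {"connectionId": connection_id})["job"]
    while not is_terminal(job["status"]):
        time.sleep(poll_seconds)
        job = post(f"{api}/jobs/get", {"id": job["id"]})["job"]
    return job["status"]


if __name__ == "__main__":
    status = sync_and_wait("http://localhost:8000/api/v1", "<connection-uuid>")
    # complete_task_token(status)  # hypothetical: your task-token callback goes here
    print(status)
```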

    Murat Cetink

    11/20/2022, 7:40 PM
Hello, I upgraded my Airbyte docker installation to v0.40.21 and started to get the WorkerException: Could not find image: airbyte/destination-snowflake:0.40.40 error. I wonder if anyone else is getting the same error or if it's just me.

    Hrvoje Piasevoli

    11/20/2022, 9:16 PM
Hi, I tried to ingest the airbyte database into BigQuery to test the BigQuery denormalized connector, as the other variant of the connector is useless for nested data. The airbyte db is a good testing source as it has tables with jsonb columns (e.g. actor_catalog.catalog). The result is not what it is supposed to be, as the catalog data is represented as a string of JSON. Tweaking the stream config using Octavia and setting the catalog column to type: object results in an error. When I use the regular BigQuery destination from the airbyte db and compare to e.g. zendesk_support, I see a difference in the _airbyte__raw data column: all the jsonb column content is represented as a string, and therefore no additional tables get created for the airbyte db tables, as opposed to zendesk where the data column contains nested JSON. Is there something that can be done to properly use the BigQuery denormalized destination to ingest the airbyte db (Postgres source) as true parsed struct/repeated columns?

    Nipuna Prashan

    11/21/2022, 3:08 AM
Hi, I tried to install the new Airbyte helm version, but airbyte-bootloader fails with the following log: 2022-11-21 030037 ERROR o.f.c.i.l.s.Slf4jLog(error):57 - Migration of schema "public" to version "0.40.12.001 - AddWebhookOperationColumns" failed! Changes successfully rolled back. Exception in thread "main" org.flywaydb.core.internal.command.DbMigrate$FlywayMigrateException: Migration failed! I'm using an external DB. Does anyone know about this?

    Nipuna Prashan

    11/21/2022, 3:10 AM
Found a related GitHub issue as well (without a solution): https://github.com/airbytehq/airbyte/issues/18217

    Rishabh D

    11/21/2022, 8:42 AM
Hi team, while extracting JIRA data from the source as CSV, the files are loaded as 0.csv, 1.csv, 2.csv and so on. Also, when the 'Destination stream prefix' is provided, the data isn't loaded in that path but in the parent path provided in the s3 config itself. Please help here. Logs attached.
    logs-727.txt

    Michael Sonnleitner

    11/21/2022, 9:13 AM
Hello all, we are currently having problems with a MySQL source that keeps breaking when importing large data sets. We keep getting the same error in the log:
    ....
2022-11-20 10:21:55 source > Nov 20, 2022 10:21:55 AM com.github.shyiko.mysql.binlog.BinaryLogClient$5 run
2022-11-20 10:21:55 source > INFO: Keepalive: Trying to restore lost connection to database-server.com:3306
2022-11-20 10:25:16 source > Stopping the task and engine
2022-11-20 10:25:16 source > Stopping down connector
2022-11-20 10:26:46 source > Coordinator didn't stop in the expected time, shutting down executor now
2022-11-20 10:28:16 source > Connection gracefully closed
2022-11-20 10:28:16 source > Stopped FileOffsetBackingStore
2022-11-20 10:28:16 source > Debezium engine shutdown.
2022-11-20 10:28:16 source > The main thread is exiting while children non-daemon threads from a connector are still active.
Ideally, this situation should not happen...
Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
The main thread is: main (RUNNABLE)
 Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
        at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:334)
        at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:282)
        at io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:219)
        at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:141)
        at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:100)
        at io.airbyte.integrations.source.mysql.MySqlSource.main(MySqlSource.java:309)
2022-11-20 10:28:16 source > Active non-daemon thread: debezium-mysqlconnector-database-change-event-source-coordinator (TIMED_WAITING)
 Thread stacktrace: java.base@17.0.4.1/jdk.internal.misc.Unsafe.park(Native Method)
....
We can already rule out that it is an out-of-memory problem on our Airbyte server; at least according to the server's logs, there seems to be no problem there. Do any of you have an idea how we could solve this problem?
    mysql.log

    Ivan Pilipchuk

    11/21/2022, 10:24 AM
Hi all, we are currently having a problem with the google-sheets source:

https://user-images.githubusercontent.com/51704301/203024551-59110a9c-75d2-47a0-a427-0c851d56dd83.png

https://user-images.githubusercontent.com/51704301/203024974-225dd6da-bce3-4b1c-ab1e-e583fe736fb0.png

    Can anyone help me with this problem?

    Rahul Borse

    11/21/2022, 11:13 AM
Hi all, I just commented out the Airbyte metadata columns in RootLevelFlatteningSheetGenerator.java in base-java-s3, then ran the commands below: SUB_BUILD=PLATFORM ./gradlew build, then VERSION=dev docker-compose up. I created a connection between a HubSpot source and an S3 destination with root-level flattening for CSV, but I can still see the Airbyte metadata columns in my destination output. I am not sure why the changes are not reflected. Can someone help me?
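One likely cause: SUB_BUILD=PLATFORM only rebuilds the platform, not the connector images, so the connection keeps running the published destination image. A sketch of the rebuild sequence, assuming the 2022-era Gradle layout and that destination-s3 is the connector depending on base-java-s3 (verify both in your checkout):

```shell
CONNECTOR=destination-s3            # assumption: the connector that uses base-java-s3
DEV_IMAGE="airbyte/${CONNECTOR}:dev"

# 1. Rebuild the connector module and its docker image:
#      ./gradlew :airbyte-integrations:connectors:${CONNECTOR}:airbyteDocker
# 2. In the UI (Settings > Destinations), change the destination's image tag
#    to "dev" so the connection picks up the local build, then restart:
#      VERSION=dev docker-compose up
echo "connection should use ${DEV_IMAGE}"
```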

    Mads Christensen

    11/21/2022, 12:22 PM
Hi all - I keep getting a non-json response error. I use Google Analytics 4 as the source and PostgreSQL as the destination. It shows up after 60 seconds, which indicates the default timeout. I tried to set the timeout to 0 (infinity) in the JDBC string, but it doesn't work. I see that many have had this problem, but has anyone ever found a solution?

    Agung Pratama

    11/21/2022, 12:58 PM
Hi, I am currently trying octavia-cli to make a reproducible dev environment across my organization. I have successfully run octavia import all, and my goal is to check in all the manifests to a git repo, so my colleagues can just run octavia apply to configure their local Airbyte. However, when I tried, I got this exception:
🐙 - Octavia is targetting your Airbyte instance running at http://localhost:8000 on workspace 419064f1-2e08-480c-be95-69c932d2a463.
    🐙 - TimescaleDB (tracking_db) exists on your Airbyte instance according to your state file, let's check if we need to update it!
    😴 - Did not update because no change detected.
    
    🐙 - MySQL (space_database) exists on your Airbyte instance according to your state file, let's check if we need to update it!
    😴 - Did not update because no change detected.
    
    🐙 - MySQL (user_info_database) exists on your Airbyte instance according to your state file, let's check if we need to update it!
    😴 - Did not update because no change detected.
    
    🐙 - MySQL (space_database) <> TimescaleDB (tracking_db) does not exists on your Airbyte instance, let's create it!
    Traceback (most recent call last):
      File "/usr/local/bin/octavia", line 8, in <module>
        sys.exit(octavia())
      File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/base_commands.py", line 54, in invoke
        raise e
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/base_commands.py", line 51, in invoke
        result = super().invoke(ctx)
      File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/usr/local/lib/python3.9/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/check_context.py", line 91, in wrapper
        f(ctx, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 29, in apply
        apply_single_resource(resource, force)
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 66, in apply_single_resource
        messages = create_resource(resource)
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/commands.py", line 127, in create_resource
        created_resource, state = resource.create()
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/resources.py", line 696, in create
        return self._create_or_update(self._create_fn, self.create_payload)
      File "/usr/local/lib/python3.9/site-packages/octavia_cli/apply/resources.py", line 670, in create_payload
        return WebBackendConnectionCreate(
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 46, in wrapped_init
        return fn(_self, *args, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model/web_backend_connection_create.py", line 345, in __init__
        setattr(self, var_name, var_value)
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 185, in __setattr__
        self[attr] = value
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 510, in __setitem__
        self.set_attribute(name, value)
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 157, in set_attribute
        value = validate_and_convert_types(
      File "/usr/local/lib/python3.9/site-packages/airbyte_api_client/model_utils.py", line 1582, in validate_and_convert_types
        raise get_type_error(input_value, path_to_item, valid_classes,
    airbyte_api_client.exceptions.ApiTypeError: Invalid type for variable 'geography'. Required value type is Geography and passed type was str at ['geography']

    Krzysztof

    11/21/2022, 2:12 PM
    Hi guys,

    Krzysztof

    11/21/2022, 2:12 PM
    https://github.com/airbytehq/airbyte/issues/19642

    Krzysztof

    11/21/2022, 2:14 PM
but the webapp requires CONNECTOR_BUILDER_API_HOST

    Regitze Sdun

    11/21/2022, 2:36 PM
Hi all, I set up a postgres-to-bigquery connection like this: https://airbyte.com/tutorials/postgres-to-bigquery and I wanted to extract data in an incremental way and load it using Deduped + history mode. However, when I set it up like that I get the following message:
    The form is invalid. Please make sure that all fields are correct.
    Any idea what I'm doing wrong?

    JP

    11/21/2022, 2:36 PM
    Hey, does anyone have any ideas regarding what seems like a permission issue during a BigQuery load:
    Exception attempting to access the Gcs bucket
    Stack Trace: com.amazonaws.services.s3.model.AmazonS3Exception: Access denied.
    Service account currently has
    "storage.multipartUploads.abort",
    "storage.multipartUploads.create",
    "storage.multipartUploads.list",
    "storage.multipartUploads.listParts",
    "storage.buckets.get",
    "storage.buckets.create",
    "storage.buckets.getIamPolicy",
    "storage.buckets.list",
    "storage.objects.create",
    "storage.objects.get",
    "storage.objects.getIamPolicy",
    "storage.objects.list",
    "bigquery.config.get",

    Hiep Minh Pham

    11/21/2022, 2:44 PM
Hi guys, initially I had the sync mode as Full refresh | Overwrite and I changed it to Incremental | Append in the UI (I did not choose to reset all streams as I need to keep historical data). However, the connector still gets full data rather than running an incremental sync. Is this a bug?

    Joviano Cicero Costa Junior

    11/21/2022, 3:32 PM
Hello everyone. I am trying to connect Airbyte to Snowflake but I am getting a "non-json response" error. Any tips on that?

    Cesar Santos

    11/21/2022, 6:20 PM
    Can someone please help me with this problem?

    Sapin Dahal

    11/21/2022, 7:47 PM
Hello, can someone help me with this issue? When I run python main.py check --config secrets/config.json I get:
    {"type": "LOG", "log": {"level": "ERROR", "message": "Check failed"}}
    {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "FAILED", "message": "'Unable to connect to stream action - '"}}
But when I run python main.py check --debug --config secrets/config.json I get a response and a connection success message:
    {"type": "LOG", "log": {"level": "INFO", "message": "Check succeeded"}}
    {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}

    Nelson Rafael Perez

    11/21/2022, 10:40 PM
Hello folks, I'm writing because I have the following situation: I am trying to move data from Google BigQuery to Google Firestore but I am getting this error:
    The datastore operation timed out, or the data was temporarily unavailable."
The BigQuery table has 2,240,000 records and each record has 30 fields; it is around 550 MB. I know that is a lot of data, but is there something I can do from the Airbyte side to avoid this situation?

    Adrian Bakula

    11/21/2022, 11:15 PM
Hey all, running Airbyte open source and we just upgraded to 0.40.18. Noticing that /v1/connections/search doesn't return results anymore, though it was working fine in 0.40.14. Anyone else experiencing this? Seems like no search parameters work.

    Mohammad Abu Aboud

    11/22/2022, 1:27 AM
Hi everyone, where can I select Full Refresh mode instead of Incremental | Append? It should be supported according to the documentation of QuickBooks and GCS on Airbyte. References: https://airbytehq.github.io/integrations/destinations/gcs/ https://docs.airbyte.com/integrations/sources/quickbooks/?__hstc=27854691.4f28cea016f[…]7&_ga=2.14207607.1321900125.1669074004-1069498585.1665326771

    Hao Kuang

    11/22/2022, 2:05 AM
Hello friends. Are there any plans to make MAX_ITERATION_VALUE configurable? We sometimes see init:Error when the source pod gets created, due to the 1-minute hard-coded timeout.

    PALLAPOTHU MANOJ SAI KUMAR

    11/22/2022, 6:04 AM
Hi friends, while we are trying to get Facebook Marketing data using the Facebook Marketing connector, we are not able to get the cost_per_conversion column (full permission was given to sync the data). Could anyone please let us know how to get it? Image version of the source Facebook Marketing connector: 0.2.72

    Faris

    11/22/2022, 7:03 AM
Hello world, I am facing an issue with the Postgres connector; both my source (an AWS RDS Postgres read replica) and my destination are AWS RDS Postgres. The connection will sync and then fail with the following message:
    Sync Failed
Last attempt: 67.92 MB | 513,973 emitted records | no records | 2m 44s
    Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
    
    2022-11-21 14:26:40 - Additional Failure Information: java.lang.RuntimeException: org.postgresql.util.PSQLException: FATAL: terminating connection due to conflict with recovery Detail: User query might have needed to see row versions that must be removed. Hint: In a moment you should be able to reconnect to the database and repeat your command.
source connector version 1.0.22, destination connector version 0.3.26. @Nataly Merezhuk (Airbyte) any direction on why this error happens? (I posted this question long ago but couldn't get any clues to solve it.)
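"terminating connection due to conflict with recovery" is a generic Postgres hot-standby error rather than an Airbyte one: the long-running sync query on the read replica conflicts with WAL replay from the primary. A common mitigation (a sketch to verify against your own RDS parameter group, values illustrative, not an Airbyte-specific fix) is to relax the standby conflict settings on the replica:

```ini
# postgresql.conf / RDS parameter group on the READ REPLICA
hot_standby_feedback = on             # replica tells the primary which row versions it still needs
max_standby_streaming_delay = 600000  # ms to delay WAL apply before cancelling queries (-1 = never cancel)
```

Note that hot_standby_feedback can increase bloat on the primary, and a large max_standby_streaming_delay increases replica lag during syncs; both are trade-offs rather than free fixes.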

    Rahul Borse

    11/22/2022, 7:19 AM
Hi all, while building the platform I am getting the error below (screenshot attached). I am using Node 16.15.1 and Gradle 7.5.1, the latest versions. Can someone please help? I am stuck here.