# ask-community-for-troubleshooting
  • k

    Kevin Conseil

    06/27/2023, 11:49 AM
    Hi everyone, can we do basic transformations within Airbyte? I am thinking of removing rows in a Google Sheet to transform it into a grid so it is readable by Airbyte.
  • m

    Matheus Barbosa

    06/27/2023, 11:55 AM
    Please, someone help me: our company is having a big problem when trying to sync Google Ads data with ClickHouse, and these errors about Nested columns having different array sizes keep showing up. How can we avoid that and successfully sync this data with ClickHouse?
    Copy code
    20:17:09.821209 [error] [MainThread]: Database Error in model AW_campaigns_scd (models/generated/airbyte_incremental/scd/metrito_airbyte/AW_campaigns_scd.sql)
    20:17:09.821596 [error] [MainThread]:   :HTTPDriver for <https://rtswk5h81k.us-east-2.aws.clickhouse.cloud:8443> returned response code 500)
    20:17:09.821870 [error] [MainThread]:    Code: 190. DB::Exception: Elements 'campaign.excluded_parent_asset_field_types' and 'campaign.targeting_s__g.target_restrictions' of Nested data structure 'campaign' (Array columns) have different array sizes. (SIZES_OF_ARRAYS_DONT_MATCH) (
    20:17:09.822422 [error] [MainThread]: Database Error in model AW_display_topics_performance_report (models/generated/airbyte_tables/metrito_airbyte/AW_display_topics_performance_report.sql)
    20:17:09.822815 [error] [MainThread]:   :HTTPDriver for <https://rtswk5h81k.us-east-2.aws.clickhouse.cloud:8443> returned response code 500)
    20:17:09.823084 [error] [MainThread]:    Code: 190. DB::Exception: Elements 'ad_group_criterion.final_urls' and 'ad_group_criterion.topic.path' of Nested data structure 'ad_group_criterion' (Array columns) have different array sizes. (SIZES_OF_ARRAYS_DONT_MATCH) (version 23.5.1.34
    20:17:09.823501 [error] [MainThread]:   compiled Code at ../build/run/airbyte_utils/models/generated/airbyte_tables/metrito_airbyte/AW_display_topics_performance_report.sql,retryable=<null>,timestamp=1686946637696,additionalProperties={}]],additionalProperties={}]
    These are part of our logs.
  • e

    Erry Kostala

    06/27/2023, 12:31 PM
    @Serhii Chvaliuk [GL] would you be opposed to me taking over here (https://github.com/airbytehq/airbyte/pull/25599) to get the build to pass etc.? I'd be keen to see this get over the line.
  • h

    Haim Beyhan

    06/27/2023, 1:02 PM
    We are running a POC in Kubernetes. We're getting "Terminating due to java.lang.OutOfMemoryError: Java heap space" in the airbyte-server pod logs. Then we start getting 502 errors in the UI and cannot reach the server.
  • o

    Octavia Squidington III

    06/27/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, click here to join us on Zoom!
  • s

    Slackbot

    06/27/2023, 8:18 PM
    This message was deleted.
  • k

    klaus hofenbitzer

    06/27/2023, 11:33 PM
    I am using Airbyte 0.44.12. I have configured a new connector using the new Builder. This is an HTTPS connection to one of our manufacturing systems called OSI-PI. The created URL works fine with Postman, but in Airbyte I am getting the following error:
    Copy code
    (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
    Any way to switch off SSL verification using the Airbyte Builder?
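    A minimal sketch of what that certificate error means outside Airbyte, assuming the OSI-PI endpoint is served with a certificate signed by an internal CA; the URL and CA bundle path are placeholders, and whether the Builder itself exposes such an option is a separate question:
    Copy code
    # Hypothetical illustration using plain Python `requests` (the HTTP library the
    # Python CDK is built on). Endpoint URL and CA bundle path are placeholders.
    import requests

    url = "https://osi-pi.example.internal/piwebapi/streams"  # placeholder endpoint

    # This reproduces the connector's failure when the server certificate is signed
    # by an internal CA that the runtime's trust store does not know about:
    #   requests.get(url)  # -> SSLCertVerificationError: unable to get local issuer certificate

    # Option 1 (preferred): keep verification on, but point it at the internal CA bundle.
    resp = requests.get(url, verify="/path/to/internal-ca.pem", timeout=30)

    # Option 2 (insecure, for testing only): disable verification entirely.
    # resp = requests.get(url, verify=False, timeout=30)

    print(resp.status_code)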
  • e

    Ekansh Verma

    06/28/2023, 7:13 AM
    I am using Airbyte 0.43.0 and while setting up a DynamoDB source connector I encountered a KMS key error stating:
    The ciphertext refers to a customer master key that does not exist, does not exist in this region, or you are not allowed to access.
    However, when I try to access the tables with a boto3 client I can successfully list and scan tables. Any way to fix this?
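    A minimal diagnostic sketch, assuming boto3 credentials equivalent to the ones given to the connector; the region is a placeholder. It lists each table together with the KMS key it is encrypted with, which helps confirm whether the connector's key and region access match:
    Copy code
    # Hypothetical check: the "ciphertext refers to a customer master key that does
    # not exist" error usually points at a table encrypted with a customer-managed
    # KMS key in a different region or account than the credentials can reach.
    import boto3

    REGION = "us-east-1"  # placeholder: use the region configured in the connector
    ddb = boto3.client("dynamodb", region_name=REGION)

    for table_name in ddb.list_tables()["TableNames"]:
        table = ddb.describe_table(TableName=table_name)["Table"]
        sse = table.get("SSEDescription", {})
        print(table_name, sse.get("SSEType"), sse.get("KMSMasterKeyArn"))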
  • a

    Alessandro Pietrobon

    06/28/2023, 8:30 AM
    Hi team. Not sure if I'm in the right channel, but I have an Airbyte Cloud question. For our work we split the connections into several workspaces. For a new workspace created this week, though, I am no longer able to enroll in the free connector program: I just can't find the sign-up link within the Airbyte Cloud UI. Anyone facing the same problem? Or can anyone point me to the right channel to ask the question? Thanks. https://support.airbyte.com/hc/en-us/articles/15941851343003-Managing-Your-Credits
  • s

    Swathi Chakrabavi Ramamurthy

    06/28/2023, 9:07 AM
    Hello team, we are facing issues with two connectors. Can you please guide us if there is any alternative?
    1. Hubspot: We are unable to read the email_subscriptions stream. We are using a Private App and access has been provided to all the scopes mentioned here. The private app was created using admin access. Is there any way to get this table loaded?
    2. Outreach: We are trying to synchronize the mailing table, and when performing the load we get the error below. The sync mode is full refresh - overwrite.
    Request URL: <https://api.outreach.io/api/v2/mailings?page%5Bsize%5D=1000&count=false&sort=updatedAt>, Response Code: 502, Response Text: <html>
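    A minimal sketch for replaying that request outside Airbyte, to see whether the 502 comes from the Outreach API itself (for example at the large page size) or from the connector; the access token is a placeholder and the smaller page size is an experiment, not a documented fix:
    Copy code
    # Hypothetical reproduction of the failing mailings request with plain requests.
    import requests

    ACCESS_TOKEN = "YOUR_OUTREACH_OAUTH_TOKEN"  # placeholder

    resp = requests.get(
        "https://api.outreach.io/api/v2/mailings",
        params={"page[size]": 100, "count": "false", "sort": "updatedAt"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=60,
    )
    print(resp.status_code)
    print(resp.text[:500])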
  • s

    Slackbot

    06/28/2023, 9:25 AM
    This message was deleted.
  • r

    Rishav Sinha

    06/28/2023, 10:58 AM
    Is there any way to connect Azure AD?
  • v

    Vrushali Ghodake

    06/28/2023, 11:15 AM
    Hi everyone, I encountered an error while upgrading my Airbyte version from 0.35.53-alpha to v0.50.5 using Docker for deployment.
    Copy code
    ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Failed to inject value for parameter [secretPersistence] of method [secretsHydrator] of class: io.airbyte.config.persistence.split_secrets.SecretsHydrator
    
    Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
    Path Taken: new ApplicationInitializer() --> ApplicationInitializer.checkConnectionActivities --> List.checkConnectionActivities([CheckConnectionActivity checkConnectionActivity]) --> new CheckConnectionActivityImpl(WorkerConfigsProvider workerConfigsProvider,ProcessFactory processFactory,[SecretsHydrator secretsHydrator],Path workspaceRoot,WorkerEnvironment workerEnvironment,LogConfigs logConfigs,AirbyteApiClient airbyteApiClient,String airbyteVersion,AirbyteMessageSerDeProvider serDeProvider,AirbyteProtocolVersionedMigratorFactory migratorFactory,FeatureFlags featureFlags) --> SecretsHydrator.secretsHydrator([SecretPersistence secretPersistence])
    io.micronaut.context.exceptions.DependencyInjectionException: Failed to inject value for parameter [secretPersistence] of method [secretsHydrator] of class: io.airbyte.config.persistence.split_secrets.SecretsHydrator
    I found some references and based on the comments, I added the following configuration variables to my .env file:
    Copy code
    WORKER_LOGS_STORAGE_TYPE=MINIO
    WORKER_STATE_STORAGE_TYPE=MINIO
    AWS_ACCESS_KEY_ID={AWS_ACCESS_KEY_ID}
    AWS_SECRET_KEY={my_AWS_SECRET_KEY}
    I would appreciate any assistance in resolving this issue. Thank you!
  • a

    Abdelkarim EL-HAJJAMI

    06/28/2023, 11:54 AM
    Hello everyone. For context, we are trying to customize the destination-iceberg connector for our internal use (it would not be of interest to the community). As a first step, we built the connector image and pushed it to ECR without making any changes to the code. However, when we try to add it as a custom destination in Airbyte, the check step fails with the following message: RetryableException(Data read has a different checksum than expected. Was 0xd41d8cd98f00b204e9800998ecf8427e, but expected 0x748f1ee24f544f0b90b317a9fd8ccb70). Can you please guide me? Thank you!
  • l

    Luis Vicente

    06/28/2023, 12:31 PM
    I was checking the "Powered by Airbyte" webpage, and while it points to some documentation on how to use the headless mode, I'm confused about a couple of things. Through the API we can set up sources, destinations and connections; we can even configure streams. But the only options for dealing with schema changes are ignore or disable: I don't see support for the new modes that can handle schema changes as long as the destination supports it. The API also doesn't support triggering a schema discovery.
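    On the last point, a heavily hedged sketch: self-hosted OSS deployments also expose the internal Configuration API that the web UI uses for "refresh source schema", which is separate from the public API described above; the host, credentials, endpoint path, and response fields below are assumptions to verify against your own instance:
    Copy code
    # Hypothetical sketch: trigger schema discovery via the internal Configuration
    # API (not the public API). Host, basic-auth credentials, and field names are
    # assumptions/placeholders to check against your deployment.
    import requests

    AIRBYTE_URL = "http://localhost:8000"                  # placeholder host
    AUTH = ("airbyte", "password")                         # assumed default basic auth
    SOURCE_ID = "00000000-0000-0000-0000-000000000000"     # placeholder source id

    resp = requests.post(
        f"{AIRBYTE_URL}/api/v1/sources/discover_schema",
        json={"sourceId": SOURCE_ID, "disable_cache": True},
        auth=AUTH,
        timeout=300,
    )
    resp.raise_for_status()
    for entry in resp.json().get("catalog", {}).get("streams", []):
        print(entry["stream"]["name"])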
  • c

    Carolina Buckler

    06/28/2023, 2:16 PM
    Getting
    ERROR i.a.d.j.StreamingJdbcDatabase$1(tryAdvance):107 SQLState: S1000, Message: Error retrieving record: Unexpected Exception: java.io.EOFException message given: Can not read response from server. Expected to read 144 bytes, read 41 bytes before connection was unexpectedly lost.
    with the MySQL connector (v2.0.25) and Airbyte (v0.50.1). I've tried adding
    useCursorFetch=true&defaultFetchSize=1000&autoReconnect=true&validationQuery=SELECT 1&testOnBorrow=true
    in the JDBC URL Parameters (Advanced) and setting `JOB_MAIN_CONTAINER_MEMORY_REQUEST` in the env file to 8g, since it seems to happen after
    2023-06-28 13:09:24 INFO i.a.w.g.ReplicationWorkerHelper(processMessageFromSource):226 - Records read: 4400000 (1 GB)
    Any other recommendations on what to adjust?
  • l

    Luke

    06/28/2023, 5:51 PM
    Can someone please help me connect to S3? I keep getting an error saying the key does not exist:
    Copy code
    Could not connect to the S3 bucket with the provided configuration. The specified key does not exist. (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey; Request ID: WJV7VEXRTDQJNS9S; S3 Extended Request ID: YlpukOgdI+zNZpQkwRjzwXyGChRUob36XfnF+Eb/xjTl4pmwqnsgbnXtmnfOj7jJhwuNZSG/8BQ=; Proxy: null)
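    A minimal sketch, assuming the same credentials the connector uses, to list what actually exists under the configured prefix; NoSuchKey (404) usually means the bucket/path pattern in the source config does not match any real object key. Bucket and prefix names are placeholders:
    Copy code
    # Hypothetical check with boto3: confirm the bucket name has no "s3://" prefix
    # or trailing path, and that objects really exist under the expected prefix.
    import boto3

    BUCKET = "my-bucket"      # placeholder: bucket name only
    PREFIX = "exports/2023/"  # placeholder: the path prefix configured in the source

    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=20)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])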
  • l

    Luke

    06/28/2023, 5:52 PM
    image.png
  • l

    Luke

    06/28/2023, 5:52 PM
    image.png
  • l

    Luke

    06/28/2023, 6:04 PM
    Screenshot 2023-06-28 at 11.04.02 AM.png
  • o

    Octavia Squidington III

    06/28/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT, click here to join us on Zoom!
  • c

    Cesar Santos

    06/28/2023, 7:54 PM
    Hey folks! I'm currently running a connection that uses an S3 parquet file as the source and writes to Redshift. My problem is: Airbyte writes null for the timestamp columns. Have you seen this before? When I force the data type to
    varchar
    it does work, but I will need to cast this column back to timestamp later.
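    A minimal sketch, assuming a sample file downloaded from the S3 prefix, to inspect how the timestamp columns are physically typed in the Parquet schema (for example int96 vs. timestamp[us], with or without a timezone), since an unmapped type is one plausible reason the values arrive as null; file and column names are placeholders:
    Copy code
    # Hypothetical inspection of the Parquet schema with pyarrow.
    import pyarrow.parquet as pq

    PATH = "sample_from_the_s3_prefix.parquet"  # placeholder local copy of one file

    print(pq.read_schema(PATH))  # shows the declared type of each column

    table = pq.read_table(PATH, columns=["created_at"])  # placeholder column name
    print(table.column("created_at")[:5])                # peek at a few raw values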
  • h

    Haim Beyhan

    06/29/2023, 10:01 AM
    We're getting an error during the Reset process. What does it mean?
    Copy code
    io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (10129409 vs. 4194304)
  • h

    Haim Beyhan

    06/29/2023, 10:31 AM
    If I install the Helm package, how do I access/modify airbyte-config.yaml?
  • e

    Eddie Newitt

    06/29/2023, 11:19 AM
    Hey, I have a question about the Google Analytics (Universal) source connector. It only has a 'start date' field in the configuration UI, which is a problem because it means you can't specify a date range to export. The Google API seems to limit the data range to 14 months, so unless there is an option to specify a start/end range for an export, the connector is limited to the previous rolling 14 months of data. My data from before April 2022 is stranded. I am running an instance of Airbyte Open Source and I raised an issue in the repo: https://github.com/airbytehq/airbyte/issues/27661
  • o

    Octavia Squidington III

    06/29/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, click here to join us on Zoom!
  • a

    Alan Klein

    06/29/2023, 2:06 PM
    Hello, I see a source called {REST:API} on this page, but I cannot find it in the Airbyte UI under Sources. Any ideas?
  • w

    Walker Philips

    06/29/2023, 3:11 PM
    Getting a "Server temporarily unavailable (http.502.rRNRxZ28ZS5ko2wfkCny7D)" error when refreshing source schemas or attempting to view schema changes for a connection. Previously was working fine in a past airbyte version and is now broken after upgrading to 0.44.12
  • e

    Erik Webb

    06/29/2023, 4:37 PM
    I'm hitting this error when trying to fetch the schema via the GCS (Google Cloud Storage) connector. It seems like my permissions are right, since the connection test passed?
    Copy code
    Internal message: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte
    Failure type: system_error
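    A minimal sketch, assuming google-cloud-storage with default credentials, to peek at the first bytes of the object; a leading 0xff means the file is not plain UTF-8 text (0xff 0xfe is a UTF-16 byte-order mark, for example), and other first bytes reveal gzip (0x1f 0x8b) or Parquet ("PAR1"), any of which would explain a decode error even with correct permissions. Bucket and object names are placeholders:
    Copy code
    # Hypothetical check of the object's first bytes.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket("my-gcs-bucket").blob("path/to/problem-file.csv")  # placeholders
    head = blob.download_as_bytes(start=0, end=7)
    print(head)
    # b'\xff\xfe...' -> UTF-16 LE BOM,  b'\x1f\x8b...' -> gzip,  b'PAR1...' -> Parquet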
  • a

    Alan Klein

    06/29/2023, 5:51 PM
    Looks like the documentation site is having issues - https://docs.airbyte.com/