# ask-community-for-troubleshooting

    Robert Put

    10/28/2022, 2:38 PM
Is there some timeout on syncs? I've been running a sync from Stripe to Snowflake; twice now the initial sync has stopped after 72 hours:
```
2022-10-28 14:04:21 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
```
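A plausible culprit, assuming the Airbyte OSS defaults of that era: the `SYNC_JOB_MAX_TIMEOUT_DAYS` environment variable (set in the .env used by docker-compose) defaults to 3 days, and raising it lifts the cap. The arithmetic matches the symptom:

```python
from datetime import timedelta

# Airbyte OSS default job cap (assumption: SYNC_JOB_MAX_TIMEOUT_DAYS=3),
# expressed in hours -- it lines up with syncs dying at 72 hours.
print(timedelta(days=3).total_seconds() / 3600)  # 72.0
```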

    Alberto Aguilera

    10/28/2022, 2:41 PM
Hello! New here. I have a quick question. I was trying to access the API documentation through this page: https://docs.airbyte.com/api-documentation. It directs me to https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html, which returns a blank page. Where can I find the documentation for the API?
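While the hosted reference page is down, the same Config API can be poked at on a local OSS deployment. A minimal sketch, assuming the default port and the default basic-auth credentials (`airbyte`/`password`) that recent OSS versions ship with:

```python
import requests

# List workspaces via the OSS Config API; host and credentials are
# assumptions based on a default local docker-compose deployment.
resp = requests.post(
    "http://localhost:8000/api/v1/workspaces/list",
    json={},
    auth=("airbyte", "password"),  # default proxy basic auth (assumption)
)
resp.raise_for_status()
print(resp.json())
```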

    jonatan

    10/28/2022, 3:06 PM
Hello, good morning! Just getting started with Airbyte... it asks me for a username and password and I can't log in. Help!

    Kevin Phan

    10/28/2022, 4:06 PM
hello everyone! Updated to 0.40.16 and I am getting this error for the `airbyte-worker` pod:
```
io.micronaut.context.exceptions.DependencyInjectionException: Failed to inject value for parameter [secretPersistence] of method [secretsHydrator] of class: io.airbyte.config.persistence.split_secrets.SecretsHydrator

Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Path Taken: new ApplicationInitializer() --> ApplicationInitializer.syncActivities --> List.syncActivities([ReplicationActivity replicationActivity],NormalizationActivity normalizationActivity,DbtTransformationActivity dbtTransformationActivity,PersistStateActivity persistStateActivity,NormalizationSummaryCheckActivity normalizationSummaryCheckActivity,WebhookOperationActivity webhookOperationActivity) --> new ReplicationActivityImpl(Optional containerOrchestratorConfig,ProcessFactory processFactory,[SecretsHydrator secretsHydrator],Path workspaceRoot,WorkerEnvironment workerEnvironment,LogConfigs logConfigs,String airbyteVersion,FeatureFlags featureFlags,Integer serverPort,AirbyteConfigValidator airbyteConfigValidator,TemporalUtils temporalUtils,AirbyteApiClient airbyteApiClient) --> SecretsHydrator.secretsHydrator([SecretPersistence secretPersistence])
    at ...... (ABBREVIATED)
Stream closed EOF for airbyte/airbyte-worker-5586f985d7-hcg9l (airbyte-worker-container)
```
    Does anyone have any ideas? I am using kustomize.
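For later readers: the missing `@Named('secretPersistence')` bean suggests the worker resolved no secret-persistence backend at all. A heavily assumption-laden sketch of the likely knob — a `SECRET_PERSISTENCE` environment variable on the worker (the value name is taken from Airbyte's secret-persistence types of that era; verify against your version before relying on it):

```python
import os

# Hypothetical: the env var the worker's Micronaut context appears to key on.
# In kustomize this would live in the worker's env, not in Python; shown here
# only to name the variable and a plausible value.
os.environ.setdefault("SECRET_PERSISTENCE", "TESTING_CONFIG_DB_TABLE")
print(os.environ["SECRET_PERSISTENCE"])
```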

    Manish Tomar

    10/28/2022, 5:12 PM
I am not able to connect to my Postgres RDS instance using Airbyte Cloud, but I am able to connect from the Airbyte instance on my local machine.

    Coleman Kelleghan

    10/28/2022, 8:52 PM
Hello Airbyte, I have a question about the source connector Google Analytics (Universal Analytics) (airbyte/source-google-analytics-v4): We are connecting to multiple GA sources using this connector, with the cursor field “ga_date”. Because our sources are in various timezones, and we don’t have access to our sources’ GA admin API, we need to know the underlying GA account’s timezone setting to convert the ga_date to UTC. Is there an existing or recommended mechanism for exposing the timezone through the schema, the Airbyte API, metadata, or any other method? Thanks!
Edit: I should add that this is going to a Postgres destination.
Edit 2: My understanding is that the timezone is in a metadata property on the API responses from GA.
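On the conversion itself, once the property's timezone is known (e.g., from that metadata property): a minimal sketch in plain Python — `ga_date_to_utc` and the sample timezone are illustrative, not part of the connector:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def ga_date_to_utc(ga_date: str, property_tz: str) -> datetime:
    # ga_date arrives as YYYYMMDD, local to the GA property's timezone
    local_midnight = datetime.strptime(ga_date, "%Y%m%d").replace(
        tzinfo=ZoneInfo(property_tz)
    )
    return local_midnight.astimezone(timezone.utc)

print(ga_date_to_utc("20221028", "America/Los_Angeles"))  # 2022-10-28 07:00:00+00:00
```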

    Robert Put

    10/28/2022, 8:53 PM
Any tips on how to speed up a Stripe historical initial sync? It's taking over 72 hours... at about 1,000 records a minute.
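For scale, the numbers in the question work out to a seriously large backlog:

```python
# 1,000 records/minute sustained for 72 hours:
records = 1_000 * 60 * 72
print(f"{records:,} records")  # 4,320,000 records
```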

    Dan Cook

    10/28/2022, 10:02 PM
I've successfully fetched Google Search Console data for our site back as far as possible (500 days). But today is Oct 28 and my newest data has a date value of Oct 26. It seems that the GSC connector is always 1+ days behind "real time". Is that due to data prep going on within Google itself? My oldest data is exactly 500 days in the past: 2021-06-15, but I have 498 days' worth of data, with the 2 missing days being yesterday and today. 🤔
Software: Airbyte OSS (0.40.15)
Connector: Google Search Console (0.1.18)
Destination: Snowflake (0.4.38)
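The date arithmetic in the question checks out, which supports the "two trailing days missing" reading:

```python
from datetime import date, timedelta

# 500 days before 2022-10-28 is indeed 2021-06-15.
print(date(2022, 10, 28) - timedelta(days=500))  # 2021-06-15
```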

    Sistemas Cesumin

    10/29/2022, 7:11 AM
Hello everyone! I started my 14-day trial. We need to connect to Amazon SP-API, but it's proving quite difficult. We already have a registered app (Draft) in Amazon AWS (which we are already using within a Python script, extracting data without issues). So in Airbyte I'm trying to create an Amazon Seller Partner connection. After I set up all the app variables, I try to authenticate with the blue button, but in Amazon's popup window I get this message:
```
App ID: amzn1.application-oa2-client.ddc82db35bdc4a448a50cebd5d8fe744
Error Code: MD1000
```
The URL is: https://sellercentral.amazon.es/apps/authorize/consent?application_id=amzn1.application-oa2-client.ddc82db35bdc4a448a50cebd5d8fe744&redirect_uri=https%3A%2F%2Fcloud.airbyte.io%2Fauth_flow&state=GaLb5JM&version=beta&mons_sel_dir_mcid=amzn1.merchant.d.AAH6GE64QRLD4BN5SJGLZTG6OHLA&mons_sel_mkid=A1RKKUPIHCS9HS&mons_sel_dir_paid=amzn1.pa.d.AANB5EETGCQUGAUVPCSEI6VF3OLA&ignore_selection_changed=true&mons_redirect=change_domain
Any ideas? Thank you in advance!

    Sistemas Cesumin

    10/29/2022, 7:36 AM
I'm also trying to set up a MySQL connector (which should be easier), but when I click the blue button (Set up destination), after several seconds I get the message: The connection tests failed.
```
Could not connect with provided configuration. HikariPool-1 - Connection is not available, request timed out after 60007ms.
```
The same settings & credentials work without hassle in SQLyog, just tested... Any ideas? Thank you in advance!
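SQLyog connecting from a desktop doesn't prove that the machine (or container) running Airbyte can reach the database; a HikariPool timeout usually means the TCP connection never opened. A minimal reachability check to run from the Airbyte host — hostname and port are placeholders:

```python
import socket

# Raw TCP check: if this times out, the problem is network/firewall rules,
# not credentials. Substitute your MySQL endpoint.
with socket.create_connection(("mysql.example.com", 3306), timeout=10):
    print("TCP connection OK")
```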

    Mikhail Masyagin

    10/29/2022, 3:31 PM
Hello friends! I'm still trying to run Airbyte with two PostgreSQL databases, and it fails on the config database:
```
Migration of schema "public" to version "0.29.15.001 - Add temporalWorkflowId col to Attempts" failed!
...
Message: SQL [select * from airbyte_metadata where key = ?]; ERROR: relation "airbyte_metadata" does not exist
```
I'm using Airbyte VERSION=0.40.17 and two instances of PostgreSQL 13.7. The weirdest thing is that I'm running another Airbyte - 0.40.0-alpha - and it works correctly (on another two instances of PostgreSQL 13.7), and the table `airbyte_metadata` exists 🥲 Hope we'll find the solution!

    Sudhendu Pandey

    10/29/2022, 9:33 PM
Hi there! Wanted to confirm Airbyte's behavior in the below scenario:
Airbyte Version: 0.39.41-alpha
Source: PostgreSQL, connector version: 1.0.21
Destination: Snowflake, connector version: 0.4.38
Sync Mode: Tried both (Incremental | Dedup + History and Incremental | Append)
Replication Method: Tried both (cursor-based and CDC-based)
Transformation: Tried both (Raw Data and Normalized)
Behavior: If my sync frequency is every 30 minutes, I see that the Snowflake warehouse is started and a bunch of queries are executed EVEN IF THERE ARE NO UPDATES AT the PostgreSQL end. It shows 0 Bytes | no records | no records | 5m 35s | Sync.
Question: Why does the destination Snowflake instance need to be touched if there are no updates at the source end? It doesn't make any sense.
Drawback: For our use case, the source is updated very infrequently. We end up paying for 5 minutes of warehouse time every 30 minutes even though we hardly sync 100-200 records.
Am I doing something wrong here? Is my understanding correct? Thanks!
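The cost concern is easy to quantify from the numbers given:

```python
# ~5 minutes of warehouse time every 30 minutes, even with zero new records:
busy_fraction = 5 / 30
print(f"warehouse running {busy_fraction:.0%} of the time")  # 17%
```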

    Scott Chua

    10/30/2022, 9:30 AM
In the defined set of Jinja macros, which currently includes `now_local`, `now_utc`, `today_utc`, `timestamp`, `max`, and `day_delta`, is there a reason `today_local` was left out? Or `now_timezone` and `today_timezone` that take a timezone as an argument? 😄 If not, I'm happy to open that PR — a source I'm currently building a connector for returns data according to some preset, non-UTC timezone. 🙂 More broadly, is there any capability at the moment for Jinja macros to be defined at the connector level, rather than globally? 🤔 Many thanks!
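For reference, what the proposed `today_timezone` macro would compute, sketched in plain Python (the macro does not exist yet; the name comes from the proposal above):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def today_timezone(tz: str) -> str:
    # Today's date in an arbitrary, possibly non-UTC timezone.
    return datetime.now(ZoneInfo(tz)).date().isoformat()

print(today_timezone("Asia/Bangkok"))
```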

    Francisco Viera

    10/30/2022, 1:33 PM
Hello, is there a way to raise this limit? `io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (11993702 vs. 4194304)`
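Those two numbers are the payload size and gRPC's default 4 MiB maximum message size (here presumably enforced on the Temporal side):

```python
# How far over the default gRPC limit the message is:
print(11_993_702 / 4_194_304)  # ~2.86x the 4 MiB default
```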

    Sean Zicari

    10/30/2022, 2:34 PM
Greetings! I'm testing out Airbyte, setting up a Kafka -> Postgres sink. I notice I need to choose a replication frequency. Is this intentional? Kafka sinks are typically message-driven. Or is this a rough edge, or something else? Thanks!

    Duck Psy

    10/31/2022, 2:09 AM
Hi team, can I use MySQL (instead of PostgreSQL) as the database for Airbyte?

    Rahul Borse

    10/31/2022, 4:38 AM
Hi Team, I am not able to see Customer.io in the Airbyte OSS UI as a source connector, but the Airbyte documentation mentions that it is supported. Can someone help me out?
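If the connector is documented but not bundled in your OSS build, one option is registering it as a custom connector by Docker image. An assumption-laden sketch via the Config API — the endpoint and field names are taken from the OSS API of that era, and the image name is a placeholder to look up on the connector's docs page:

```python
import requests

# Hypothetical: register a source definition by Docker image.
resp = requests.post(
    "http://localhost:8000/api/v1/source_definitions/create",
    json={
        "name": "Customer.io",
        "dockerRepository": "<image-from-the-docs-page>",  # placeholder
        "dockerImageTag": "<tag>",                         # placeholder
        "documentationUrl": "https://docs.airbyte.com",
    },
    auth=("airbyte", "password"),  # default basic auth (assumption)
)
print(resp.status_code, resp.json())
```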

    Aazam Thakur

    10/31/2022, 9:41 AM
Hi team, I'm working on a connector using the Low-Code CDK and am so close to finishing it, with just one error in my way 🥹 How do I solve this error? `Assertion error: at least one record should be read from the catalog`
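That assertion comes from the standard acceptance tests: reading the configured catalog in the test config must yield at least one record. A debug sketch that runs the same read locally — the module name and file paths are placeholders for your connector:

```python
from airbyte_cdk.entrypoint import launch
from source_my_api import SourceMyApi  # placeholder: your connector's source

# Invoke the connector's `read` the same way `python main.py read ...` does
# and watch for RECORD messages on stdout.
launch(SourceMyApi(), [
    "read",
    "--config", "secrets/config.json",
    "--catalog", "integration_tests/configured_catalog.json",
])
```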

    Aazam Thakur

    10/31/2022, 9:43 AM
    Screenshot from 2022-10-31 15-10-56.png

    Tanasorn Chindasook

    10/31/2022, 10:54 AM
Hi team, I found a small issue in the data normalisation. We are currently working with a custom connector that is using the `full-refresh append` sync method. The issue is that in the `_airbyte_raw` (before normalisation) table there is only one record for each id emitted per day. However, there are duplicates on the `raw_` (after normalisation) table based on the `_airbyte_normalized_at` field. For some reason, records that are emitted on two different days are being normalised on the same day. Could someone help provide us with some insight into why this behaviour is occurring and what we can do to fix it? Please see the attached screenshot of the duplication based on `_airbyte_normalized_at`. Data has been obscured for privacy purposes. Thank you so much in advance!

    Timo Hartmann

    10/31/2022, 12:53 PM
Hi team, a sync from a custom source (http) to an S3 destination cancelled itself. It was triggered by Airflow's `AirbyteTriggerSyncOperator`. The logs (see below for the entire log) do not contain any info on why it was cancelled or what cancelled it. The warnings about the JSON schema validation can be ignored, as they appear during every sync (even the successful ones) and have nothing to do with the cancellation issue. I would greatly appreciate your help, since I cannot explain this (it has happened twice so far).
```
Cancelled
NaN Bytes | no records | no records | 10s | Sync
Failure Origin: Unknown, Message: This attempt was cancelled
7:03AM 10/29
2022-10-29 05:03:47 - Additional Failure Information: Setting attempt to FAILED because the job was cancelled

2022-10-28 05:03:33 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-10-28 05:03:33 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.format_type: does not have a value in the enumeration [Avro], $.compression_codec: string found, object expected, $.compression_codec: should be valid to one and only one of the schemas
2022-10-28 05:03:33 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword requires - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-10-28 05:03:33 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.flattening: is missing but it is required, $.format_type: does not have a value in the enumeration [CSV]
2022-10-28 05:03:33 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.format_type: does not have a value in the enumeration [JSONL]
2022-10-28 05:03:33 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-10-28 05:03:34 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/321/0/logs.log
2022-10-28 05:03:34 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.40.1
2022-10-28 05:03:34 INFO i.a.c.i.LineGobbler(voidCall):83 - Checking if 134983622818.dkr.ecr.eu-central-1.amazonaws.com/airbyte/source-<THESOURCE>:0.1.0 exists...
2022-10-28 05:03:34 INFO i.a.c.i.LineGobbler(voidCall):83 - 134983622818.dkr.ecr.eu-central-1.amazonaws.com/airbyte/source-<THESOURCE>:0.1.0 was found locally.
2022-10-28 05:03:34 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 321
2022-10-28 05:03:34 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/321/0 --log-driver none --name source-<THESOURCE>-check-321-0-furce --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=134983622818.dkr.ecr.eu-central-1.amazonaws.com/airbyte/source-<THESOURCE>:0.1.0 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.1 -e WORKER_JOB_ID=321 134983622818.dkr.ecr.eu-central-1.amazonaws.com/airbyte/source-<THESOURCE>:0.1.0 check --config source_config.json
2022-10-28 05:03:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):99 - Check succeeded
2022-10-28 05:03:40 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-10-28 05:03:40 INFO i.a.w.t.s.a.StreamResetActivityImpl(deleteStreamResetRecordsForJob):37 - deleteStreamResetRecordsForJob was called for job 321 with config type sync. Returning, as config type is not resetConnection.
2022-10-28 05:03:40 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.format_type: does not have a value in the enumeration [Avro], $.compression_codec: string found, object expected, $.compression_codec: should be valid to one and only one of the schemas
2022-10-28 05:03:40 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
```
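One place to look, since the sync was triggered externally: the Airflow operator can cancel the Airbyte job when the Airflow task itself is killed or gives up waiting, so the cancellation may originate on the Airflow side (check your provider version's `on_kill`/timeout behavior). A sketch of the trigger with the wait limit made explicit — the connection id is a placeholder:

```python
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

# `timeout` is how long the operator waits for the sync before failing the task.
trigger_sync = AirbyteTriggerSyncOperator(
    task_id="airbyte_sync_source_to_s3",
    airbyte_conn_id="airbyte_default",
    connection_id="<your-connection-uuid>",  # placeholder
    asynchronous=False,
    timeout=3600,     # seconds to wait for the Airbyte job
    wait_seconds=3,   # polling interval
)
```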

    Yoan Yahemdi

    10/31/2022, 1:14 PM
Hello guys, I'm still getting the deprecated-version error on the Facebook source connector for open-source Airbyte, despite the recent Facebook SDK update. Can someone help?

    Michal Moravik

    10/31/2022, 1:26 PM
    Hi everyone, I was just looking into the Postgres source connector and noticed there is this limitation mentioned under the “Limitations” section:
    • The Postgres source connector currently does not handle schemas larger than 4MB.
    What is this size referring to? I hope this does not include the actual data size😄

    Tarak dba

    10/31/2022, 2:01 PM
Hi Guys, I am unable to see the cursor field for incremental replication with append. I am using MySQL as the source, and the replication type is CDC.

    Francisco Viera

    10/31/2022, 2:04 PM
Hello, how can I specify the BigQuery credentials JSON with octavia-cli?
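One recurring gotcha here, assuming the BigQuery destination's `credentials_json` field: the value must be the whole service-account key as a single JSON string. A small helper to produce a paste-ready value (the file name is a placeholder):

```python
import json

# Turn a service-account key file into one escaped JSON string, suitable for
# embedding as a scalar value in octavia's YAML.
with open("service-account.json") as f:
    key = json.load(f)

print(json.dumps(json.dumps(key)))  # quoted + escaped single line
```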

    Espoir Murhabazi

    10/31/2022, 2:04 PM
    Hello, I am having this issue about a connector I am building. https://discuss.airbyte.io/t/save-cursor-field-on-a-persistent-storage/3036

    Espoir Murhabazi

    10/31/2022, 2:04 PM
    If someone can help..
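For anyone landing on the linked question: in the Python CDK the platform persists whatever a stream exposes as `state` (emitted as STATE messages, handed back on the next sync), so the connector needs no storage of its own. A minimal sketch with `IncrementalMixin` — class and field names are placeholders:

```python
from typing import Any, Iterable, Mapping
from airbyte_cdk.sources.streams import IncrementalMixin, Stream

class MyStream(Stream, IncrementalMixin):  # placeholder name
    primary_key = "id"
    cursor_field = "updated_at"

    def __init__(self):
        self._state: Mapping[str, Any] = {}

    @property
    def state(self) -> Mapping[str, Any]:
        return self._state  # emitted as a STATE message after each batch

    @state.setter
    def state(self, value: Mapping[str, Any]):
        self._state = value  # injected by the platform when a sync starts

    def read_records(self, sync_mode, cursor_field=None, stream_slice=None,
                     stream_state=None) -> Iterable[Mapping[str, Any]]:
        # fetch only records newer than self._state.get("updated_at"), then
        # update self.state as records are yielded
        yield from []
```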

    Zaza Javakhishvili

    10/31/2022, 3:14 PM
Hi guys, before this issue gets done, any idea how I can easily identify the stream sequence?

    Patrik Deke

    10/31/2022, 3:51 PM
Hi guys, quick question: how can I export/import the configuration of Airbyte? I've seen some posts saying I could export the config via the UI, and I've turned on "Advanced mode" in the settings (using Airbyte version 0.40.17), but the configuration window looks as follows => only download of the logs is possible, not of the configuration. I've started Airbyte locally via docker-compose, as-is from GitHub. Do I need to set any env variables to access this functionality?

    Tarak dba

    10/31/2022, 4:06 PM
Hi Folks, can Airbyte support the below?
1. Column-level filtering to the destination (only specific columns should replicate from source to destination).
2. If we add/drop any column in the source, will it add/drop in the destination?
3. Renaming a column from source to destination?
4. Changing a column's data type from source to destination?
5. Renaming a table from source to destination?
NOTE: My source DB is MySQL and my destination DB is Google BigQuery.