# ask-community-for-troubleshooting
  • Kevin Lin
    12/05/2022, 5:44 PM
    Can the Slack connector retrieve DM message history for a user? It doesn't seem to be supported by the connector (https://docs.airbyte.com/integrations/sources/slack#data-type-map), though it appears to be available via the Slack API.
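For context: the Slack Web API itself does expose DMs via `conversations.list` with `types="im"` plus `conversations.history`, even though the connector's stream list doesn't include them. A minimal sketch (the helper is hypothetical, not part of Airbyte; a real token needs the `im:read` and `im:history` scopes, and cursor pagination is omitted for brevity):

```python
def fetch_dm_history(client):
    """Collect message history for every DM conversation visible to the token.

    `client` is anything exposing Slack's conversations.list /
    conversations.history methods, e.g. a slack_sdk.WebClient built with
    a bot/user token. Pagination cursors are omitted here.
    """
    history = {}
    dms = client.conversations_list(types="im")  # "im" = direct-message channels
    for channel in dms["channels"]:
        resp = client.conversations_history(channel=channel["id"])
        history[channel["id"]] = resp["messages"]
    return history
```

With `slack_sdk`, this would be called as `fetch_dm_history(WebClient(token="xoxb-..."))`.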
  • Camilo Atencio
    12/05/2022, 7:29 PM
    Hi all! I'm trying to set up HPA for my Airbyte deployment using the Helm charts. I see there is a parameter worker.hpa.enabled which I set to true; however, I don't see this object being deployed, and I don't see any templates in the chart describing it. Is there anything I'm missing?
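An aside for anyone hitting the same gap: if the chart version in use ships no HPA template, a standalone HorizontalPodAutoscaler can be applied alongside the release. A minimal sketch, assuming the worker Deployment is named `airbyte-worker` and an 80% CPU target (both are placeholders; match them to your release):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: airbyte-worker
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: airbyte-worker   # match your release's actual worker Deployment name
  minReplicas: 1
  maxReplicas: 5
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 80
```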
  • Graeme Morrell
    12/05/2022, 8:03 PM
    Hey all, just a quick one. I set up a connection from Google Ads to Redshift and selected normalization into a table, but the raw data is left behind in another table. Is it normal for it to be left behind? I assume I would need to delete it in another step if we wanted. Is there a way to force the raw data into a different schema, for example, just to keep it separate from the final table?
  • Jordan Fox
    12/05/2022, 8:13 PM
    Has anyone using the azure blob storage destination updated the connector so an Incremental Append sync doesn't create a 0 byte blob if there's no new data? I'm currently scanning in the evening and removing any 0 byte blobs. For context, running an incremental append every 5 minutes will create 105,120 blobs/year, pointless if they're 0 bytes.
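Until the connector changes, the evening scan can be a short script that drops zero-byte blobs. A hedged sketch (the container client is duck-typed here; a real run would build an `azure.storage.blob.ContainerClient` from a connection string):

```python
def delete_empty_blobs(container_client):
    """Delete every blob whose size is 0 bytes and return their names.

    `container_client` is anything shaped like
    azure.storage.blob.ContainerClient: list_blobs() yields objects with
    .name and .size, and delete_blob(name) removes a single blob.
    """
    deleted = []
    for blob in container_client.list_blobs():
        if blob.size == 0:
            container_client.delete_blob(blob.name)
            deleted.append(blob.name)
    return deleted
```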
  • Aabi
    12/05/2022, 8:39 PM
    Hello everyone! Hope you're all doing well. I have this very same issue running Airbyte using kustomize on AWS EKS on version 0.40.21 https://github.com/airbytehq/airbyte/issues/18016 Has anyone had success trying to resolve the issue using s3? Thanks!
  • Chen Huang
    12/05/2022, 11:49 PM
    Has anybody experienced Airbyte closing the stream before the stream is finished? Details are included in the thread.
  • Jonathan Cachat PhD (JC)
    12/06/2022, 5:02 AM
    Facebook spend not matching Ads Manager? I am syncing from Facebook to BQ in two ways (same source), with BigQuery and BigQuery Denormalized destinations. Both have been providing data daily for weeks, with no problems in the sync. However, if I compare my monthly totals for Oct, Nov and Dec to those reported in the Facebook Ads Manager reporting interface, they are off. For example, in Facebook Ads Manager the Nov total is $3,069; in the denormalized and standard BigQuery tables (Ad Insights) my Nov total is $2,432.41 in both cases. What can account for this difference? I am using the _airbyte_emitted_at field within my WHERE statements as well. Any ideas?
  • Tmac Han
    12/06/2022, 6:28 AM
    I hit an error on my local machine using Docker Compose; I did not change anything.
  • Tmac Han
    12/06/2022, 6:39 AM
    more error: message: "Internal Server Error: Duplicate key 2602c50b-0bfa-4acc-868a-b608dbb0ce90 (attempted merging values io.airbyte.config.ActorCatalogFetchEvent@466598b2[id=<null>,actorId=2602c50b-0bfa-4acc-868a-b608dbb0ce90,actorCatalogId=2e459619-94db-4ca1-9df2-64ed09d6912f,configHash=<null>,connectorVersion=<null>,createdAt=1670302144] and io.airbyte.config.ActorCatalogFetchEvent@35e3ff9[id=<null>,actorId=2602c50b-0bfa-4acc-868a-b608dbb0ce90,actorCatalogId=2e459619-94db-4ca1-9df2-64ed09d6912f,configHash=<null>,connectorVersion=<null>,createdAt=1670302144])"
  • Sheshan
    12/06/2022, 6:44 AM
    Hi guys, can someone help me with this issue? I already raised a ticket: https://discuss.airbyte.io/t/incremental-sync-mode-doesnt-work-on-custom-destination/3340
  • HunterZhang
    12/06/2022, 7:12 AM
    https://cloud.airbyte.io/signup This URL cannot be opened; it returns 403 Forbidden.
  • HunterZhang
    12/06/2022, 7:14 AM
    Please help me solve this problem!
  • Phuc Dinh Minh
    12/06/2022, 7:37 AM
    Hi guys, I really need help! I cannot set up a new connection. It always shows this error:
    Internal Server Error: SQL [insert into "public"."operation" ("id", "workspace_id", "name", "operator_type", "operator_normalization", "operator_dbt", "operator_webhook", "tombstone", "created_at", "updated_at") values (cast(? as uuid), cast(? as uuid), ?, ?::"public"."operator_type", cast(? as jsonb), cast(? as jsonb), cast(? as jsonb), ?, cast(? as timestamp with time zone), cast(? as timestamp with time zone))]; ERROR: insert or update on table "operation" violates foreign key constraint "operation_workspace_id_fkey" Detail: Key (workspace_id)=(xxxxxxxxxxxxxx) is not present in table "workspace".
    I tried docker-compose down and deleting the airbyte-workspace volume, but it doesn't work. I checked the Airbyte logs, and airbyte-cron shows: DefinitionsUpdater(updateDefinitions):54 - Connector definitions update disabled. How can I fix this?!
  • Dheeraj Pranav
    12/06/2022, 8:14 AM
    Is anyone facing an issue while saving dbt normalization in the latest Airbyte version, 0.40.23?
  • Mihály Dombi
    12/06/2022, 8:45 AM
    Hello! Did anybody get this error when reading the activities stream from Facebook Marketing: "Please reduce the amount of data you're asking for, then retry your request"?
  • Avi Sagal
    12/06/2022, 10:06 AM
    Hi, I'm using a Google Analytics connection with basic normalization. For some reason or error, part of the data in the initial sync didn't go through the normalization step. How can I do a full normalization run? Running the sync again runs normalization only on the new records. Thanks!
  • Bhavya Verma
    12/06/2022, 11:09 AM
    How do I completely remove/delete a custom connector built in Airbyte?
  • Alfred Johnson
    12/06/2022, 11:23 AM
    Hey, any ideas on how to access custom dimensions in the GA4 connector? I'm unable to find good documentation. Let's say I have a custom dimension abc123 that I want to sync across?
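For context on custom dimensions: the GA4 source's custom reports option takes a JSON list of report definitions, and the GA4 Data API addresses event-scoped custom dimensions as `customEvent:<parameter_name>`. A hedged sketch (the report name and fields are illustrative, and the `customEvent:`/`customUser:` prefix depends on the dimension's scope; verify against your connector version):

```json
[
  {
    "name": "custom_dimension_report",
    "dimensions": ["date", "customEvent:abc123"],
    "metrics": ["eventCount"]
  }
]
```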
  • Florent
    12/06/2022, 11:34 AM
    Hello! Does anyone know whether the Postgres connector uses a single transaction when reading records from all the tables to ingest, or one transaction per table to ingest?
  • Beeshal Rizal
    12/06/2022, 12:05 PM
    Hi team, wondering if you could provide some assistance on how the MongoDB connector works in append-only mode. A few questions: 1. Are arrays supported? 2. How long until nested objects are supported?
  • Aayush Sharma
    12/06/2022, 12:13 PM
    Hey guys, I've been trying to set PostgreSQL as the source and extract the data to a CSV destination. While making the connection it says "connection refused to localhost:5432". I have edited the pg_hba.conf file to allow all IPv4 and IPv6 connections, but I'm still getting the same error.
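A likely cause worth noting: when Airbyte runs in Docker, `localhost` inside the source container is the container itself, not the machine running Postgres, so `pg_hba.conf` changes won't help; `host.docker.internal` (Docker Desktop) or the host's LAN IP usually does. A small hypothetical reachability check that can be run from inside a container:

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a plain TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# e.g. can_connect("host.docker.internal", 5432) from inside the container
```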
  • Stephan Mitterer
    12/06/2022, 12:38 PM
    Hi everyone, I'm very new to Airbyte and have the following question: is it possible to add something to the request header of the "file" connector? Instead of writing a new connector, I'd just like to add a GraphQL query to the header, as the API only requires an API key at the end of the URL. Thanks!
  • Faris
    12/06/2022, 2:06 PM
    Sync failed due to normalisation failure! Last attempt: 106.2 MB, 210,806 emitted records, 210,806 committed records, 1m 54s. Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.
    2022-12-05 21:45:37 normalization > Completed with 2 errors and 0 warnings:
    2022-12-05 21:45:37 normalization > Database Error in model employees_stg (models/generated/airbyte_incremental/jisr_rr/employees_stg.sql)
    2022-12-05 21:45:37 normalization >   time zone displacement out of range: "+105454-01-01"
    2022-12-05 21:45:37 normalization > Database Error in model payruns_stg (models/generated/airbyte_incremental/jisr_rr/payruns_stg.sql)
    2022-12-05 21:45:37 normalization >   time zone displacement out of range: "+20225-08-25T09:18:47.000000"
    2022-12-05 21:45:37 normalization >   CONTEXT:  parallel worker
    2022-12-05 21:45:37 normalization > Done. PASS=39 WARN=0 ERROR=2 SKIP=4 TOTAL=45
    2022-12-05 21:45:37 INFO i.a.w.g.DefaultNormalizationWorker(run):93 - Normalization executed in 47 seconds.
    2022-12-05 21:45:37 ERROR i.a.w.g.DefaultNormalizationWorker(run):101 - Normalization Failed.
    2022-12-05 21:45:37 INFO i.a.w.g.DefaultNormalizationWorker(run):106 - Normalization summary: io.airbyte.config.NormalizationSummary@7552a195[startTime=1670276689822,endTime=1670276737454,failures=[io.airbyte.config.FailureReason@50b1ff7[failureOrigin=normalization,failureType=system_error,internalMessage=time zone displacement out of range: "+105454-01-01",externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@552949c2[additionalProperties={attemptNumber=2, jobId=79, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    9 of 45 ERROR creating incremental model _airbyte_jisr_rr.employees_stg................................................. [ERROR in 1.08s]
    When trying to reproduce the error from the source tables (AWS Postgres), none of the date/timestamp columns threw this error. Did anyone face this issue?
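One hedged way to hunt for the offending rows before normalization runs: the failing values ("+105454-01-01", "+20225-08-25T09:18:47.000000") carry absurd years, so a scan of the raw values for out-of-range years can isolate them. The helper below is hypothetical, not part of Airbyte or dbt, and 9999 is just a sanity bound:

```python
import re

def flag_out_of_range_timestamps(values, max_year=9999):
    """Return the values whose leading year exceeds max_year.

    Matches ISO-like strings such as '2022-12-05' or '+105454-01-01';
    anything that doesn't start with a year is left alone.
    """
    suspect = []
    for v in values:
        m = re.match(r"\+?(\d{4,})-", v)
        if m and int(m.group(1)) > max_year:
            suspect.append(v)
    return suspect
```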
  • Will Callaghan
    12/06/2022, 3:01 PM
    Is there any way to see the tickets (and more generally the progress) associated with items on the Airbyte Roadmap? More specifically, I’m interested in seeing the progress on: • Column Selection • Auto-detect schema changes
  • KalaSai
    12/06/2022, 4:18 PM
    Hi, I am running into the error below while deploying Airbyte on Kubernetes. This is on airbyte/worker:0.40.23. I have checked everything but am not able to get around it. Any help is appreciated.
    Message: Index 1 out of bounds for length 1
    Path Taken: new ApplicationInitializer() --> ApplicationInitializer.checkConnectionActivities --> List.checkConnectionActivities([CheckConnectionActivity checkConnectionActivity]) --> new CheckConnectionActivityImpl(WorkerConfigs workerConfigs,ProcessFactory processFactory,SecretsHydrator secretsHydrator,Path workspaceRoot,WorkerEnvironment workerEnvironment,LogConfigs logConfigs,[AirbyteApiClient airbyteApiClient],String airbyteVersion,AirbyteMessageSerDeProvider serDeProvider,AirbyteMessageVersionedMigratorFactory migratorFactory)
    io.micronaut.context.exceptions.BeanInstantiationException: Error instantiating bean of type  [io.airbyte.workers.temporal.check.connection.CheckConnectionActivityImpl]
  • Ethan Brouwer
    12/06/2022, 5:03 PM
    Does anyone know what happens when you have multiple sources pointing to the same destination? Specifically in our case we want to set things as full_refresh/overwrite, but I don't know if things will overwrite each other.
  • Gopinath Sekar
    12/06/2022, 5:41 PM
    Hi. I'm trying to reverse-proxy my Airbyte open source instance on EC2 and I keep getting "challenge failed. No valid A records found for domain." Am I doing something wrong here, or should I reach out someplace else for this?
  • Krishna Elangovan
    12/06/2022, 6:47 PM
    Trying to deploy Airbyte into prod using EKS and getting stuck on this issue: https://github.com/airbytehq/airbyte/issues/18016. Is there an explanation of how to set up S3 as log storage?
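For reference, S3 log storage in the Kubernetes overlays of that era was driven by environment variables on the worker/server pods. A sketch of the relevant `.env` entries (variable names as they appeared around the 0.40.x kustomize overlays, with placeholder values; verify against the `.env` shipped with your version):

```
WORKER_LOGS_STORAGE_TYPE=S3
S3_LOG_BUCKET=your-log-bucket
S3_LOG_BUCKET_REGION=us-east-1
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
# leave the MinIO-specific settings empty when using real S3
S3_MINIO_ENDPOINT=
S3_PATH_STYLE_ACCESS=
```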
  • Kevin Peters
    12/06/2022, 8:09 PM
    Hi. I recently upgraded my K8s instance hosted on GKE to the latest version, v0.40.23, and now all of my connections have begun to fail after the normalization stage.
  • Maksym Humeniuk
    12/06/2022, 8:31 PM
    Hello, guys! I tried to create and run a connection between AWS S3 and ClickHouse (in Airbyte Cloud) and got this error during the sync:
    Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.
    I found the same issue on the Airbyte forum, but there is no solution there. Has anyone faced an issue like this?