# ask-community-for-troubleshooting

    Ashish Singh

    10/11/2022, 2:02 PM
    Hi All! New to this channel. We have set up Airbyte in a local environment on a Linux machine. However, there is no authentication/user-management system in Airbyte. We would like to set up a secure login screen for anyone who tries to open Airbyte. I have tried to look for solutions on the internet but could not come across anything reliable. Appreciate any help. Cheers // Ashish
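    A common workaround (not a built-in Airbyte feature) is to put a reverse proxy with HTTP basic auth in front of the web UI. A minimal sketch, assuming nginx, a hypothetical airbyte.example.com host, and Airbyte listening on localhost:8000:
    ```
    # /etc/nginx/conf.d/airbyte.conf - hedged example; adjust host/port to your setup
    server {
        listen 80;
        server_name airbyte.example.com;

        location / {
            auth_basic           "Airbyte";
            # create the password file with: htpasswd -c /etc/nginx/.htpasswd <user>
            auth_basic_user_file /etc/nginx/.htpasswd;
            proxy_pass           http://127.0.0.1:8000;
            proxy_set_header     Host $host;
        }
    }
    ```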

    Eduardo Aviles

    10/10/2022, 5:59 AM
    Hi guys, I'm trying to do a full load from a MySQL source to a Postgres destination. I've done a lot of syncs with small tables and they take like 3 minutes to load 115,000 rows (44.6 MB). But when I do some large tables, the sync process just hangs there: one screenshot says 23 and the other screenshot shows that it hangs at 68 GB. It took 16 hours to get 68 GB. I saw that there are some threads and posts on the forum talking about connection performance, but I couldn't find anything that helps. Thanks

    Michael Cooper

    10/11/2022, 3:18 PM
    Hello! I’m still encountering an issue with connector scheduling. None of my connections will sync unless manually triggered, despite having a 24-hour sync frequency. The frequency is set via the dropdown menu in the web UI. This is running Airbyte 0.37.1-alpha.

    Dylan Pierce

    10/11/2022, 3:20 PM
    I'd like to participate in the Hackathon. I have opened an issue for a specific API that I’m familiar with and would like to try building a connector. Could I get a confirmation from an Airbyte team member to start? https://github.com/airbytehq/connector-contest/issues/132

    Nishant George

    10/10/2022, 4:55 PM
    Hi folks - I've got Airbyte set up on AWS EC2 to replicate HubSpot > Postgres every 5 minutes. My problem is, the 40 GB disk fills up every couple of days. I've been running this command manually in the AWS EC2 console every 1-2 days to clear out useless logs:
    ```
    find /var/lib/docker/volumes/airbyte_workspace/_data/ -maxdepth 1 -regex '/var/lib/docker/volumes/airbyte_workspace/_data/[0-9]+' -mtime +10 | xargs rm -rf
    ```
    Does anyone have a more hands-free solution? (Why doesn't Airbyte manage its disk space better out of the box...?)
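    One hands-free option is simply to run that same cleanup from cron. A minimal sketch, assuming root access and the same path and 10-day retention as the command above:
    ```
    # /etc/cron.d/airbyte-workspace-cleanup - runs daily at 03:00 as root
    0 3 * * * root find /var/lib/docker/volumes/airbyte_workspace/_data/ -maxdepth 1 -regex '/var/lib/docker/volumes/airbyte_workspace/_data/[0-9]+' -mtime +10 -exec rm -rf {} +
    ```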

    Steven Herweijer

    10/11/2022, 6:09 PM
    Hi, I've set up Airbyte and I'm liking it so far. One thing bothers me though: when moving data from Postgres to another Postgres, table names on the destination are getting shortened to well below Postgres's maximum table-name length. Why is this? Can I turn this off and make it use the same table name as the source?

    claudio viera

    10/11/2022, 7:02 PM
    Does the Magento source not exist?

    Teri Eyenike

    10/11/2022, 8:58 PM
    I have followed the docs on EC2, but I am stuck at connecting from my local machine to Airbyte on the instance. How can I make this work from my terminal? https://docs.airbyte.com/deploying-airbyte/on-aws-ec2/#connect-to-airbyte
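    The step that page covers is an SSH tunnel forwarding the UI port to your local machine. A minimal sketch, assuming an Amazon Linux instance (user ec2-user) and Airbyte listening on port 8000, with $SSH_KEY and $INSTANCE_IP standing in for your key file and instance address:
    ```
    # Forward local port 8000 to the instance, then open http://localhost:8000
    ssh -i $SSH_KEY -L 8000:localhost:8000 ec2-user@$INSTANCE_IP
    ```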

    Jordan Young

    10/11/2022, 9:25 PM
    Hello! I'm writing a spec.yaml file and was wondering what setting `airbyte_secret: true` does to an input property? Thanks
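    For context, `airbyte_secret: true` marks a property as sensitive: the UI masks the value and Airbyte stores and handles it as a secret rather than plain config. A minimal sketch of a spec fragment (the `api_key` property name is illustrative):
    ```
    connectionSpecification:
      type: object
      properties:
        api_key:
          type: string
          title: API Key
          airbyte_secret: true   # masked in the UI and handled as a secret
    ```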

    Wilter Yee

    10/11/2022, 10:45 PM
    Hi there! I just recently set up Airbyte and I’m trying to set my connector (GA4 -> Snowflake) to replicate every day at 12pm UTC via the cron option in the UI (`0 0 12 * * ?`), and every time I hit Save Changes, it reverts back to Manual. When I select another option in the dropdown and save, it applies properly. Does anyone know why that’s happening?

    gunu

    10/12/2022, 2:51 AM
    Has anyone here successfully used Service Account Key Authentication for the Google Search Console source connector?

    Zachary Damcevski

    10/12/2022, 3:52 AM
    Hey all, does anyone know whether, with incremental loads, the first sync retrieves all the data back to the start date? Then on subsequent runs it uses the `last_sync_max_cursor_field_value` - what happens if the previous sync failed for any reason? Does it take the last sync time of a successful run? https://docs.airbyte.com/understanding-airbyte/connections/incremental-append#known-limitations
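    A minimal sketch of the cursor behavior as that doc describes it (illustrative names; the key point is that state is only committed when a sync succeeds, so a failed attempt is retried from the last committed cursor):
    ```
    # Hedged sketch of incremental-append cursor behavior (illustrative names)
    from typing import Optional

    def next_read_lower_bound(saved_state: Optional[dict], start_date: str) -> str:
        """Lower bound on the cursor field for the next sync."""
        if saved_state is None:
            # First sync, or no state ever committed: read everything from start_date
            return start_date
        # State is committed only on success, so after a failed attempt this
        # still holds the last *successful* sync's max cursor value.
        return saved_state["last_sync_max_cursor_field_value"]

    print(next_read_lower_bound(None, "2022-01-01"))
    # -> 2022-01-01
    print(next_read_lower_bound({"last_sync_max_cursor_field_value": "2022-10-11"}, "2022-01-01"))
    # -> 2022-10-11
    ```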

    gunu

    10/12/2022, 4:34 AM
    Can someone please help me set up a local way of getting Google Search Console source OAuth credentials (access and refresh tokens), e.g. Python code with a client? Existing OAuth clients have been affected: "October 3, 2022 - the OOB flow is deprecated for OAuth clients created before February 28, 2022" https://developers.google.com/identity/protocols/oauth2/resources/oob-migration
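    One OOB-free approach (an assumption, not something Airbyte documents) is the local-server flow in `google-auth-oauthlib`, which is Google's suggested replacement for OOB. This sketch assumes a client_secret.json downloaded from your OAuth client and the Search Console read-only scope:
    ```
    # pip install google-auth-oauthlib
    from google_auth_oauthlib.flow import InstalledAppFlow

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

    # Opens a browser and runs a temporary local redirect server instead of OOB
    flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)
    creds = flow.run_local_server(port=8080, access_type="offline", prompt="consent")

    print("access token: ", creds.token)
    print("refresh token:", creds.refresh_token)
    ```
    The printed tokens should be what the connector's OAuth credential fields expect.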

    Santiago Stachuk

    10/12/2022, 5:03 AM
    Greetings friends! I want to modify the Google Ads connector: should I go for a "`low-code cdk`" approach, or should I just modify the existing one?

    Satya Varma

    10/12/2022, 6:12 AM
    Hi All, I have been using Airbyte to sync data from Salesforce to Redshift. We are facing an issue of missing data when there is a huge load, and we found a way to backfill the data by updating the status in the state table. But I don't know how to also pass an end time so we backfill only the required time range rather than backfilling all the data from the starting point. If anyone has any idea/lead, please let me know. Thanks in advance

    sonti srihari

    10/12/2022, 7:27 AM
    👋 Hello, team! We plan to do a POC on Airbyte, so I started installing Airbyte on my local computer using Docker Compose.

    sonti srihari

    10/12/2022, 7:40 AM
    When I try to sign in with Google/GitHub, I get the error below: "*An unknown error happened during sign in: auth/internal-error*"

    Huib

    10/12/2022, 9:58 AM
    Hi! I’m trying to sync data from MSSQL (CDC) to S3. When I set the sync to be a full sync, it works fine, but as soon as I switch it to incremental, it fails with the following message:
    ```
    2022-10-12 09:53:41 - Additional Failure Information: tech.allegro.schema.json2avro.converter.AvroConversionException: Failed to convert JSON to Avro: Could not evaluate union, field Location is expected to be one of these: NULL, DOUBLE. If this is a complex type, check if offending field (path: Location) adheres to schema: 9.447
    ```

    Felipe Soares Costa

    10/12/2022, 11:40 AM
    Hi there! We are trying to implement Airbyte in our company right now. Our use case is some MySQL sources with a lot of events coming in, and we are using CDC to replicate the changes to BigQuery. We create different connections from the sources (basically each table is a connection) to the destination. However, we are getting a lot of failures related to binlogs not being available ("Binlog mbin.017786 is not available. This is a critical error, it means that requested binlog is not present on mysql server"). Some jobs complete successfully after a few attempts and others don't. We already have a long retention policy, so I don't think that is the problem. Looking at the logs, the normalization process completes successfully after the error, but the state is still failed. Are we doing something wrong on our side? Is it OK to have multiple connections to the same source running in parallel?
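    As a sanity check on the retention theory, it may be worth confirming what the server itself reports (standard MySQL statements; the variable name differs between MySQL 5.7 and 8.0):
    ```
    -- MySQL 8.0 names it binlog_expire_logs_seconds; 5.7 uses expire_logs_days
    SHOW VARIABLES LIKE 'binlog_expire_logs_seconds';
    SHOW VARIABLES LIKE 'expire_logs_days';
    -- List the binlog files still on the server; the one from the error should appear here
    SHOW BINARY LOGS;
    ```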

    Shashank Tiwari

    10/12/2022, 12:27 PM
    Can someone help me with the values.yaml file for deploying Airbyte on EKS using Helm charts?

    Ramon Vermeulen

    10/12/2022, 1:35 PM
    Getting an error when upgrading Airbyte on k8s from 0.40.2 to 0.40.14 in the `aibyte-chart-airbyte-worker` pod with the Helm release:
    ```
    Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
    ```
    Seems the same problem as https://airbytehq.slack.com/archives/C021JANJ6TY/p1665149370573089? UPDATE: Thanks to @lucien's answer I know he was running 0.40.6 without problems, and upgrading from 0.40.2 to 0.40.6 succeeded. So this somewhat confirms the problem was introduced between 0.40.6 and 0.40.13. UPDATE 2: Could it be something related to this commit in version 0.40.10? https://github.com/airbytehq/airbyte/commit/9abcbadd9316d3017a4573bc195f44e15b5a0dfb

    Mitch Eccles

    10/12/2022, 2:59 PM
    Hi 🙂. I'm using the Airbyte API in a workflow to automate setting up connections between a Google Ads source and a Postgres destination. To set up my connection I'm making a POST request to the endpoint api/v1/connections/create and I'm receiving an error: `Errors: supply old or new schedule schema but not both`. I'm not sure why I am seeing this error, as I have basically copied the schedule schema from the documented example in the API documentation. My schedule looks like this:
    ```
    "schedule": {
        "units": 24,
        "timeUnit": "hours"
      },
      "scheduleType": "basic",
      "scheduleData": {
        "basicSchedule": {
          "timeUnit": "hours",
          "units": 24
        }
      },
    ```
    What am I doing wrong? And where can I find documentation on the old and new schedule schemas? I'm using Airbyte 0.40.4.
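    Reading the error literally, the request contains both the legacy top-level `schedule` block and the newer `scheduleType`/`scheduleData` pair; dropping one of them, e.g. keeping only the new schema, may be all that's needed. A hedged sketch of the fragment:
    ```
    {
      "scheduleType": "basic",
      "scheduleData": {
        "basicSchedule": {
          "timeUnit": "hours",
          "units": 24
        }
      }
    }
    ```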

    Mohit Reddy

    10/12/2022, 3:31 PM
    Hi, we are using Airbyte in our Kubernetes clusters with a connection from a BigQuery source (version 0.1.8, I think) to a Kafka destination (0.1.8, I think). Airbyte is set up with version 0.39.1. The jobs run on large machines (16 CPU, 32 GB) because we previously noticed that on large batches the containers were being killed by the OOM reaper. The source table has ~100M records. We run into non-deterministic errors (each attempt failed with a different amount of data read): it seems it is not reading all the data but still logs "Source has no more messages, closing connection." I can confirm that the source pod is eventually stopped, but is that the cause of the failure here? The following logs are from attempt 3; the initial attempts failed at 2M and 90M records read.
    ```
    2022-10-12 15:17:22 [32mINFO[m i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):334 - Records read: 4587000 (655 MB)
    2022-10-12 15:17:31 [32mINFO[m i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):337 - Source has no more messages, closing connection.
    2022-10-12 15:17:33 [32mINFO[m i.a.w.p.KubePodProcess(close):713 - (pod: t-107 / source-bigquery-read-4529-2-irywx) - Closed all resources for pod
    2022-10-12 15:17:33 [32mINFO[m i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $: null found, object expected
    2022-10-12 15:17:33 [1;31mERROR[m i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null
    2022-10-12 15:17:34 [32mINFO[m i.a.w.p.KubePodProcess(close):713 - (pod: t-107 / destination-kafka-write-4529-2-wnxxp) - Closed all resources for pod
    2022-10-12 15:17:34 [1;31mERROR[m i.a.w.g.DefaultReplicationWorker(run):177 - Sync worker failed.
    java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:170) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at java.lang.Thread.run(Thread.java:833) [?:?]
    	Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    		at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:134) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    		at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:341) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    	... 1 more
    Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:339) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    	... 1 more
    2022-10-12 15:17:34 [32mINFO[m i.a.w.g.DefaultReplicationWorker(run):236 - sync summary: io.airbyte.config.ReplicationAttemptSummary@613c6c00[status=failed,recordsSynced=4587950,bytesSynced=687407575,startTime=1665586308868,endTime=1665587854669,totalStats=io.airbyte.config.SyncStats@4fe132af[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@2251ad00[streamName=fennel_impressions,stats=io.airbyte.config.SyncStats@5ffe54a3[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=<null>,recordsCommitted=<null>]]]]
    2022-10-12 15:17:34 [32mINFO[m i.a.w.g.DefaultReplicationWorker(run):265 - Source did not output any state messages
    2022-10-12 15:17:34 [33mWARN[m i.a.w.g.DefaultReplicationWorker(run):273 - State capture: No new state, falling back on input state: io.airbyte.config.State@47f56920[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}]
    2022-10-12 15:17:34 [32mINFO[m i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
    2022-10-12 15:17:34 [32mINFO[m i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):157 - sync summary: io.airbyte.config.StandardSyncOutput@5f71a285[standardSyncSummary=io.airbyte.config.StandardSyncSummary@47bc3f11[status=failed,recordsSynced=4587950,bytesSynced=687407575,startTime=1665586308868,endTime=1665587854669,totalStats=io.airbyte.config.SyncStats@4fe132af[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@2251ad00[streamName=fennel_impressions,stats=io.airbyte.config.SyncStats@5ffe54a3[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=<null>,recordsCommitted=<null>]]]],normalizationSummary=<null>,state=io.airbyte.config.State@47f56920[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@3d45e59d[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@1aaa2f0[stream=io.airbyte.protocol.models.AirbyteStream@539d5dca[name=fennel_impressions,jsonSchema={"type":"object","properties":{"post_id":{"type":"number"},"user_id":{"type":"number"},"added_on":{"type":"number"},"event_name":{"type":"string"},"event_timestamp":{"type":"number"},"time_spent_secs":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=impression_dataset,additionalProperties={}],syncMode=incremental,cursorField=[added_on],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@757fb41f[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@6704de20[additionalProperties={attemptNumber=2, jobId=4529}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    	at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    	at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    	at java.base/java.lang.Thread.run(Thread.java:833)
    Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:341)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    	... 3 more
    Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136)
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:339)
    	... 4 more
    ,retryable=<null>,timestamp=1665587853062]]]
    2022-10-12 15:17:34 [32mINFO[m i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
    2022-10-12 15:17:34 [32mINFO[m i.a.c.p.ConfigRepository(updateConnectionState):774 - Updating connection 1a755743-0802-4dea-a026-29071d0c55db state: io.airbyte.config.State@2e6b37f2[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}]
    2022-10-12 15:17:13 [44msource[0m > 2022-10-12 15:17:13 [32mINFO[m i.a.i.s.r.AbstractDbSource(lambda$createReadIterator$7):250 - Reading stream fennel_impressions. Records read: 4530000
    ...
    2022-10-12 15:17:21 [44msource[0m > 2022-10-12 15:17:20 [32mINFO[m i.a.i.s.r.AbstractDbSource(lambda$createReadIterator$7):250 - Reading stream fennel_impressions. Records read: 4580000
    2022-10-12 15:17:31 [44msource[0m > 2022-10-12 15:17:31 [32mINFO[m i.a.i.s.r.AbstractDbSource(lambda$read$2):124 - Closing database connection pool.
    2022-10-12 15:17:31 [44msource[0m > 2022-10-12 15:17:31 [32mINFO[m i.a.i.s.r.AbstractDbSource(lambda$read$2):126 - Closed database connection pool.
    2022-10-12 15:17:31 [44msource[0m > Exception in thread "main" com.google.cloud.bigquery.BigQueryException: www.googleapis.com
    2022-10-12 15:17:31 [44msource[0m > 	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
    2022-10-12 15:17:31 [44msource[0m > 	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.listTableData(HttpBigQueryRpc.java:514)
    2022-10-12 15:17:31 [44msource[0m > 	at com.google.cloud.bigquery.BigQueryImpl$29.call(BigQueryImpl.java:1129)
    2022-10-12 15:17:31 [44msource[0m > 	at com.google.cloud.bigquery.BigQueryImpl$29.call(BigQueryImpl.java:1124)
    ......
    2022-10-12 15:17:31 [44msource[0m > 	... 53 more
    2022-10-12 15:17:33 [43mdestination[0m > 2022-10-12 15:17:33 [32mINFO[m i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
    2022-10-12 15:17:33 [43mdestination[0m > 2022-10-12 15:17:33 [32mINFO[m i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.kafka.KafkaDestination
    2022-10-12 15:17:33 [43mdestination[0m > 2022-10-12 15:17:33 [32mINFO[m i.a.i.d.k.KafkaDestination(main):86 - Completed destination: class io.airbyte.integrations.destination.kafka.KafkaDestination
    ```

    Bogdan

    10/12/2022, 3:42 PM
    Hello, I created a topic here about my Google Ads IDs that are wrong after the sync. Can somebody help me with this, please?

    sonti srihari

    10/12/2022, 4:29 PM
    When I try to sign in with Google/GitHub, I get the error below: "*An unknown error happened during sign in: auth/internal-error*"

    Jagruti Tiwari

    10/12/2022, 5:23 PM
    Hi everyone, I was rewriting the "Deploying Airbyte to Kubernetes" tutorial and was wondering if the "Using an external DB" section can be marked as completed, because I found a doc that covers connecting to an external Postgres database.

    Joe Swatosh

    10/12/2022, 6:45 PM
    We’ve written a patch to address a date-formatting issue with the NetSuite source connector when attempting incremental updates. It seems we aren’t the only ones to have run into the problem. It appears that the input format of the date has changed from m/d/Y to Y-m-d; this assumption is based on the error message, as the last time I checked, the documentation hadn’t caught up. Review on the PR has stalled because of a failing acceptance test. However, I cannot get the acceptance tests passing for this connector even on the master branch - I’m getting 5 failures. I’m guessing that I’m missing some sort of precondition, but I have no idea what it might be. These are the failures:
    ```
    ==================== short test summary info ====================
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:TestDiscovery:test_defined_refs_exist_in_schema[inputs0] - AssertionError: Found unresolved $refs values for selected streams: ({'customrecord_advpromo_discount': ['/services/rest/record/v1/metadata-catalog/inventoryitem', '/services/rest/record/v1/metadata-catalog/assem...
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:TestBasicRead:test_read[inputs0] - docker.errors.ContainerError: Command 'read --config /data/tap_config.json --catalog /data/catalog.json' in image '<Image: 'airbyte/source-netsuite:dev'>' returned non-zero exit status 1: {"type": "TRACE", "trace": {"type": "ERROR", "emit...
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py:TestFullRefresh:test_sequential_reads[inputs0] - docker.errors.ContainerError: Command 'read --config /data/tap_config.json --catalog /data/catalog.json' in image '<Image: 'airbyte/source-netsuite:dev'>' returned non-zero exit status 1: {"type": "TRACE", "trace": {...
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:TestIncremental:test_two_sequential_reads[inputs0] - AssertionError: Should produce at least one record
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:TestIncremental:test_read_sequential_slices[inputs0] - AssertionError: Should produce at least one record
    ```
    Can anyone suggest how I might get these tests working so that I can look at the real failure on the branch? Thanks!
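    As a side note on the format change described above, a tiny illustration of the conversion in plain Python (hypothetical value, not the connector's actual code):
    ```
    from datetime import datetime

    # The connector sent m/d/Y; NetSuite now appears to expect Y-m-d
    old_style = "10/12/2022"  # m/d/Y (hypothetical value)
    print(datetime.strptime(old_style, "%m/%d/%Y").strftime("%Y-%m-%d"))
    # -> 2022-10-12
    ```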

    laila ribke

    10/12/2022, 7:26 PM
    Hi, I have an Airbyte connection to a Snowplow DB in Postgres. I've set up the connection from Google Ads and it works perfectly. BUT I get two schemas: __airbyte__snowplow and snowplow. In the first one I get stg tables, and in the second one I have raw tables, SCD tables and the normalized ones. Which tables should I use and which ones can I delete? It is getting really messy.

    Samantha Duggan

    10/12/2022, 8:10 PM
    Hello! I've been evaluating Airbyte's open-source offering as an alternative to Fivetran for my organization. We're really excited about Airbyte and there's a lot we love about it, but we have two blocking features that I'd love to get your take on. 1. SSO login support, specifically via Okta. I see that it's on the roadmap for Airbyte Cloud - do you have any information on the timeline of this effort? Are there plans to release this for the open-source version as well? 2. What are the plans for user management within Airbyte? The open-source version seems to support only a single user. Are there plans to implement multiple users and the ability to set read/write permissions? Really appreciate any information you can provide here! We do have Airbyte set up behind our VPN, but our infosec team requires stricter access controls than that can provide. We're talking through alternative paths such as using Octavia exclusively instead of the UI, but I think user management in the UI will be the key to us making the swap.

    Nazih Kalo

    10/12/2022, 10:19 PM
    Hi 👋 Noob question: there's a fork (https://github.com/Brigad/airbyte/tree/shrodingers/destination-databricks-normalizable) that has an updated Databricks connector with a new incremental-dedupe sync mode that I'd like to use. I cloned the fork locally and tried `docker compose up`, but I'm not getting the new options from that repo 🤔 Any help/documentation on how to docker-compose/build a fork locally?
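    A hedged sketch of the usual workflow (based on Airbyte's connector-development docs of that era; the Gradle task name may differ by version): `docker compose up` runs the published images, so the fork's connector image has to be built locally and the connector definition pointed at the resulting `dev` tag.
    ```
    # Inside the cloned fork; builds airbyte/destination-databricks:dev locally
    ./gradlew :airbyte-integrations:connectors:destination-databricks:airbyteDocker
    docker compose up -d
    # Then point the Databricks destination definition at the "dev" image tag
    # (Settings > Destinations in the UI).
    ```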