# ask-community-for-troubleshooting

    Etienne GIROT

    01/18/2023, 10:33 AM
    Hello everyone! I upgraded Airbyte yesterday and most of my migrations stopped working because of the new check on nullable values in the cursor: • This release • This PR
    I'm migrating data from a VIEW in Postgres, and when building views there is apparently no way to tell PG to consider a column NOT NULL. So when Airbyte performs the query (here is the code):
    SELECT (EXISTS (SELECT FROM information_schema.columns WHERE table_schema = 'public' AND table_name = 'payment_metrics' AND is_nullable = 'YES' AND column_name = 'trace_start')) AND (EXISTS (SELECT from public."payment_metrics" where "trace_start" IS NULL LIMIT 1)) AS nullValue
    (where payment_metrics is my view and trace_start is the column used as a cursor), it takes forever... I have no way of:
    • forcing PG to consider trace_start as is_nullable = 'NO'
    • creating an index so that the SELECT ... IS NULL LIMIT 1 is fast
      ◦ although none of the values are null...
    Is there a way that I can tell Airbyte to skip that check?
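    A possible workaround sketch (not an Airbyte option, and it assumes the view reads from a single base table you are allowed to index): a partial index over only the NULL rows of the cursor column keeps the column nullable but turns the EXISTS ... IS NULL LIMIT 1 probe into a lookup over an (ideally empty) index. The base-table name and connection string below are hypothetical.
    # Sketch: index only the NULL rows of the cursor column on the view's base
    # table so Airbyte's nullability probe returns quickly. Table, column and
    # DSN are placeholders, not taken from the thread.
    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=me host=localhost")
    conn.autocommit = True  # CREATE INDEX CONCURRENTLY cannot run inside a transaction
    with conn.cursor() as cur:
        cur.execute("""
            CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_payment_metrics_trace_start_null
            ON public.payment_metrics_base (trace_start)
            WHERE trace_start IS NULL
        """)
    conn.close()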

    toumask

    01/18/2023, 11:05 AM
    Hello everyone, we extract 12 tables in Full refresh | Overwrite mode from an MSSQL DB and load them into a PostgreSQL DB. It takes on average 3 hours to do so. Are those speeds normal for Airbyte? How can one make it faster? Maybe a wrong configuration? We self-host Airbyte using the docker compose file from the Airbyte repo, and it's currently running version 0.40.18.

    Akilesh V

    01/18/2023, 11:20 AM
    Hi all, in our hubspot connection we have enabled the deals stream with the incremental | deduped history sync mode, but deals_properties contains duplicate rows. Is there any solution to get rid of the duplicates without modifying the DBT script or changing the sync mode?

    Miguel Ángel Torres Font - Valencia C.F.

    01/18/2023, 11:25 AM
    Hi all! We are trying to build a small visual dashboard to visualise the various connections we have in Airbyte with data on the latest run. My problem is that I've been investigating the API and I can't find the data I'm looking for. Ideally I would like to get, for each connection, the completion status and the launch time. Do you know if anything exists that fits this?
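    For reference, a rough sketch of pulling that from the self-hosted Config API (endpoint paths and response field names are from memory and may differ by version; the host, workspace id and auth are placeholders):
    # Sketch: list connections, then fetch each connection's most recent sync
    # job to get its status and launch time. Host and workspace id are placeholders.
    import requests

    BASE = "http://localhost:8000/api/v1"
    WORKSPACE_ID = "<workspace-id>"

    conns = requests.post(f"{BASE}/connections/list",
                          json={"workspaceId": WORKSPACE_ID}).json()
    for c in conns.get("connections", []):
        jobs = requests.post(f"{BASE}/jobs/list",
                             json={"configTypes": ["sync"],
                                   "configId": c["connectionId"]}).json()
        latest = (jobs.get("jobs") or [{}])[0].get("job", {})
        # "status" and "createdAt" (epoch seconds) should cover completion state and launch time
        print(c.get("name"), latest.get("status"), latest.get("createdAt"))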

    ihsan islam

    01/18/2023, 11:29 AM
    Can someone please help me figure out this error for the ZOHO connector? The refresh token is correct:
    2023-01-18 11:25:04 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/da20fe77-8a29-4276-9668-4d7e68c3140e/0/logs.log
    2023-01-18 11:25:04 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.28
    2023-01-18 11:25:04 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
    2023-01-18 11:25:04 INFO i.a.c.i.LineGobbler(voidCall):114 - 
    2023-01-18 11:25:04 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
    2023-01-18 11:25:04 INFO i.a.c.i.LineGobbler(voidCall):114 - 
    2023-01-18 11:25:04 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-zoho-crm:0.1.0 exists...
    2023-01-18 11:25:04 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-zoho-crm:0.1.0 was found locally.
    2023-01-18 11:25:04 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = source-zoho-crm-check-da20fe77-8a29-4276-9668-4d7e68c3140e-0-jhenx with resources io.airbyte.config.ResourceRequirements@6740bbac[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
    2023-01-18 11:25:04 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/da20fe77-8a29-4276-9668-4d7e68c3140e/0 --log-driver none --name source-zoho-crm-check-da20fe77-8a29-4276-9668-4d7e68c3140e-0-jhenx --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-zoho-crm:0.1.0 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=0.40.28 -e WORKER_JOB_ID=da20fe77-8a29-4276-9668-4d7e68c3140e airbyte/source-zoho-crm:0.1.0 check --config source_config.json
    2023-01-18 11:25:04 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):100 - Reading messages from protocol version 0.2.0
    2023-01-18 11:25:07 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):116 - Check failed
    2023-01-18 11:25:07 INFO i.a.w.g.DefaultCheckConnectionWorker(run):110 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@6a6fef64[status=failed,message=Exception("Error while refreshing access token: 'access_token'")]
    2023-01-18 11:25:07 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    2023-01-18 11:25:07 INFO i.a.c.i.LineGobbler(voidCall):114 - 
    2023-01-18 11:25:07 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    2023-01-18 11:25:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
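    One way to rule the token itself out is to call Zoho's OAuth endpoint directly and check that an access_token comes back. The URL below is the standard Zoho accounts token endpoint (the accounts domain varies by data centre, e.g. accounts.zoho.eu); credentials are placeholders.
    # Sketch: manually exchange the Zoho refresh token for an access token.
    import requests

    resp = requests.post(
        "https://accounts.zoho.com/oauth/v2/token",
        params={
            "grant_type": "refresh_token",
            "refresh_token": "<refresh-token>",
            "client_id": "<client-id>",
            "client_secret": "<client-secret>",
        },
    )
    print(resp.status_code, resp.json())
    # If the JSON has no "access_token" key (e.g. an "error" field instead), the
    # connector's "Error while refreshing access token: 'access_token'" is expected.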

    Grember Yohan

    01/18/2023, 11:45 AM
    Hello Airbyte community 👋 We tried upgrading Airbyte from 0.40.27 to 0.40.28, and since then, we can't access sync logs from the UI 😞 We can see the sync history properly but when we try to click on one specific sync to see the logs, it triggers the infamous 'Oops! Something went wrong' page, and the following error appears in the console:
    react-dom.production.min.js:216 Error: Internal Server Error: Cannot invoke "io.airbyte.config.storage.CloudStorageConfigs.getType()" because the return value of "io.airbyte.config.helpers.LogConfigs.getStorageConfigs()" is null
        at apiOverride.ts:107:9
        at f (regeneratorRuntime.js:86:17)
        at Generator._invoke (regeneratorRuntime.js:66:24)
        at Generator.next (regeneratorRuntime.js:117:21)
        at r (asyncToGenerator.js:3:20)
        at u (asyncToGenerator.js:25:9)
    Downgrading from 0.40.28 to 0.40.27 fixes the issue. Should I share this somewhere specific to document this regression and improve its chances of being fixed?
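    As a stopgap, job status and attempt details can still be pulled straight from the Config API while the UI page errors out; a minimal sketch (endpoint path and payload are from memory and may differ by version; host and job id are placeholders). The regression itself is normally best reported as a GitHub issue on airbytehq/airbyte.
    # Sketch: fetch a job's details from the Config API instead of the UI.
    import requests

    resp = requests.post("http://localhost:8000/api/v1/jobs/get",
                         json={"id": 123})  # numeric job id from the sync history
    print(resp.status_code)
    print(resp.text)  # raw JSON with job and attempt information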

    wolfgan dand

    01/18/2023, 12:27 PM
    Can someone please tell me if it is possible to get a connection from Airbyte in Airflow, in order to connect to the database through that connection in a DAG? (I can't find it in the API documentation.)
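    There is no dedicated "hand me the database credentials" endpoint as far as I know, but a source's saved configuration can be read back from the Config API, keeping in mind that secret fields are normally returned masked. A sketch (endpoint path and field names assumed; host and source id are placeholders):
    # Sketch: read a source's saved configuration from the Airbyte Config API,
    # e.g. inside an Airflow PythonOperator task. In a DAG the host would be the
    # Airbyte server's address; secrets usually come back masked.
    import requests

    resp = requests.post("http://localhost:8000/api/v1/sources/get",
                         json={"sourceId": "<source-id>"})
    source = resp.json()
    print(source.get("connectionConfiguration"))  # host, port, database, username, ...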

    McKenna West

    01/18/2023, 3:50 PM
    Hi everyone! We're trying to get a Databricks destination set up but keep running into an error. When testing the connection during setup, it runs fine, but then when we try to sync we get the following error. Any ideas?
    line 132: Failed to finalize copy to temp table due to: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: java.nio.file.AccessDeniedException: <s3path>: getFileStatus on <s3path>: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; request: HEAD <s3path> {} Hadoop 3.3.4, aws-sdk-java/1.12.189 Linux/4.15.0-2081-aws-fips OpenJDK_64-Bit_Server_VM/25.345-b01 java/1.8.0_345 scala/2.12.14 vendor/Azul_Systems,Inc. cfg/retry-mode/legacy com.amazonaws.services.s3.model.GetObjectMetadataRequest; Request ID: 2H0AFD7D7CQW5RX0, Extended Request ID: lN/ALks7QIeuNNlDNtlrCyK15z8eHqSaRQDqYpDhdHjiXLf69cME9aPsUPegnOqKANGFC15VDOk=, Cloud Provider: AWS, Instance ID: <instanceID> (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 2H0AFD7D7CQW5RX0; S3 Extended Request ID: lN/ALks7QIeuNNlDNtlrCyK15z8eHqSaRQDqYpDhdHjiXLf69cME9aPsUPegnOqKANGFC15VDOk=; Proxy: null), S3 Extended Request ID: lN/ALks7QIeuNNlDNtlrCyK15z8eHqSaRQDqYpDhdHjiXLf69cME9aPsUPegnOqKANGFC15VDOk=:403 Forbidden

    Danish Raza

    01/18/2023, 4:21 PM
    Hi, is there any way to see all API endpoints? Like a Swagger page?

    Francesco F

    01/18/2023, 5:01 PM
    Hi, I am new to the community and trying to set up Airbyte locally on Windows 11 following this page: https://docs.airbyte.com/deploying-airbyte/local-deployment/#setup--launch-airbyte I'm currently stuck on the docker compose up command, as my VSCode seems to keep loading in a loop. Any suggestions? I have installed Docker and it seems to be up and running correctly.

    god830

    01/18/2023, 6:56 PM
    I'm using the File source now and trying to read a gzip file. Are the Reader Options correct?
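    For the CSV format, the File source's reader options are (as far as I recall) passed through to pandas, so they can be sanity-checked locally before pasting them into the connector; a sketch with a placeholder URL:
    # Sketch: try the same reader options locally with pandas. "compression" and
    # "sep" are standard pandas read_csv arguments; the URL is a placeholder.
    import pandas as pd

    reader_options = {"compression": "gzip", "sep": ","}
    df = pd.read_csv("https://example.com/data.csv.gz", **reader_options)
    print(df.head())
    If the remote file name ends in .gz, "compression": "infer" usually works as well, since pandas infers gzip from the extension.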

    god830

    01/02/2023, 5:18 PM
    Hi, I'm getting this error when connecting to NetSuite.
    HTTPError('400 Client Error: Bad Request for url: https://7074563.suitetalk.api.netsuite.com/services/rest/record/v1/contact?limit=1')
    The only post I can find related to this is unsolved: https://discuss.airbyte.io/t/airbyte-netsuite-connector-400-client-error/2948

    ihsan islam

    01/18/2023, 10:16 PM
    Is there a way to receive a webhook on Airbyte from a source and populate the data into a destination DB? I see https://www.rudderstack.com/ has this feature, and maybe Airbyte has something similar? That could save us from developing a connector for every third-party app we work with.

    Walker Philips

    01/18/2023, 10:45 PM
    Does Airbyte handle duplicate records behind the scenes for incremental loads? For example, if it pulled the record { id: 1, val: "my val", updated_at: 2020-01-01 } TWICE, will it automatically ignore it the second time? It seems incremental | deduped history handles the scenario where updated_at might change and essentially makes a temporal table... that is not what I am asking. Quoting one of the guides: "You should see that only the record from the last date is being synced! This is acceptable behavior, since Airbyte requires at-least-once delivery of records, so repeating the last record twice is OK."
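    To make the at-least-once point concrete: an exact re-delivery of the same record is harmless as long as whatever reads the data keeps one row per primary key. A small illustration (concept only, not Airbyte's internal code):
    # Illustration: identical duplicates collapse to a single row when you keep
    # the latest record per primary key.
    records = [
        {"id": 1, "val": "my val", "updated_at": "2020-01-01"},
        {"id": 1, "val": "my val", "updated_at": "2020-01-01"},  # delivered twice
    ]

    latest = {}
    for rec in records:  # last write wins per id
        current = latest.get(rec["id"])
        if current is None or rec["updated_at"] >= current["updated_at"]:
            latest[rec["id"]] = rec

    print(list(latest.values()))  # one row for id 1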

    Sean Zicari

    01/19/2023, 12:10 AM
    It appears the most frequent sync schedule one can set in Airbyte is once a minute. Is it possible to synchronize every 3 seconds without writing my own logic to call the Airbyte API myself?
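    As noted, the built-in scheduler does not go below a minute, so sub-minute syncs generally mean triggering the API in your own loop; a minimal sketch (endpoint path assumed; host and connection id are placeholders; each trigger still starts a full sync job, so 3-second intervals are rarely practical):
    # Sketch: trigger a sync roughly every 3 seconds via the Config API.
    import time
    import requests

    CONNECTION_ID = "<connection-id>"

    while True:
        resp = requests.post("http://localhost:8000/api/v1/connections/sync",
                             json={"connectionId": CONNECTION_ID})
        print(resp.status_code)
        time.sleep(3)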

    Vincent Koc

    01/19/2023, 12:44 AM
    I know role-based access is not supported, but how is the demo.airbyte.io environment able to make the platform read-only? Is there a variable in the environment or charts we can set to make an instance read-only?

    Lukas Holdorf

    01/04/2023, 2:35 PM
    Hey everyone. I'm running into some issues when trying to sync data from GA 4 with a custom report. I followed the documentation https://docs.airbyte.com/integrations/sources/google-analytics-v4/#custom-reports for the syntax and just tried to start with a simple report such as [{"name": "pages_test", "dimensions": ["pagePath"], "metrics": ["screenPageViews"]}], as you can see in the screenshot. For some reason I get the following error when running the sync: "Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details." When checking the logs I don't really know what to do with the error:
    2023-01-04 14:19:49 INFO i.a.p.j.e.LoggingJobErrorReportingClient(reportJobFailureReason):23 - Report Job Error -> workspaceId: 10236219-10c4-4719-9d79-b21e8e34a97d, dockerImage: airbyte/source-google-analytics-data-api:0.0.3, failureReason: io.airbyte.config.FailureReason@498c0a6d[failureOrigin=source,failureType=system_error,internalMessage='date',externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@a20e0ff[additionalProperties={attemptNumber=2, jobId=17, connector_command=read, from_trace_message=true}],stacktrace=Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 13, in <module>
        launch(source, sys.argv[1:])
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 123, in launch
        for message in source_entrypoint.run(parsed_args):
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 114, in run
        for message in generator:
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 128, in read
        raise e
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
        yield from self._read_stream(
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 179, in _read_stream
        for record in record_iterator:
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 277, in _read_full_refresh
        for record in records:
      File "/airbyte/integration_code/source_google_analytics_data_api/source.py", line 288, in read_records
        next_cursor_value = utils.string_to_date(row[self.cursor_field], self._record_date_format)
      File "/usr/local/lib/python3.9/collections/__init__.py", line 941, in __getitem__
        return self.__missing__(key)  # support subclasses that define __missing__
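    Reading the stack trace, read_records fails on row[self.cursor_field] with internalMessage='date', which suggests the connector expects a "date" dimension in each row to use as its cursor. A guess at a report definition that would satisfy that (not verified against this connector version):
    # Guess: add the "date" dimension the KeyError points at, so every row has a
    # cursor value. The report name and other fields are from the original message.
    import json

    custom_reports = [
        {
            "name": "pages_test",
            "dimensions": ["date", "pagePath"],
            "metrics": ["screenPageViews"],
        }
    ]
    print(json.dumps(custom_reports))  # paste this JSON into the custom reports field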

    Tmac Han

    01/19/2023, 3:07 AM
    Hi team, I have written a new source connector in this PR: https://github.com/airbytehq/airbyte/pull/21302. Could you help me review it? Thank you very much!

    김건희

    01/19/2023, 3:27 AM
    Hi everyone! I tried to use Airbyte in my local environment, but I don't know how to change the table name at the destination. Can I do that? Or change the source table name? Thanks.

    Lihan Li

    01/19/2023, 5:33 AM
    Hi team, do we have any timeline for the reverse ETL feature? Related to https://airbyte.com/blog/airbyte-acquires-grouparoo-to-accelerate-data-movement

    Akilesh V

    01/19/2023, 6:40 AM
    Hi all, I am unable to understand the error message below. Can someone please help with this? Thanks. I don't think the cluster has an internet access issue.
    2023-01-18 06:42:02 WARN i.a.s.s.AirbyteGithubStore(getLatestSources):69 - Unable to retrieve latest Source list from Github. Using the list bundled with Airbyte. This warning is expected if this Airbyte cluster does not have internet access.
    java.io.IOException: Connection reset
    	at jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:857) ~[java.net.http:?]
    	at jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:123) ~[java.net.http:?]
    	at io.airbyte.server.services.AirbyteGithubStore.getFile(AirbyteGithubStore.java:83) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at io.airbyte.server.services.AirbyteGithubStore.getLatestSources(AirbyteGithubStore.java:67) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at io.airbyte.server.handlers.SourceDefinitionsHandler.getLatestSources(SourceDefinitionsHandler.java:136) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at io.airbyte.server.handlers.SourceDefinitionsHandler.listLatestSourceDefinitions(SourceDefinitionsHandler.java:131) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at io.airbyte.server.apis.ConfigurationApi.execute(ConfigurationApi.java:870) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at io.airbyte.server.apis.ConfigurationApi.listLatestSourceDefinitions(ConfigurationApi.java:326) ~[io.airbyte-airbyte-server-0.40.3.jar:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
    	at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:469) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:391) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:80) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:253) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:569) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1377) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:507) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1292) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.Server.handle(Server.java:501) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:556) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at java.lang.Thread.run(Thread.java:1589) [?:?]
    Caused by: java.net.SocketException: Connection reset
    	at sun.nio.ch.SocketChannelImpl.throwConnectionReset(SocketChannelImpl.java:401) ~[?:?]
    	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:434) ~[?:?]
    	at jdk.internal.net.http.SocketTube.readAvailable(SocketTube.java:1178) ~[java.net.http:?]
    	at jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.read(SocketTube.java:841) ~[java.net.http:?]
    	at jdk.internal.net.http.SocketTube$SocketFlowTask.run(SocketTube.java:181) ~[java.net.http:?]
    	at jdk.internal.net.http.common.SequentialScheduler$SchedulableTask.run(SequentialScheduler.java:230) ~[java.net.http:?]
    	at jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(SequentialScheduler.java:303) ~[java.net.http:?]
    	at jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(SequentialScheduler.java:256) ~[java.net.http:?]
    	at jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.signalReadable(SocketTube.java:782) ~[java.net.http:?]
    	at jdk.internal.net.http.SocketTube$InternalReadPublisher$ReadEvent.signalEvent(SocketTube.java:965) ~[java.net.http:?]
    	at jdk.internal.net.http.SocketTube$SocketFlowEvent.handle(SocketTube.java:253) ~[java.net.http:?]
    	at jdk.internal.net.http.HttpClientImpl$SelectorManager.handleEvent(HttpClientImpl.java:1337) ~[java.net.http:?]
    	at jdk.internal.net.http.HttpClientImpl$SelectorManager.lambda$run$3(HttpClientImpl.java:1282) ~[java.net.http:?]
    	at java.util.ArrayList.forEach(ArrayList.java:1511) ~[?:?]
    	at jdk.internal.net.http.HttpClientImpl$SelectorManager.run(HttpClientImpl.java:1282) ~[java.net.http:?]

    Ben

    01/19/2023, 8:00 AM
    Hi folks 👋🏼 We're currently ingesting our product data from a single Postgres DB to BQ. We're planning to split this single DB into multiple regional DBs soon for performance reasons. At the moment we're aiming to have 10 regional DBs, but that number will surely increase as the product grows. The regional DBs have the same schema and we would like to ingest them into a single schema in BQ. Does Airbyte support this? Any recommendations are appreciated. Thanks in advance! 🙂

    José Lúcio Zancan Júnior

    01/19/2023, 8:10 AM
    I'm having a persistent error in the normalization step of a Facebook Marketing to BQ connection. I've already tried recreating the source, the destination, the connection, and the BigQuery dataset, resetting the data, and so on. How can I get more information on what's going on? The connection (Incremental | Deduped) ran just fine on the first day, but it started to fail from the next day's sync onward. Denormalized and Raw JSON connections with this same source are completely fine, but I need the normalized one because of the deduplication feature. Airbyte (0.40.28) on Helm, Facebook Marketing (0.2.83), BigQuery (1.2.11).
    Unhandled error while executing model.airbyte_utils.norm_ads_insights_stg
    Pickling client objects is explicitly not supported.
    Clients have non-trivial state that is local and unpickleable.
    1 of 50 ERROR creating view model _airbyte_medialake_fb.norm_ads_insights_stg........................................... [ERROR in 1.00s]
    Pickling client objects is explicitly not supported.
    Clients have non-trivial state that is local and unpickleable.
    Thanks.

    Vincent Koc

    01/19/2023, 8:11 AM
    CI/CD is broken: https://github.com/airbytehq/airbyte/actions/runs/3956439457/jobs/6775686449 https://github.com/airbytehq/airbyte/actions/runs/3954081396/jobs/6771064453 https://github.com/airbytehq/airbyte/actions/runs/3954176473/jobs/6771266411

    Vincent Koc

    01/19/2023, 8:12 AM
    Possible breaking change from https://github.com/airbytehq/airbyte/commit/b9f9e0722f2e0c124d18590c7a0e351531959b4e

    Giorgos Tzanakis

    01/19/2023, 8:18 AM
    Hi all. I am a bit confused about the fact that some connectors exist in the docs but do not seem to be available in Airbyte cloud. For example DynamoDB is documented as a source however I do not see it as an option in my cloud workspace. I also see that the corresponding code in github has been merged. So, can somebody please explain when connectors in the docs should be expected to be available in the cloud? Thanks in advance!

    Umar Hussain

    01/19/2023, 10:59 AM
    Hello all, quick question: when storing secrets via the UI, I can see they are saved in the associated DB (for me it's Postgres) as plain text. Is this intentional, and if so, is there a more secure method of managing secrets with AWS Secrets Manager / Azure Key Vault or others?

    Srinidhi krishnamurthy

    01/19/2023, 11:53 AM
    Hi team, I was referring to the doc https://docs.airbyte.com/integrations/custom-connectors/#adding-your-connectors-in-the-ui to add our custom source connector image, but I didn't find the admin section in the Airbyte console. Can someone point me to the right place in the Airbyte UI, please? We have built the image and pushed it to ECR.

    Justen Walker

    01/19/2023, 12:24 PM
    Are there any upgrade instructions for deploying Airbyte to Kubernetes via Helm charts? The K8s instructions allude to deleting all the deployments; do you have to do something similar with Helm (helm uninstall + helm install), or will helm upgrade work as intended?

    Naren Kadiri

    01/19/2023, 1:04 PM
    🔴 Hello everyone. I created a connection to extract data from SQL Server to Snowflake. Whenever I make changes to the connection (like adding or removing tables for data ingestion), the entire connection gets reset and pulls all the data again (cleaning out all the previously extracted data). Is there some mechanism to avoid this? Any help is appreciated.