# ask-community-for-troubleshooting
  • l

    laila ribke

    11/08/2022, 3:32 PM
    Hi all, again 🤣🤦‍♀️. Which incremental refresh mode takes less time? Another issue: we are loading the data with batch loading.
  • l

    laila ribke

    11/08/2022, 3:37 PM
    Hi again... I'm trying to figure out the table sizes in the destination. In my Postgres source, I have a table of about 80K. I'm using the Incremental | Deduped + history sync, and in my Redshift destination the table size is 4 GB. I created a new table in the destination, just without the Airbyte columns, and the size dropped to 10 MB. Should the size in the destination be SO high? Is there a way to configure it differently?
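    A note for anyone comparing these numbers: Redshift allocates storage in 1 MB blocks per column per slice, so even tiny tables have a sizeable minimum footprint, and with the Incremental | Deduped + history mode the destination also keeps the raw _airbyte_raw_* JSON table plus the history (SCD) table alongside the final one. A quick way to compare row counts and on-disk size is the svv_table_info system view; this is only a sketch with placeholder connection details and table names:

    psql -h <redshift-endpoint> -p 5439 -U <user> -d <database> -c \
      "SELECT \"table\", size AS size_mb, tbl_rows FROM svv_table_info WHERE \"table\" LIKE '%<your_table>%';"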
  • a

    Austin Poulton

    11/08/2022, 4:08 PM
    Hi all, I am curious about Airbyte connectors for mainframe systems. A client of ours is considering Informatica integration because it supports mainframe data extraction. Any information would be super helpful.
  • d

    Daniel Vengoechea

    11/08/2022, 4:28 PM
    1. The Pipedrive source is missing a lot of streams needed to make it useful. The missing streams exist in tap-pipedrive. Your website says Airbyte is compatible with Singer taps; is there an easy way to add a tap as a source from GitHub?
  • d

    Daniel Vengoechea

    11/08/2022, 4:29 PM
    2. How can we extend a source? For example, I want to add a new stream to your Pipedrive source. I know that I can extend the catalog JSON schema through the API, but how can I add a new stream?
  • p

    Pavan

    11/08/2022, 5:33 PM
    Geeks, we are new to Airbyte and doing a POC. We wanted to know how and where we can apply the fetch size parameter. I am using Postgres as the source connector (source-postgres:0.4.28) and want to load millions of records into Snowflake as the destination (destination-snowflake:0.4.38). Currently it is taking 3:20 hrs to load approx. 6 GB of data with 12M records.
  • a

    Alexandre Voyer

    11/08/2022, 7:06 PM
    Hi there, just a heads up that I created a topic ( https://discuss.airbyte.io/t/first-sync-successful-next-syncs-0-bytes-no-records/3135 ) with a screenshot + logs attached about the issue I'm having with PostgreSQL -> Snowflake: the first sync works, but subsequent syncs complete without any data. Any help is appreciated 🙂
  • c

    Coleman Kelleghan

    11/08/2022, 8:24 PM
    Hi Airbyte, we are seeing failed syncs for a large Snowflake source; I am attaching the full log file. Is there anything in these logs that indicates the cause? It looks like this may be the relevant error:
    Copy code
    2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):381 - Total records read: 16928693 (31 GB)
    2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(cancel):501 - Cancelling destination...
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):121 - Attempting to cancel destination process...
    2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(run):190 - One of source or destination thread complete. Waiting on the other.
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):126 - Destination process exists, cancelling...
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):664 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Destroying Kube process.
    2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(run):192 - Source and destination threads complete.
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(close):737 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Closed all resources for pod
    2022-11-08 19:42:36 WARN i.a.c.i.LineGobbler(voidCall):119 - airbyte-destination gobbler IOException: Socket closed. Typically happens when cancelling a job.
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):670 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Destroyed Kube process.
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):128 - Cancelled destination process!
    2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(cancel):508 - Cancelling source...
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):141 - Attempting to cancel source process...
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):146 - Source process exists, cancelling...
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):664 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Destroying Kube process.
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(close):737 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Closed all resources for pod
    2022-11-08 19:42:36 WARN i.a.c.i.LineGobbler(voidCall):119 - airbyte-source gobbler IOException: Socket closed. Typically happens when cancelling a job.
    2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):670 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Destroyed Kube process.
    2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):148 - Cancelled source process!
    2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$5):230 - Interrupting worker thread...
    2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$5):233 - Cancelling completable future...
    2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
    2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
    2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
    2022-11-08 19:42:36 ERROR i.a.w.WorkerUtils(gentleClose):53 - Exception while while waiting for process to finish
    2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
    2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
    2022-11-08 19:42:36 ERROR i.a.w.WorkerUtils(gentleClose):53 - Exception while while waiting for process to finish
    java.lang.InterruptedException: sleep interrupted
    	at java.lang.Thread.sleep0(Native Method) ~[?:?]
    	at java.lang.Thread.sleep(Thread.java:465) ~[?:?]
    	at java.lang.Process.waitFor(Process.java:468) ~[?:?]
    	at io.airbyte.workers.process.KubePodProcess.waitFor(KubePodProcess.java:653) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.WorkerUtils.gentleClose(WorkerUtils.java:50) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:128) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:194) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2022-11-08 19:42:36 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating...
    2022-11-08 19:42:36 ERROR i.a.w.g.DefaultReplicationWorker(run):196 - Sync worker failed.
    io.airbyte.workers.exception.WorkerException: Source process exit with code 143. This warning is normal if the job was cancelled.
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:135) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:194) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    	Suppressed: io.airbyte.workers.exception.WorkerException: Destination process exit with code 143. This warning is normal if the job was cancelled.
    		at io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:115) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:151) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
    		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
    		at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    bdac0505_372c_4b7c_8e74_8717b192c9ff_logs_30_txt.txt
  • l

    Leo G

    11/08/2022, 10:00 PM
    Does the Oracle source connector support an optional schema instead of the user's schema? I was able to set up an Oracle source, but it does not retain the schema name I put in.
  • d

    Duncan Reyneke

    11/08/2022, 10:25 PM
    Hi, I was directed here by the support team. I have a project that requires a lot of help, and I'm at a level of inexperience that will probably elicit some laughter. Basically I need to set up open source Airbyte on a Node.js server and have gotten as far as SSH'ing into the server lol. Any help would be appreciated, even if you could point me in the direction of a good tutorial to get me started - this is sort of a "throw him in the deep end" type thing, and I'm trying to do it the right way by learning as much as I can while asking for help.
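    For context: Airbyte itself runs as a set of Docker containers rather than as a Node.js application, so the usual starting point on any Linux server is Docker plus Docker Compose. A minimal sketch of the standard quickstart from the docs (assuming Docker and docker-compose are already installed; paths and versions may differ on your server):

    git clone https://github.com/airbytehq/airbyte.git
    cd airbyte
    docker-compose up -d
    # the UI should then be reachable on http://localhost:8000 (tunnel the port if the server is remote)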
  • b

    Brad Nemetski

    11/09/2022, 12:25 AM
    I'm currently working on a POC of Airbyte, and one of the more important features we'd like to vet is using an API to create a new connection from scratch. For the purpose of the POC, I'd like to connect to an S3 bucket, read a CSV, and load it into a Snowflake instance. I've already done this manually to make sure that it works. I'm having some trouble figuring out what sort of API calls I need and what I need to send - for example, I've looked at /v1/source_definitions/create but I don't see how to put in things like the S3 bucket or the credentials. Is there any more specific documentation that has concrete examples? Alternatively, is there a way I can pull the definitions I already made into JSON that can be fed back into the API? That would make the modifications fairly straightforward. Thanks! I'm running a local Docker instance, v0.40.18.
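    For reference, the connector-specific settings (bucket, credentials, ...) go into the connectionConfiguration field of /api/v1/sources/create rather than /v1/source_definitions/create, and /api/v1/sources/get returns the configuration of a source that was already created in the UI (with secrets masked), which is handy for reusing it. A rough sketch against a local deployment - the host/port, the basic-auth defaults, and the exact S3 configuration keys are assumptions, so check the connector's spec via /api/v1/source_definition_specifications/get:

    # find workspace and source-definition IDs
    curl -s -u airbyte:password -X POST localhost:8000/api/v1/workspaces/list \
      -H "Content-Type: application/json" -d '{}'

    # create a source; connectionConfiguration must follow the connector's spec
    curl -s -u airbyte:password -X POST localhost:8000/api/v1/sources/create \
      -H "Content-Type: application/json" -d '{
        "workspaceId": "<workspace-id>",
        "sourceDefinitionId": "<s3-source-definition-id>",
        "name": "s3-csv-poc",
        "connectionConfiguration": {
          "dataset": "my_csv",
          "path_pattern": "**",
          "format": { "filetype": "csv" },
          "provider": { "bucket": "<bucket-name>" }
        }
      }'

    # dump an existing source's configuration so it can be fed back into the API
    curl -s -u airbyte:password -X POST localhost:8000/api/v1/sources/get \
      -H "Content-Type: application/json" -d '{"sourceId": "<source-id>"}'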
  • z

    Zack Peacock

    11/09/2022, 2:13 AM
    Hi all, when following the usual instructions on this page to relaunch the Airbyte instance from the terminal: https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine/ Step 1: PROJECT_ID=[xyz] Step 2: INSTANCE_NAME=[abc] Step 3:
    Copy code
    gcloud --project=$PROJECT_ID beta compute SSH $INSTANCE_NAME -- -L 8000:localhost:8000 -N -f
    I’m getting the following error in terminal as of this week:
    Copy code
    ERROR: (gcloud.beta.compute) Invalid choice: 'SSH'.
    Maybe you meant:
      gcloud compute ssh
      gcloud compute config-ssh
      gcloud beta compute addresses update
      gcloud beta compute backend-services add-iam-policy-binding
      gcloud beta compute backend-services get-iam-policy
      gcloud beta compute backend-services remove-iam-policy-binding
      gcloud beta compute backend-services set-iam-policy
      gcloud beta compute commitments update-reservations
      gcloud beta compute instance-groups managed all-instances-config delete
      gcloud beta compute instance-groups managed all-instances-config update
    Haven't been able to figure out what I need to do, but it feels like it's got to be simple. Any help would be greatly appreciated.
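    The error message itself points at the likely fix: gcloud subcommands are case-sensitive (ssh, not SSH), and gcloud's own suggestion drops the beta track. Something along these lines should re-establish the tunnel (same flags as before, only the subcommand changes):

    gcloud --project=$PROJECT_ID compute ssh $INSTANCE_NAME -- -L 8000:localhost:8000 -N -f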
  • a

    Alex Sher

    11/09/2022, 5:00 AM
    Hey team, could you point me to how to set up production-grade Airbyte on an EC2 instance? My main question is about data storage: is it enough to just use RDS instead of the provided Docker-based Postgres, or should I also create an external EBS volume and mount all the Airbyte Docker Compose volumes on it? My end goal is the ability to easily reboot / recreate the Airbyte EC2 instance. I checked the docs, but most info about persistent storage is about the k8s deployment. Thanks in advance!
  • a

    Andreas Nigg

    11/09/2022, 7:20 AM
    Hey folks, basic question: using Airbyte, what's actually stored in MinIO? Are there only logs? Is there some sort of cleanup I can or should do?
  • p

    Pankaj Gupta

    11/09/2022, 7:58 AM
    Hi Airbyte, I'm just in the getting-started phase. Setup done so far: 1. Set up Docker Desktop on Windows 10. 2. Used cmd (docker-compose up) to bring up the Docker containers. 3. Opened http://localhost:8000/ for the Airbyte UI and set up the source (MySQL) and destination (Postgres); both DBs are hosted on our company's UAT server. 4. Also synced (sync succeeded) the schema from the source DB for dumping data into the destination. But as far as I can see, data is not getting dumped into the destination. Apart from these settings I haven't made any additional configuration. Please help me figure out what I've been missing. I've been stuck on this for the last 3 days. Thanks in advance, team.
  • d

    Dheeraj Pranav

    11/09/2022, 8:05 AM
    Hi team, I've installed Docker and set up Airbyte with docker-compose up, and got the Airbyte UI on localhost:8000. It's running fine and I'm able to create sources, destinations, and connections. When I try to store the configurations of all the Airbyte connectors using Octavia, I'm able to download and keep track of sources and destinations. But when I try to store connection information, I'm facing an issue; it raises this when I use the command octavia import connection [connection-ID]: error: TypeError: _from_openapi_data() missing 3 required positional arguments: 'schema_change', 'notify_schema_changes', and 'non_breaking_changes_preference'. Is this a version issue or something else?
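    That TypeError usually indicates a version mismatch between octavia-cli and the Airbyte server: schema_change, notify_schema_changes, and non_breaking_changes_preference exist in the connection model of some releases but not others, so a CLI generated against a different API version can't deserialize the response. A sketch of pinning the CLI to the same version as the deployment - the docker invocation roughly mirrors what the octavia install script sets up, and the tag is only an example (use whatever version your Airbyte containers report):

    docker run -i --rm -v "$(pwd)":/home/octavia-project --network host \
      airbyte/octavia-cli:0.40.18 import connection <connection-id>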
  • c

    Chetan Dalal

    11/09/2022, 9:19 AM
    Hi team - I am using Airbyte open source. Source: Elasticsearch. Destination: SQL. However, I am getting this error while setting up the connection: "2022-11-09 09:14:20 ERROR i.a.w.i.DefaultAirbyteStreamFactory(validate):87 - Validation failed: {"error":{"root_cause":[{"type":"security_exception","reason":"no permissions for [indices:admin/get] and User [name=product, backend_roles=[naive-user], requestedTenant=null]"}],"type":"security_exception","reason":"no permissions for [indices:admin/get] and User [name=product, backend_roles=[naive-user], requestedTenant=null]"},"status":403}" Btw, I am using OpenSearch 1.3 as the source and we have already added this permission on OpenSearch. How do I proceed?
  • a

    Ashish Narang

    11/09/2022, 11:30 AM
    Hello all, I'm trying to use Airbyte incremental sync with an exchange rates API. I've followed the read-data documentation - https://docs.airbyte.com/connector-development/tutorials/cdk-tutorial-python-http/read-data Everything seems to be working except that my
    state.json
    is not getting updated. From the checkpoint log messages I can see that new data is fetched, but state.json still points to the first date which I provided in the config.json file. Sharing the code for the stream below:
    Copy code
    class ExchangeRates(HttpStream, IncrementalMixin):
        # url_base = "https://api.apilayer.com/exchangerates_data/"
        url_base = "https://api.exchangerate.host/"
    
        cursor_field = "date"
        primary_key = "date"
    
        def __init__(self, config: Mapping[str, Any], start_date: datetime, **kwargs):
            super().__init__()
            self.base = config['base']
            self.access_key = config['access_key']
            self.start_date = start_date
            self._cursor_value = None
    
        @property
        def state(self) -> Mapping[str, Any]:
            if self._cursor_value:
                return {self.cursor_field: self._cursor_value.strftime('%Y-%m-%d')}
            else:
                return {self.cursor_field: self.start_date.strftime('%Y-%m-%d')}
        
        @state.setter
        def state(self, value: Mapping[str, Any]):
           self._cursor_value = datetime.strptime(value[self.cursor_field], '%Y-%m-%d')
    
        def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
            for record in super().read_records(*args, **kwargs):
                if self._cursor_value:
                    latest_record_date = datetime.strptime(record[self.cursor_field], '%Y-%m-%d')
                    self._cursor_value = max(self._cursor_value, latest_record_date)
                yield record
    
        def _chunk_date_range(self, start_date: datetime) -> List[Mapping[str, Any]]:
            """
            Returns a list of each day between the start date and now.
            The return value is a list of dicts {'date': date_string}.
            """
            dates = []
            while start_date < datetime.now():
                dates.append({self.cursor_field: start_date.strftime('%Y-%m-%d')})
                start_date += timedelta(days=1)
            return dates
    
        def stream_slices(self, sync_mode, cursor_field: List[str] = None, stream_state: Mapping[str, Any] = None) -> Iterable[Optional[Mapping[str, Any]]]:
            start_date = datetime.strptime(stream_state[self.cursor_field], '%Y-%m-%d') if stream_state and self.cursor_field in stream_state else self.start_date
            return self._chunk_date_range(start_date)
    
        def path(self, stream_state: Mapping[str, Any] = None, stream_slice: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None) -> str:
            return stream_slice['date']
    
    
        def request_params(
                self,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> MutableMapping[str, Any]:
            # The api requires that we include access_key as a query param so we do that in this method
            return {'apikey': self.access_key}
    
        def parse_response(
                self,
                response: requests.Response,
                stream_state: Mapping[str, Any],
                stream_slice: Mapping[str, Any] = None,
                next_page_token: Mapping[str, Any] = None,
        ) -> Iterable[Mapping]:
            # The response is a simple JSON whose schema matches our stream's schema exactly, 
            # so we just return a list containing the response
            return [response.json()]
    
        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            # The API does not offer pagination, 
            # so we return None to indicate there are no more pages in the response
            return None
    Below are the command and the messages I'm getting in the terminal:
    Copy code
    (airbyte) ashish@ashish-ThinkPad-E14-Gen-2:~/repos/airbyte/airbyte-integrations/connectors/source-python-http-rates-api$ python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json --state sample_files/state.json
    {"type": "LOG", "log": {"level": "INFO", "message": "Starting syncing SourcePythonHttpRatesApi"}}
    {"type": "LOG", "log": {"level": "INFO", "message": "Syncing stream: exchange_rates "}}
    Inside state setter
    -- 2022-11-07 00:00:00
    {"type": "LOG", "log": {"level": "INFO", "message": "Setting state of exchange_rates stream to {'date': '2022-11-07'}"}}
    
    =================================================================================
    
    {"type": "STATE", "state": {"type": "STREAM", "stream": {"stream_descriptor": {"name": "exchange_rates"}, "stream_state": {"date": "2022-11-08"}}, "data": {"exchange_rates": {"date": "2022-11-08"}}}}
    
    ===================================================
    
    
    {"type": "STATE", "state": {"type": "STREAM", "stream": {"stream_descriptor": {"name": "exchange_rates"}, "stream_state": {"date": "2022-11-09"}}, "data": {"exchange_rates": {"date": "2022-11-09"}}}}
    {"type": "LOG", "log": {"level": "INFO", "message": "Read 3 records from exchange_rates stream"}}
    {"type": "LOG", "log": {"level": "INFO", "message": "Finished syncing exchange_rates"}}
    {"type": "LOG", "log": {"level": "INFO", "message": "SourcePythonHttpRatesApi runtimes:\nSyncing stream exchange_rates 0:00:00.384562"}}
    {"type": "LOG", "log": {"level": "INFO", "message": "Finished syncing SourcePythonHttpRatesApi"}}
    In the above logs it's clear that we are fetching data for
    2022-11-07
    2022-11-08
    2022-11-09
    dates starting from
    2022-11-07
    But my state.json always has this fixed value:
    Copy code
    {
      "exchange_rates": {
        "date": "2022-11-07"
      }
    }
    Can someone help me figure out what I'm doing wrong?
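    From those logs the cursor itself is advancing (the emitted STATE messages go from 2022-11-07 up to 2022-11-09); what never happens is writing the file back. When a connector runs inside the Airbyte platform, the platform persists the emitted STATE messages and passes them to the next sync, but python main.py read --state ... only reads the file you give it and never rewrites it, so when testing locally you have to update state.json yourself. A small sketch of capturing the last emitted state into the file - the grep/jq filter assumes the legacy data payload shown in your output:

    python main.py read --config secrets/config.json \
      --catalog sample_files/configured_catalog.json \
      --state sample_files/state.json | tee /tmp/read_output.log

    # keep the last STATE message's legacy `data` payload as the new state file
    grep '"type": "STATE"' /tmp/read_output.log | tail -n 1 \
      | jq -c '.state.data' > sample_files/state.json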
  • l

    laila ribke

    11/09/2022, 11:31 AM
    Hi all, in my MySQL -> Redshift (S3 staging) connection I see that the source emitted 10 million rows of the clickout table, but in the destination I have only 400K. I'm using incremental refresh. This source was used in the past with other destinations. Could that be the reason why I receive only 400K rows? For me it's a new connection, so it should do a full refresh first and only increment next time. But it seems to think it only has to increment.
  • s

    Satish Chinthanippu

    11/09/2022, 12:42 PM
    Hi team, while building Airbyte with SUB_BUILD=PLATFORM ./gradlew build on the master branch, I'm getting the error below. Can you please check?
    Copy code
    > Task :airbyte-api:compileJava
    /airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/generated/InternalApi.java:66: error: cannot find symbol
        memberVarAsyncResponseInterceptor = apiClient.getAsyncResponseInterceptor();
                                                     ^
      symbol:   method getAsyncResponseInterceptor()
      location: variable apiClient of type ApiClient
    /airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/generated/AttemptApi.java:62: error: cannot find symbol
        memberVarAsyncResponseInterceptor = apiClient.getAsyncResponseInterceptor();
                                                     ^
      symbol:   method getAsyncResponseInterceptor()
      location: variable apiClient of type ApiClient
    Note: /airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/invoker/generated/JSON.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
  • m

    M. Zein Ihza Fahrozi

    11/09/2022, 12:52 PM
    Hi, does anyone know why
    kube/resources/cron.yaml
    has this volume defined, but the Helm chart doesn't?
    Copy code
    volumes:
            - name: airbyte-volume-configs
              persistentVolumeClaim:
                claimName: airbyte-volume-configs
    related issue #16698
  • d

    Dheeraj Pranav

    11/09/2022, 1:30 PM
    Hi Airbyte team, is octavia import not going to save the credential info of our sources/destinations?
  • d

    Dusty Shapiro

    11/09/2022, 2:09 PM
    👋 Does anyone have any documentation on how to configure Airbyte via Helm/K8s to work with our own Temporal Cloud Server?
  • f

    flow

    11/09/2022, 2:34 PM
    Hello, how do I upload or list a connector I have developed?
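    Assuming the connector follows the standard CDK layout, the usual path is to build and publish its Docker image and then register it as a custom connector in the UI (Settings > Sources or Destinations > "+ New connector") using that image name and tag; contributing it back to the airbytehq/airbyte repo is a separate pull-request process described in the contributing docs. A sketch with placeholder names:

    # from the connector's directory, e.g. airbyte-integrations/connectors/source-mything
    docker build . -t <your-dockerhub-user>/source-mything:0.1.0
    docker push <your-dockerhub-user>/source-mything:0.1.0
    # then add it in the Airbyte UI under Settings with that image name and tag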
  • j

    jonty

    11/09/2022, 4:26 PM
    Can I ask those who are self-hosting what kind of instance size you are using? I've had to upgrade to a t3.xlarge (4 CPUs and 16 GB RAM) and it's still falling over when running a new sync. The largest table is about 800M records, but most are much smaller (<1 million).
  • a

    Adham Suliman

    11/09/2022, 5:26 PM
    Hello everyone, I have a pull request open for CallRail where the code passes all tests except for the
    backwards compatibility test
    . I’ve been told that the issue is due to the test not finding one of the schemas (conversations.json), but it’s clearly there. Any advice would be highly appreciated! Slack Conversation
  • g

    Guy Feldman

    11/09/2022, 5:47 PM
    Does anyone have recommendations on connector SDLC? How do you go about contributing connectors back to the main repo and keeping branches up to date?
  • d

    Domenic

    11/09/2022, 6:31 PM
    I'm new to Airbyte and am installing a local version using Docker. I ran the instructions as per the "Deploy Airbyte" page and executed the
    docker-compose up
    command. It's been several hours and it is still not complete. I can see in PowerShell that it is executing (isn't stalled), but I'm not sure if this is normal. Thoughts?
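    For what it's worth, docker-compose up without -d stays in the foreground and streams logs indefinitely - it never "completes". Once the images are pulled and the containers report as up, the UI should already be reachable. A couple of checks from another PowerShell window (container names assume the stock docker-compose file):

    docker ps --format "table {{.Names}}\t{{.Status}}"
    # airbyte-webapp, airbyte-server, airbyte-worker etc. should show "Up ..."; then open http://localhost:8000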
  • z

    Zack Peacock

    11/09/2022, 6:34 PM
    Is it possible, for example with Facebook Marketing in open source, to customize the fields requested within the default streams? I know we can just set up one custom stream per source, but it seems like the selections are somewhat limited.
  • v

    Venkat Dasari

    11/09/2022, 6:47 PM
    Folks, can we keep the database password etc. in a secrets manager and then allow Airbyte to read from it to make the connection?