# ask-community-for-troubleshooting
    Brian Hann

    03/16/2023, 2:29 PM
    Quick question: we have a single-tenant setup where each tenant has its own database instance. Is there a way to easily script and automate source and connection creation? I see in the docs there's an API, but it's only for cloud (beta) users. Edit: oh, maybe the "Configuration API" will do what I need.
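    A hedged sketch of what such automation could look like against the self-hosted Configuration API (endpoint path and field names follow the public Config API spec; host, credentials, and IDs below are placeholders):
    ```python
    import requests

    API = "http://localhost:8000/api/v1"  # assumes a local OSS deployment
    AUTH = ("airbyte", "password")        # default basic auth of the OSS proxy; change if customized

    def create_source(workspace_id: str, definition_id: str, name: str, config: dict) -> str:
        """Create a source via the Configuration API and return its id."""
        resp = requests.post(
            f"{API}/sources/create",
            auth=AUTH,
            json={
                "workspaceId": workspace_id,
                "sourceDefinitionId": definition_id,
                "name": name,
                # per-tenant database host/credentials go here
                "connectionConfiguration": config,
            },
        )
        resp.raise_for_status()
        return resp.json()["sourceId"]
    ```
    The same pattern should work for /destinations/create and /connections/create, so a per-tenant loop can build the whole pipeline.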
    Peter Kong

    03/16/2023, 4:16 PM
    Hello, I synced a single large table from Postgres to BigQuery. The Airbyte UI displays:
    Sync Succeeded
    Last attempt: 1.03TB
    But a simple
    psql$ \d+
    yields:
    table: 405GB
    Why does Airbyte claim it synced nearly double the expected size? Note: I cross-checked the number of records emitted: both Airbyte and psql report the same total number of records in the table.
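    One plausible explanation (not confirmed in the thread): the UI counts the bytes of the serialized records Airbyte emitted, while \d+ reports compressed on-disk size, so the two can legitimately differ by a factor of two or more. To compare like with like on the Postgres side (table name hypothetical):
    ```sql
    -- Heap size alone vs. total size including TOAST and indexes
    SELECT pg_size_pretty(pg_relation_size('my_table'))       AS heap_only,
           pg_size_pretty(pg_total_relation_size('my_table')) AS with_toast_and_indexes;
    ```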
    Layth Al-Ani

    03/16/2023, 5:51 PM
    Hello Airbyte team, where should the secretName and secretValue fields in values.yaml be set? I am setting up an external database.
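    A hedged sketch of the usual wiring, assuming the chart's global.database block (key names vary between chart versions, so verify against the values.yaml you deploy): create a Kubernetes secret holding the password, then reference it.
    ```yaml
    # values.yaml (sketch) -- point Airbyte at an external database
    postgresql:
      enabled: false              # disable the bundled Postgres
    externalDatabase:
      host: my-db.example.com     # placeholder
      port: 5432
      database: airbyte
      user: airbyte
    global:
      database:
        secretName: airbyte-db-secret    # name of an existing k8s secret
        secretValue: DATABASE_PASSWORD   # key inside that secret holding the password
    ```
    The secret itself can be created with, e.g., kubectl create secret generic airbyte-db-secret --from-literal=DATABASE_PASSWORD='...'.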
    Jean Lorillon

    03/16/2023, 6:14 PM
    Hey - got started with Airbyte OSS but I can't get it to work: localhost:8000 doesn't load. I'm getting
    airbyte-temporal         | {"level":"info","ts":"2023-03-16T18:14:08.589Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-add-search-attributes-task-queue/3","wf-task-queue-type":"Workflow","lifecycle":"Stopped","logging-call-at":"taskQueueManager.go:260"}
    I'm on an M1 MacBook Pro.
    Sam Richardson

    03/16/2023, 10:19 PM
    Hi all, just upgraded to self-hosted 0.42.0. Have the memory requirements of Airbyte changed? I was previously able to run the system without issue on a machine with 4GB of RAM. Now I seem to require 8GB to avoid the service having issues (I've enabled swap as well).
    Zaza Javakhishvili

    03/17/2023, 2:58 AM
    Hi 🙂 • 0.42.0 - Unable to load Setup Guides 😐
    Gilberto Vilar

    03/17/2023, 3:33 AM
    Is there an easy way to get my Airbyte sources' schemas? Maybe a Python SDK to interact with the Airbyte database?
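    As far as I know there is no official Python SDK for the internal database, but the Configuration API can be scripted; a hedged sketch (local host and default basic auth assumed):
    ```python
    import requests

    API = "http://localhost:8000/api/v1"  # assumes a local OSS deployment
    AUTH = ("airbyte", "password")

    workspace_id = requests.post(f"{API}/workspaces/list", auth=AUTH, json={}) \
        .json()["workspaces"][0]["workspaceId"]
    sources = requests.post(f"{API}/sources/list", auth=AUTH,
                            json={"workspaceId": workspace_id}).json()["sources"]

    for src in sources:
        # Runs schema discovery for each source and prints the resulting catalog.
        discovered = requests.post(f"{API}/sources/discover_schema", auth=AUTH,
                                   json={"sourceId": src["sourceId"]}).json()
        print(src["name"], discovered.get("catalog"))
    ```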
    Brian Castelli

    03/17/2023, 4:19 AM
    Is there a way to increase the number of reader and writer containers on Kubernetes? I've experimented with different combinations of the JOB_MAIN_CONTAINER_* parameters in the ConfigMap, but I have not been able to change the number of containers (4 for reader, 5 for writer), and overall transfer performance has only gotten worse. 😞 I doubled my machine type's resources and saw only a marginal increase in transfer. FWIW, my connection is from BigQuery to S3. I also see the same behavior when the source is Snowflake.
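    For reference, the JOB_MAIN_CONTAINER_* variables shape the resources of each job pod rather than the number of pods; the 4/5 containers you see are the connector container plus fixed sidecars, so their count is not a tuning knob. A sketch of the usual spelling (values are placeholders):
    ```yaml
    # airbyte-env ConfigMap (sketch) -- per-job pod resources, not replica counts
    JOB_MAIN_CONTAINER_CPU_REQUEST: "1"
    JOB_MAIN_CONTAINER_CPU_LIMIT: "2"
    JOB_MAIN_CONTAINER_MEMORY_REQUEST: "2Gi"
    JOB_MAIN_CONTAINER_MEMORY_LIMIT: "4Gi"
    ```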
    archna singh

    03/17/2023, 5:07 AM
    Can we install a lower version of Airbyte? If yes, how can we do it? Any references?
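    A hedged sketch of one common way to run an older release with Docker Compose (the tag below is illustrative; pick the release you need from the GitHub tags):
    ```bash
    git clone https://github.com/airbytehq/airbyte.git
    cd airbyte
    git checkout v0.40.32   # any released tag; .env in the repo pins matching image versions
    docker compose up -d
    ```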
    Lenin Mishra

    03/17/2023, 6:31 AM
    My Oauth2Authenticator is raising KeyError: 'access_token'. I am not sure what I am doing wrong.
    ```python
    # My code
    
        def streams(self, config: Mapping[str, Any]) -> List[Stream]:
            """
            TODO: Replace the streams below with your own streams.
    
            :param config: A Mapping of the user input configuration as defined in the connector spec.
            """
            # TODO remove the authenticator if not required.
            auth = Oauth2Authenticator(
                token_refresh_endpoint="https://accounts.zoho.eu/oauth/v2/token",
                client_id=config["client_id"],
                client_secret=config["client_secret"],
                refresh_token=config["refresh_token"],
            )  # Oauth2Authenticator is also available if you need oauth support
            args = {"authenticator": auth,
                    "organization_id": config["organization_id"], 
                    "date_start": config["date_start"]}
            return [Invoices(**args)]
    ```
    The result is
    ```
    {
      "type": "LOG",
      "log": {
        "level": "FATAL",
        "message": "'access_token'\nTraceback (most recent call last):\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/main.py\", line 13, in <module>\n    launch(source, sys.argv[1:])\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 131, in launch\n    for message in source_entrypoint.run(parsed_args):\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 122, in run\n    for message in generator:\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/abstract_source.py\", line 114, in read\n    stream_is_available, error = stream_instance.check_availability(logger, self)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/core.py\", line 190, in check_availability\n    return self.availability_strategy.check_availability(self, logger, source)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 45, in check_availability\n    get_first_record_for_slice(stream, stream_slice)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py\", line 38, in get_first_record_for_slice\n    return next(records_for_slice)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 423, in read_records\n    yield from self._read_pages(\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 439, in _read_pages\n    request, response = self._fetch_next_page(stream_slice, stream_state, next_page_token)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 453, in _fetch_next_page\n    request = self._create_prepared_request(\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 280, in _create_prepared_request\n    return self._session.prepare_request(requests.Request(**args))\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/requests/sessions.py\", line 484, in prepare_request\n    p.prepare(\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/requests/models.py\", line 372, in prepare\n    self.prepare_auth(auth, url)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/requests/models.py\", line 603, in prepare_auth\n    r = auth(self)\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py\", line 28, in __call__\n    request.headers.update(self.get_auth_header())\n  File 
\"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py\", line 33, in get_auth_header\n    return {\"Authorization\": f\"Bearer {self.get_access_token()}\"}\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py\", line 38, in get_access_token\n    token, expires_in = self.refresh_access_token()\n  File \"/Users/pylenin/airbyte/airbyte-integrations/connectors/source-zoho-books/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py\", line 99, in refresh_access_token\n    return response_json[self.get_access_token_name()], int(response_json[self.get_expires_in_name()])\nKeyError: 'access_token'"
      }
    }
    ```
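    The trailing KeyError: 'access_token' means the token endpoint answered without an access_token field, typically an error payload. A hedged way to see what Zoho actually returns, outside the CDK (parameter names per Zoho's OAuth docs; values are placeholders):
    ```python
    import requests

    resp = requests.post(
        "https://accounts.zoho.eu/oauth/v2/token",
        params={
            "grant_type": "refresh_token",
            "client_id": "<client_id>",
            "client_secret": "<client_secret>",
            "refresh_token": "<refresh_token>",
        },
    )
    # Look for an "error" field (wrong data center, bad scope, expired refresh token, ...)
    print(resp.status_code, resp.json())
    ```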
    Ulf Dammertz

    03/17/2023, 9:42 AM
    Hi! I am currently setting up an Airbyte (OSS) pipeline to pump data from an MS SQL Server to BigQuery. This has been a lot easier than I had anticipated, but I am experiencing a problem that I fail to solve myself, and the solutions I stumble upon involve an amount of knowledge that I cannot ingest at the moment (e.g. dbt).

    My issue is that the SQL Server source converts dates to strings in JSON. The BigQuery destination connector could convert these back into dates if the format label in the JSON schema were set to date-time, but that is not the case, and that seems to be the end of the documentation. One solution would be to write a transformation in dbt myself, but then I would have to rewrite that transformation every time I add or remove data fields. Another would be to do the conversion within BigQuery after basic normalization, using a stored SQL statement, but I would not be able to trigger this with Airbyte, and it has the same disadvantage as dbt regarding regular manual updates. I could also add another tool for orchestration, but that adds more flexibility than I'd like for this fairly simple task.

    I have tried multiple data types that could carry my MS SQL Server date information, but that changes nothing. Airbyte is recognizing the field as a date: ... column Posting Date (type datetime[23], nullable false) -> JsonSchemaType({type=string})

    This seems like a very common requirement, so I really hope there is a simple solution. #tldr: Is there a way to tell the MS SQL Server connector that I would really like it to add the correct format label "date-time" to a string containing a date?
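    A minimal workaround sketch that avoids a full dbt project: a one-time view in BigQuery that casts the string back (dataset, table, and column names are hypothetical; SAFE_CAST returns NULL instead of failing on bad rows):
    ```sql
    CREATE OR REPLACE VIEW my_dataset.my_table_typed AS
    SELECT * REPLACE (SAFE_CAST(Posting_Date AS DATETIME) AS Posting_Date)
    FROM my_dataset.my_table;
    ```
    SELECT * REPLACE keeps every other column untouched, so adding or removing unrelated fields does not require editing the view.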
    navod perera

    03/17/2023, 9:59 AM
    (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1122)'))) I'm getting the above error. Can I disable SSL certificate verification when creating a custom connector?
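    Disabling verification is insecure, but if the connector is built on the CDK's HttpStream, one hedged possibility is returning verify=False from request_kwargs, which the CDK passes through to the underlying requests session (confirm your CDK version honors it):
    ```python
    from typing import Any, Mapping, Optional

    from airbyte_cdk.sources.streams.http import HttpStream


    class MyStream(HttpStream):
        # url_base, path, parse_response, next_page_token elided for brevity

        def request_kwargs(
            self,
            stream_state: Mapping[str, Any],
            stream_slice: Optional[Mapping[str, Any]] = None,
            next_page_token: Optional[Mapping[str, Any]] = None,
        ) -> Mapping[str, Any]:
            # Forwarded to requests when the HTTP call is sent; skips TLS verification.
            return {"verify": False}
    ```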
    Jonty Knox

    03/17/2023, 11:01 AM
    Hey, I was wondering if there is any way to get an idea of when certain community PRs will be merged to master? Specifically https://github.com/airbytehq/airbyte/pull/22910
    Karri Shivaharsha

    03/17/2023, 11:49 AM
    Hello team, I have a couple of questions; could you please help? First, let me explain what I am trying to achieve.

    Detailed description: I pulled open-source Airbyte to develop a custom source for SAP HANA, and I was successful in developing the required functions (check, discover, read). I verified them with python main.py check / discover / read, using secrets and a catalog for read. After that I built the image of the custom source according to the README. From here I am stuck on a few problems:

    #1 How can I see my custom source in the Airbyte UI after developing it? Are there any next steps after implementing those functions, such as configuring source definitions? If so, is there detailed documentation?
    #2 Do we need to configure AWS credentials in the repo's .env file in order to connect to an S3 destination?
    #3 Is there a docker command that reads the configured catalog and writes to local JSON or some other destination (see the sketch below)?

    Please help me with the above problems.
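    For #1, custom connectors can usually be registered from the UI (Settings > Sources > add a new connector pointing at your Docker image); check the connector-development docs for your version. For #3, the connector READMEs describe a local read along these lines (paths and the image name refer to your custom source, so they are placeholders), with emitted records redirected to a local file:
    ```bash
    docker run --rm \
      -v $(pwd)/secrets:/secrets \
      -v $(pwd)/integration_tests:/integration_tests \
      airbyte/source-sap-hana:dev \
      read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json \
      > records.jsonl
    ```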
    DR

    03/17/2023, 12:31 PM
    How do I convert the datatype from "integer" to "string" for a particular column? I am moving data from an S3 source to a BigQuery destination. I have 20+ columns in my S3 file. One of the columns contains all integer values except for a single row that contains a string value. The mapping is shown as Integer, and hence that row causes the following error. What would be the best way to rectify this problem? I tried setting "Infer Data Types" to false, but Airbyte then marked all fields as "String". I just need a mechanism to mark only the problem column as String while letting Airbyte infer the other fields as before. How can I achieve this?
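    One hedged option: some versions of the S3 source accept an optional user-provided schema (a JSON map of column name to type) alongside inference; whether unlisted columns are still inferred varies by connector version, so this may require listing every column. The column name below is hypothetical:
    ```json
    { "column_with_mixed_values": "string" }
    ```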
    Leo Schick

    03/17/2023, 12:36 PM
    Hey 👋 I am new to Airbyte. I added a source and now want to add a destination to it, but I get the following error message:
    T Viswanathan

    03/17/2023, 1:24 PM
    I ran into a problem when I tried to move data from MSSQL to DuckDB. Though the target DB gets created, it cannot be accessed through Python (irrespective of version; I tried 0.4 through 0.7.1). I get this error:
    Chen Lin

    03/17/2023, 2:40 PM
    Hi all, I have an S3 connector that had been working fine. A couple of days ago the source file in S3 got corrupted, so the connector started to fail with this error message:
    source > Key: path/to/file/image001.png,Campaign_Metrics.csv
    We fixed that, and those files are no longer in S3, but somehow the connector is still throwing the same error. What should I do to make the connector see the latest files in the bucket?
    Yuva

    03/17/2023, 3:22 PM
    Hi all, is there a Slack channel here dedicated to running dbt Core triggered from Airbyte? Based on https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-airbyte/
    Dhruv Saxena

    03/17/2023, 3:24 PM
    Hi, I'm getting this error while setting up Yugabyte as a source connector.
    ```
    2023-03-17 14:01:33 INFO i.a.w.p.DockerProcessFactory(create):130 - Creating docker container = destination-yugabytedb-check-6bc42e07-15f1-45fe-b43a-d9717200adb6-0-uexlh with resources io.airbyte.config.ResourceRequirements@1f65b8be[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] and allowedHosts null
    2023-03-17 14:01:33 INFO i.a.w.p.DockerProcessFactory(create):175 - Preparing command: docker run --rm --init -i -w /data/6bc42e07-15f1-45fe-b43a-d9717200adb6/0 --log-driver none --name destination-yugabytedb-check-6bc42e07-15f1-45fe-b43a-d9717200adb6-0-uexlh --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e STRICT_COMPARISON_NORMALIZATION_WORKSPACES= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-yugabytedb:0.1.0 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e STRICT_COMPARISON_NORMALIZATION_TAG=strict_comparison2 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e FEATURE_FLAG_CLIENT= -e AIRBYTE_VERSION=0.42.0 -e WORKER_JOB_ID=6bc42e07-15f1-45fe-b43a-d9717200adb6 airbyte/destination-yugabytedb:0.1.0 check --config source_config.json
    2023-03-17 14:01:33 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):107 - Reading messages from protocol version 0.2.0
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - starting destination: class io.airbyte.integrations.destination.yugabytedb.YugabytedbDestination
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - integration args: {check=null, config=source_config.json}
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - Running integration: io.airbyte.integrations.destination.yugabytedb.YugabytedbDestination
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - Command: CHECK
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-03-17 14:01:34 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-03-17 14:01:34 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - HikariPool-1 - Starting...
    2023-03-17 14:01:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - HikariPool-1 - Start completed.
    2023-03-17 14:02:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - HikariPool-1 - Shutdown initiated...
    2023-03-17 14:02:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - HikariPool-1 - Shutdown completed.
    2023-03-17 14:02:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - Completed integration: io.airbyte.integrations.destination.yugabytedb.YugabytedbDestination
    2023-03-17 14:02:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - completed destination: class io.airbyte.integrations.destination.yugabytedb.YugabytedbDestination
    2023-03-17 14:02:36 INFO i.a.w.g.DefaultCheckConnectionWorker(run):120 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@556cbef3[status=failed,message=State code: 08001; Message: The connection attempt failed.]
    2023-03-17 14:02:36 INFO i.a.w.t.TemporalAttemptExecution(get):169 - Stopping cancellation check scheduling...
    2023-03-17 14:02:36 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-03-17 14:02:36 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK -----
    2023-03-17 14:02:36 INFO i.a.c.i.LineGobbler(voidCall):149 -
    ```
    Krisjan Oldekamp

    03/17/2023, 3:45 PM
    Hi! I've managed to spin up Airbyte on a GKE Autopilot cluster using the Helm charts (45.0). Two questions: • When I try to add a source or destination, it takes some time, and every time it returns the same error: non-json response ◦ When I go to the workloads in GCP, I can see the following errors (see screenshot), but I have no idea what they mean. • Do I need to set storageClassName: standard-rwo (as I'm on GKE), as described in the documentation for Kubernetes with Kustomize (although I'm using Helm)? If so, there seems to be no setting for this in the Helm chart? -> https://docs.airbyte.com/deploying-airbyte/on-kubernetes/#persistent-storage-on-google-kubernetes-enginegke-regional-cluster
    Théo Bassignani

    03/17/2023, 3:48 PM
    Hello, I am trying to run Airbyte in Docker on my local machine. However, I keep getting a Temporal error, so I can't access the webserver. I have tried several versions of Airbyte but I have the same problem:
    ```
    airbyte-worker                      | 2023-03-17 15:43:15 WARN i.a.c.t.TemporalUtils(getTemporalClientWhenConnected):245 - Ignoring exception while trying to request Temporal namespace:
    airbyte-worker                      | io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 9.999403394s. 
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.getSystemInfo(WorkflowServiceGrpc.java:4139) ~[temporal-serviceclient-1.17.0.jar:?]
    airbyte-worker                      | 	at io.temporal.serviceclient.SystemInfoInterceptor.getServerCapabilitiesOrThrow(SystemInfoInterceptor.java:95) ~[temporal-serviceclient-1.17.0.jar:?]
    airbyte-worker                      | 	at io.temporal.serviceclient.SystemInfoInterceptor$1.start(SystemInfoInterceptor.java:81) ~[temporal-serviceclient-1.17.0.jar:?]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.startCall(ClientCalls.java:341) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.asyncUnaryRequestCall(ClientCalls.java:315) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.futureUnaryCall(ClientCalls.java:227) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:154) ~[grpc-stub-1.50.2.jar:1.50.2]
    airbyte-worker                      | 	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.describeNamespace(WorkflowServiceGrpc.java:3662) ~[temporal-serviceclient-1.17.0.jar:?]
    ```
    Chetan M

    03/17/2023, 4:17 PM
    Please share any documentation regarding the structure of the configured_catalog.json to be passed to the write function.
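    The shape comes from the Airbyte protocol's ConfiguredAirbyteCatalog: each entry wraps an AirbyteStream plus the chosen sync modes. A minimal hedged example (stream name and schema are placeholders):
    ```json
    {
      "streams": [
        {
          "stream": {
            "name": "users",
            "json_schema": {
              "type": "object",
              "properties": { "id": { "type": "integer" }, "email": { "type": "string" } }
            },
            "supported_sync_modes": ["full_refresh"]
          },
          "sync_mode": "full_refresh",
          "destination_sync_mode": "overwrite"
        }
      ]
    }
    ```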
    Dustin Pearson

    03/17/2023, 4:38 PM
    Infinite Dynamic Properties in Source Stream Context
    • I am working on a PR to support the Profiles API for Klaviyo
    • The endpoint returns an array of profiles, each carrying an object that can have infinitely many dynamic properties. Ex:
    ```
    {
      "properties": {
        "CUSTOM_KEY_1": INT|STR|ARRAY,
        "CUSTOM_KEY_2": ...,
        ...
      }
    }
    ```
    Problem
    • There is no way to detect this property list besides listing every item from the endpoint (a full refresh)
    • A full refresh will take over an hour for users with >= 500,000 leads in Klaviyo
      ◦ 60 requests per minute, 100 max page size => 500000 / 6000 = ~83 minutes
    Investigation
    • I have looked through the codebase trying to find an example of supporting an object with a dynamic number of properties; I cannot find such a use case.
    • Codebase examples:
      ◦ The Airtable integration has top-level "Base" or "Table" concepts with very few entities, and extracts the properties from there.
      ◦ The Mixpanel integration seems to do a full refresh of every Profile in Mixpanel to achieve this.
    Options
    • (Love) A better solution from someone in this chat
    • (Like) Find a way to store a JSONB column with this dynamic list of properties and let Airbyte customers handle extraction themselves (see the schema fragment below)
    • (Dislike) Behave like the Mixpanel integration and do a full refresh to determine properties
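    For the JSONB-style option, JSON Schema can already express an open-ended map: typing the properties field as an object with additionalProperties lets records carry arbitrary keys without enumerating them up front. A hedged stream-schema fragment:
    ```json
    {
      "type": "object",
      "properties": {
        "properties": { "type": "object", "additionalProperties": true }
      }
    }
    ```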
    Annika Maybin

    03/17/2023, 6:53 PM
    I posted about this before, but I thought the error was gone - unfortunately it's not. Airbyte v0.42.0, Redshift connector v0.4.2, and MySQL connector v2.0.3. One of my table syncs reports
    Invalid value for DayOfMonth
    which I think makes it fail. The sync succeeded once, but it only synced about 1/10 of the records. The sync mode does not influence the result. Logs:
    ```
    2023-03-17 18:17:43 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):539 - failures: [ {
      "failureOrigin" : "source",
      "failureType" : "system_error",
      "internalMessage" : "java.time.DateTimeException: Invalid value for DayOfMonth (valid values 1 - 28/31): 0",
      "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 495,
        "from_trace_message" : true,
        "connector_command" : "read"
      },
      "stacktrace" : "java.time.DateTimeException: Invalid value for DayOfMonth (valid values 1 - 28/31): 0\n\tat java.base/java.time.temporal.ValueRange.checkValidValue(ValueRange.java:319)\n\tat java.base/java.time.temporal.ChronoField.checkValidValue(ChronoField.java:718)\n\tat java.base/java.time.LocalDate.of(LocalDate.java:272)\n\tat java.base/java.time.LocalDateTime.of(LocalDateTime.java:363)\n\tat com.mysql.cj.result.LocalDateTimeValueFactory.localCreateFromDatetime(LocalDateTimeValueFactory.java:86)\n\tat com.mysql.cj.result.LocalDateTimeValueFactory.localCreateFromDatetime(LocalDateTimeValueFactory.java:44)\n\tat com.mysql.cj.result.AbstractDateTimeValueFactory.createFromDatetime(AbstractDateTimeValueFactory.java:104)\n\tat com.mysql.cj.protocol.a.MysqlBinaryValueDecoder.decodeDatetime(MysqlBinaryValueDecoder.java:123)\n\tat com.mysql.cj.protocol.result.AbstractResultsetRow.decodeAndCreateReturnValue(AbstractResultsetRow.java:86)\n\tat com.mysql.cj.protocol.result.AbstractResultsetRow.getValueFromBytes(AbstractResultsetRow.java:243)\n\tat com.mysql.cj.protocol.a.result.BinaryBufferRow.getValue(BinaryBufferRow.java:244)\n\tat com.mysql.cj.jdbc.result.ResultSetImpl.getLocalDateTime(ResultSetImpl.java:959)\n\tat com.mysql.cj.jdbc.result.ResultSetImpl.getObject(ResultSetImpl.java:1282)\n\tat com.zaxxer.hikari.pool.HikariProxyResultSet.getObject(HikariProxyResultSet.java)\n\tat io.airbyte.db.jdbc.AbstractJdbcCompatibleSourceOperations.rowToJson(AbstractJdbcCompatibleSourceOperations.java:54)\n\tat io.airbyte.db.jdbc.AbstractJdbcCompatibleSourceOperations.rowToJson(AbstractJdbcCompatibleSourceOperations.java:37)\n\tat io.airbyte.db.jdbc.StreamingJdbcDatabase$1.tryAdvance(StreamingJdbcDatabase.java:102)\n\tat java.base/java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.LazyAutoCloseableIterator.computeNext(LazyAutoCloseableIterator.java:42)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat 
com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)\n\tat io.airbyte.integrations.base.IntegrationRunner.lambda$produceMessages$0(IntegrationRunner.java:187)\n\tat io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:237)\n\tat io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:186)\n\tat io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:139)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:98)\n\tat io.airbyte.integrations.source.mysql.MySqlSource.main(MySqlSource.java:400)\n\tSuppressed: java.lang.RuntimeException: java.sql.SQLException: Streaming result set com.mysql.cj.protocol.a.result.ResultsetRowsStreaming@42fcc7e6 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.\n\t\tat
    ```
    Thanks, guys, for any hints on what is going on.
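    That exception usually points at MySQL "zero dates" (0000-00-00), which java.time rejects. If your MySQL source version exposes a JDBC URL params field, one hedged mitigation is Connector/J's zero-date handling:
    ```
    zeroDateTimeBehavior=CONVERT_TO_NULL
    ```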
    Vincent Cotineau

    03/17/2023, 7:51 PM
    Hi all, I am working with Airbyte to replicate data from Postgres to Snowflake. The source database contains ~1000 tables, but only 423 need to be replicated to the target. I did the tedious work of clicking 423 times in the "Replication" section of my connection to put that in place. I want to know if I can somehow export this configuration and work with it in an easier way. I looked in the airbyte-db internal database, and the "connection" table seems to have what I need: a field_selection_data column containing JSON like "<table_name>": true, "<table_name>": false. BUT it does not contain the proper data: there is apparently no link between this table and the actual configuration of the task. Then I tried to use the API (yes, I should have done that sooner), but when I try to update the connection with the JSON payload I got from the API, just to see if the endpoint works correctly, I get Java NullPointerExceptions... To put it shortly: is there an easy way to extract a connection configuration, modify it, and then import it again? Many thanks to anyone able to help 😊
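    A hedged sketch of that round-trip via the Configuration API's web_backend endpoints, which return and accept the full stream list (host, auth, connection id, and the table allow-list are placeholders; the update payload generally has to echo the whole syncCatalog back):
    ```python
    import requests

    API = "http://localhost:8000/api/v1"
    AUTH = ("airbyte", "password")
    CONN_ID = "<connection-uuid>"

    conn = requests.post(f"{API}/web_backend/connections/get",
                         auth=AUTH, json={"connectionId": CONN_ID}).json()

    wanted = {"table_a", "table_b"}  # hypothetical allow-list of the 423 names
    for entry in conn["syncCatalog"]["streams"]:
        # Flip stream selection programmatically instead of clicking 423 times.
        entry["config"]["selected"] = entry["stream"]["name"] in wanted

    resp = requests.post(f"{API}/web_backend/connections/update", auth=AUTH,
                         json={"connectionId": CONN_ID, "syncCatalog": conn["syncCatalog"]})
    resp.raise_for_status()
    ```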
    Zoran Stipanicev

    03/17/2023, 10:17 PM
    Hi all, we developed a connector for vTiger CRM, and after upgrading to 0.42.0 we are getting a schema error:
    ```
    Error: Internal Server Error: json schema nodes should always be object nodes. path: [FieldNameOrList{fieldName='result', isList=false}, FieldNameOrList{fieldName='null', isList=true}] actual: [{"type":"object","properties":{"deleted":{"db_type":"integer","default":"","description":"","type":"string","title":"deleted"},"currency_name":{"db_type":"varchar","default":"","description":"","type":"string","title":"currency name"},"currency_symbol":{"db_type":"varchar","default":"","description":"","type":"string","title":"currency symbol"},"currency_status":{"db_type":"varchar","default":"","description":"","type":"string","title":"currency status"},"id":{"db_type":"varchar","default":"","description":"","type":"string","title":"id"},"defaultid":{"db_type":"varchar","default":"","description":"","type":"string","title":"defaultid"},"conversion_rate":{"db_type":"varchar","default":"","description":"","type":"string","title":"conversion rate"},"currency_code":{"db_type":"varchar","default":"","description":"","type":"string","title":"currency code"}}}]
    ```
    The schema file is attached. I have validated the schema, and the data against the schema, using a few online JSON validators, and got no errors. The error appears after clicking the Review changes button (see screenshot). Here is a sample of the data in the raw table's _airbyte_data column:
    ```
    {
      "result": [{
          "currency_name": "USA, Dollars",
          "deleted": "0",
          "currency_symbol": "$",
          "currency_status": "Active",
          "id": "21x1",
          "defaultid": "-11",
          "conversion_rate": "1.00000",
          "currency_code": "USD"
        }, {
          ...
        }, {
          ...
        }
      ],
      "success": true
    }
    ```
    Any help would be much appreciated, thank you!
    currency.json
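    One hedged reading of the error: the validator found a JSON array where it expected a schema object, which is what happens when items is written in tuple form. If the schema for result currently looks like the first fragment, the second (a single schema object) is the form the platform expects:
    ```
    "result": { "type": "array", "items": [ { "type": "object", "properties": { ... } } ] }
    ```
    ```
    "result": { "type": "array", "items": { "type": "object", "properties": { ... } } }
    ```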

    Johannes Müller

    03/18/2023, 10:41 AM
    Could you share your impressions of which orchestrator you think works best for an initial setup with Airbyte/dbt?
    Johannes Müller

    03/18/2023, 10:41 AM
    I did a course on Airflow and read up on Dagster & Prefect.
    Johannes Müller

    03/18/2023, 10:41 AM
    I very much like the concept of Dagster and its declarative approach, like automatically scheduling the necessary parts of a pipeline when a destination needs to be refreshed. Prefect has a built-in caching mechanism that should make development easier, but I could probably implement that myself if necessary.