# ask-community-for-troubleshooting

    Bob Blandford

    06/24/2023, 11:40 PM
    2023-06-24 23:35:17 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/d4cca46c-1eae-4dc4-afb0-0e931e155a97/0/logs.log
    2023-06-24 23:35:17 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: dev-335e7fcea0-cloud
    2023-06-24 23:35:17 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):229 - Attempt 0 to save workflow id for cancellation
    2023-06-24 23:35:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
    2023-06-24 23:35:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
    2023-06-24 23:35:17 INFO i.a.c.EnvConfigs(getEnvOrDefault):1228 - Using default value for environment variable OTEL_COLLECTOR_ENDPOINT: ''
    2023-06-24 23:35:17 INFO i.a.w.p.KubeProcessFactory(create):107 - Attempting to start pod = t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru for airbyte/source-oracle-strict-encrypt:0.3.17 with resources io.airbyte.config.ResourceRequirements@b76563f[cpuRequest=1,cpuLimit=2,memoryRequest=2Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts null
    2023-06-24 23:35:17 INFO i.a.w.p.KubeProcessFactory(create):111 - t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru stdoutLocalPort = 9073
    2023-06-24 23:35:17 INFO i.a.w.p.KubeProcessFactory(create):114 - t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru stderrLocalPort = 9074
    2023-06-24 23:35:17 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):658 - Creating stdout socket server...
    2023-06-24 23:35:17 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):676 - Creating stderr socket server...
    2023-06-24 23:35:17 INFO i.a.w.p.KubePodProcess(<init>):584 - Creating pod t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru...
    2023-06-24 23:35:17 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):362 - Waiting for init container to be ready before copying files...
    2023-06-24 23:35:17 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):366 - Init container present..
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(waitForInitPodToRun):369 - Init container ready..
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(<init>):615 - Copying files...
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: source_config.json
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/d2b2e2eb-f964-43ea-9cf4-6c015759c214/source_config.json jobs/t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru:/config/source_config.json -c init
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):311 - Uploading file: FINISHED_UPLOADING
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):319 - kubectl cp /tmp/a5a2f9eb-7bbf-4b08-991a-ea19bf1083aa/FINISHED_UPLOADING jobs/t-encrypt-discover-d4cca46c-1eae-4dc4-afb0-0e931e155a97-0-lveru:/config/FINISHED_UPLOADING -c init
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):322 - Waiting for kubectl cp to complete
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(copyFilesToKubeConfigVolume):336 - kubectl cp complete, closing process
    2023-06-24 23:35:19 INFO i.a.w.p.KubePodProcess(<init>):618 - Waiting until pod is ready...
    2023-06-24 23:35:20 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$11):667 - Setting stdout...
    2023-06-24 23:35:20 INFO i.a.w.p.KubePodProcess(lambda$setupStdOutAndStdErrListeners$12):679 - Setting stderr...
    2023-06-24 23:35:21 INFO i.a.w.p.KubePodProcess(<init>):634 - Reading pod IP...
    2023-06-24 23:35:21 INFO i.a.w.p.KubePodProcess(<init>):636 - Pod IP: 172.32.6.115
    2023-06-24 23:35:21 INFO i.a.w.p.KubePodProcess(<init>):643 - Using null stdin output stream...
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 INFO i.a.i.s.o.OracleStrictEncryptSource(main):37 - starting source: class io.airbyte.integrations.source.oracle_strict_encrypt.OracleStrictEncryptSource
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {discover=null, config=source_config.json}
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 INFO i.a.i.b.IntegrationRunner(runInternal):104 - Running integration: io.airbyte.integrations.source.oracle_strict_encrypt.OracleStrictEncryptSource
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 INFO i.a.i.b.IntegrationRunner(runInternal):105 - Command: DISCOVER
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Integration config: IntegrationConfig{command=DISCOVER, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-24 23:35:21 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-06-24 23:35:22 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:22 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL
    2023-06-24 23:35:22 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:22 INFO c.z.h.HikariDataSource(<init>):80 - HikariPool-1 - Starting...
    2023-06-24 23:35:22 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:22 INFO c.z.h.HikariDataSource(<init>):82 - HikariPool-1 - Start completed.
    2023-06-24 23:35:23 INFO i.a.w.i.VersionedAirbyteStreamFactory(toAirbyteMessage):364 - 2023-06-24 23:35:23 INFO i.a.i.s.j.AbstractJdbcSource(discoverInternal):123 - Internal schemas to exclude: []

    Bob Blandford

    06/24/2023, 11:40 PM
    Is there a max number of tables for an Oracle schema?
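    No hard cap is stated here, so one way to gauge what discover is up against is simply to count the schema's tables first. A minimal sketch, assuming the python-oracledb client; the credentials, DSN, and MYSCHEMA owner below are placeholders:

    import oracledb  # pip install oracledb

    # Placeholder credentials and DSN -- substitute your own.
    conn = oracledb.connect(user="AIRBYTE", password="***",
                            dsn="db.example.com:1521/ORCLPDB1")
    with conn.cursor() as cur:
        # ALL_TABLES lists every table visible to this user for a given owner.
        cur.execute("SELECT COUNT(*) FROM all_tables WHERE owner = :owner",
                    owner="MYSCHEMA")
        print("tables discover would scan:", cur.fetchone()[0])
    conn.close()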

    Bob Blandford

    06/24/2023, 11:56 PM
    Is Excel a supported destination?

    Bob Blandford

    06/24/2023, 11:57 PM
    Is Microsoft Excel a supported CSV destination?

    Richard W

    06/25/2023, 2:20 AM
    Hi, I'm trying to create a replicated database from my PlanetScale source to an internal MySQL 8 server. One column is an 1800-character text field, and it is failing with this message: mysql Truncated incorrect CHAR(1024). At first I thought it was because it was a json column type, so I switched to text, but it still fails.
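    The Truncated incorrect CHAR(1024) message typically comes from a CAST(... AS CHAR(1024)) somewhere in the pipeline truncating longer values. A minimal sketch for finding the offending rows on the source side, assuming PyMySQL; the host, credentials, and the my_table/long_text_col names are placeholders:

    import pymysql  # pip install pymysql

    conn = pymysql.connect(host="mysql.example.com", user="airbyte",
                           password="***", database="mydb")
    with conn.cursor() as cur:
        # Find values longer than the 1024 characters a CHAR(1024) cast can hold.
        cur.execute(
            "SELECT id, CHAR_LENGTH(long_text_col) AS len "
            "FROM my_table "
            "WHERE CHAR_LENGTH(long_text_col) > 1024 "
            "ORDER BY len DESC LIMIT 10"
        )
        for row in cur.fetchall():
            print(row)
    conn.close()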

    Bob Blandford

    06/25/2023, 10:19 AM
    ERROR i.a.c.i.LineGobbler(voidCall):149 - SLF4J: Class path contains multiple SLF4J bindings.

    Bob Blandford

    06/25/2023, 10:22 AM
    ERROR i.a.c.i.LineGobbler(voidCall):149 - SLF4J: Found binding in [jarfile/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]

    Sivakumar Ramaswamy

    06/25/2023, 10:23 AM
    Could you let me know how to build the metrics-reporter image? I am facing a Java cast issue and want to fix it; however, there is no documentation on how to build the image from source. It would be helpful if you could shed some light on this.

    Bob Blandford

    06/25/2023, 10:25 AM
    Multiple bindings were found on the class path

    Scott Sung

    06/25/2023, 10:24 PM
    Morning all, just looking to get some insight/help from anyone who has been successful with an Amazon Seller Partner API connection. I had a week of successful data replication until it suddenly stopped working. Retesting the source connection gives "HTTPError('400 Client Error: Bad Request for url: https://sellingpartnerapi-na.amazon.com/reports/2021-06-30/reports')". I have made no changes at all, and the error started exactly a week after the first data replication day. Anyone know a possible cause/fix for this? Much appreciated.
    check_connection_source-failure-59830ee1-1cd6-4b5e-92c6-186532469fa2.txt
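    One way to see whether the 400 comes from Amazon or from the connector is to hit the same endpoint directly; the response body usually names the rejected parameter. A minimal sketch, assuming the requests library; the LWA access token is a placeholder and the report type is just an example:

    import requests  # pip install requests

    resp = requests.get(
        "https://sellingpartnerapi-na.amazon.com/reports/2021-06-30/reports",
        headers={"x-amz-access-token": "<LWA_ACCESS_TOKEN>"},  # placeholder
        params={"reportTypes": "GET_FLAT_FILE_OPEN_LISTINGS_DATA"},  # example
    )
    print(resp.status_code)
    print(resp.text)  # a 400 body spells out why the request was rejected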

    Jere Halligan

    06/26/2023, 2:16 AM
    Hey everyone - I'm having trouble authenticating to Facebook Pages (beta). What I've done so far is to authenticate with Facebook (success message) and enter the Page ID, and then the test never succeeds and times out. I've tried it a few times with the same error.
    Configuration check failed
    The check connection failed because of an internal error
    Any ideas what could be the issue? Thanks.
    check_connection_source-failure-b2b031d8-99ef-4034-b42a-3373aafe4373.txt

    Nipuna Prashan

    06/26/2023, 4:18 AM
    Hi, how do I increase the MSSQL connector's concurrency thread count? It always shows as 5.

    Slackbot

    06/26/2023, 5:48 AM
    This message was deleted.

    Nikhil Garakapati

    06/26/2023, 7:12 AM
    Hi folks, I'm trying to move MySQL data to BigQuery using Incremental | Deduped + history, but the sync is failing with Failure Origin: normalization, Message: Something went wrong during normalization. I'm attaching the log file. Looking for help.
    0fc7191d_758d_47a0_a3a3_3e2624677c46_logs_28_txt.txt

    Jack Reid

    06/26/2023, 8:24 AM
    👋 I am currently using the Facebook Marketing connector and have noticed that the campaigns stream is missing the status & configured_status fields - are these something that can be added?

    Kevin Conseil

    06/26/2023, 9:08 AM
    @kapa.ai Can Airbyte read data from a Google Sheet where the first column is hidden?

    Slackbot

    06/26/2023, 10:03 AM
    This message was deleted.

    Oliver Iglesias

    06/26/2023, 10:11 AM
    Hi, I have updated my local Airbyte to v0.50.4. When I create a connection between Postgres and BigQuery I can no longer find the 'basic normalization' feature. Instead, when I go to the Transformation tab of the connection it says: 'Normalization and Transformation operations are not supported for this connection.' In my prod Airbyte environment (running an old version) I have the same type of connections (Postgres -> BQ) but normalizing the data into tables in the destination. Have I done something wrong in my local Airbyte instance, or does v0.50.4 no longer allow basic normalization? Thanks!

    Arumugam S

    06/26/2023, 10:20 AM
    hi team. I'm using Airbyte OSS v0.40.32 deployed on EC2. I have built a connection from Google Analytics (UA) to Postgres with incremental deduped + history; the sync runs on a daily basis. It worked fine until 21-June-2023; after that it throws:
    duplicate key value violates unique constraint "73eb860dc700e88e96781dd9482ff850"
    2023-06-26 10:12:33 normalization > DETAIL: Key (_airbyte_unique_key)=(00033ac442e9dcb3545933f1362bfe88) already exists.
    2023-06-26 10:12:33 normalization > compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/public/traffic_sources.sql
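    A minimal sketch for inspecting the failure: list the _airbyte_unique_key values that now occur more than once in the final table, which is what the unique constraint is tripping over. This assumes psycopg2 and placeholder connection details:

    import psycopg2  # pip install psycopg2-binary

    conn = psycopg2.connect(host="pg.example.com", dbname="analytics",
                            user="airbyte", password="***")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT _airbyte_unique_key, COUNT(*) AS copies
            FROM public.traffic_sources
            GROUP BY _airbyte_unique_key
            HAVING COUNT(*) > 1
            ORDER BY copies DESC
            LIMIT 20
        """)
        for key, copies in cur.fetchall():
            print(key, copies)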

    Clemens Meyer zu Rheda

    06/26/2023, 2:16 PM
    Hi Everyone! I am using the Postgres Destination Connector and have trouble setting it up with SSH tunnelling. The data replication works just fine but in the normalisation stage the sync fails with a connection error. Has anybody any hints? I'm using Airbyte Cloud and the Postgres Destination Version 0.3.27.
    Database Error
      connection to server at "localhost" (127.0.0.1), port 50001 failed: Connection refused
      	Is the server running on that host and accepting TCP/IP connections?
      connection to server at "localhost" (::1), port 50001 failed: Cannot assign requested address
      	Is the server running on that host and accepting TCP/IP connections?
      ,retryable=<null>,timestamp=1687788354840,additionalProperties={}], io.airbyte.config.FailureReason@f1b006b[failureOrigin=normalization,failureType=system_error,internalMessage=connection to server at "localhost" (127.0.0.1), port 50001 failed: Connection refused,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@2c2dfde3[additionalProperties={attemptNumber=2, jobId=2838848, from_trace_message=true}],stacktrace=AirbyteDbtError:
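    The error above means nothing was listening on the forwarded port (50001) when normalization tried to connect. A minimal probe, useful from any machine that should be able to reach the tunnel endpoint; on Cloud it mainly illustrates what 'Connection refused' is telling you:

    import socket

    # Probe both addresses the error message mentions.
    for host in ("127.0.0.1", "::1"):
        family = socket.AF_INET6 if ":" in host else socket.AF_INET
        s = socket.socket(family, socket.SOCK_STREAM)
        s.settimeout(3)
        try:
            s.connect((host, 50001))
            print(f"{host}:50001 is accepting connections")
        except OSError as exc:
            print(f"{host}:50001 failed: {exc}")
        finally:
            s.close()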

    akash sathish kumar

    06/26/2023, 2:41 PM
    Hi Team, I am using Airbyte in my Kubernetes cluster. Since our company does not allow internet access in the cluster, the check_connection API fails while adding new connections and we are unable to add them. I believe every time we add a connection, Airbyte runs a check using the following URL: https://connectors.airbyte.com/. Is there any workaround for this?

    CobbleWeb

    06/26/2023, 4:53 PM
    Hi Everyone! I am trying to connect my local instance of Airbyte to a Pipedrive as a source using the Pipedrive connector that is currently in beta. When trying to connect I get the following error: '*ERROR* i.a.c.i.LineGobbler(voidCall):149 - docker: Error response from daemon: cannot share the host's network namespace when user namespaces are enabled.' I have even tried turning off namespacing temporarily on my linux machine using the following: sudo sysctl kernel.unprivileged_userns_clone=0 and then restarted my docker instance and also restarted Airbyte, but I am still getting the same error. Has anyone faced this issue before or have any idea how to fix this? Any help would be much appreciated. If needed I can provide the full Stack log. Thanks!

    Madison Mae

    06/26/2023, 5:37 PM
    I had to denormalize the data in my sync, and it said it was written to this stream name; however, that table is empty. If data is de-normalized, is it sent to the airbyte raw version of the stream name?
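    With normalization off, records land in a raw table conventionally named _airbyte_raw_<stream>, with the payload in an _airbyte_data column. A minimal sketch to confirm the data arrived, assuming a Postgres-style destination and psycopg2; connection details and the stream name are placeholders:

    import psycopg2  # pip install psycopg2-binary

    conn = psycopg2.connect(host="pg.example.com", dbname="analytics",
                            user="airbyte", password="***")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM public._airbyte_raw_my_stream")
        print("raw rows:", cur.fetchone()[0])
        cur.execute("SELECT _airbyte_data FROM public._airbyte_raw_my_stream LIMIT 3")
        for (payload,) in cur.fetchall():
            print(payload)  # JSON blob of one synced record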

    Thiago Villani

    06/26/2023, 6:25 PM
    Hello, I have a connection to an MSSQL Server source, and it is not generating data in the CDC table. It was working but then stopped, and now Airbyte always shows 0 bytes in the sync.
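    Before suspecting the connector, it is worth confirming SQL Server is still capturing changes at all. A minimal sketch, assuming pyodbc; the DSN details and table name are placeholders. Note that CDC capture jobs stop producing rows when the SQL Server Agent is not running:

    import pyodbc  # pip install pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=mssql.example.com;"
        "DATABASE=mydb;UID=airbyte;PWD=***;TrustServerCertificate=yes"
    )
    cur = conn.cursor()
    cur.execute("SELECT is_cdc_enabled FROM sys.databases WHERE name = DB_NAME()")
    print("database CDC enabled:", cur.fetchone()[0])
    cur.execute("SELECT name, is_tracked_by_cdc FROM sys.tables WHERE name = 'my_table'")
    print(cur.fetchone())  # ('my_table', 1) means the table is still tracked
    cur.close()
    conn.close()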

    Vivek PG

    06/26/2023, 7:22 PM
    Hi, I'm trying to sync data from ~14k repositories using the GitHub connector, but I'm facing an issue while saving the source when I configure all repositories (ORG/*); presumably it cannot complete the check operation and times out. So I configured only one repo (ORG/sample-repo) so that the check passes, then configured the destination and created a connection. After that I went into the Airbyte configuration database and changed it back to all repos (ORG/*). But when running the sync job, a GitHub check operation is executed in some of the jobs, and whenever the check runs, the sync job fails. Below is the error:
    2023-06-24 23:15:19 INFO i.a.w.p.DockerProcessFactory(create):192 - Preparing command: docker run --rm --init -i -w /data/220/1 --log-driver none --name source-github-check-220-1-isvii --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-github:1.0.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=1 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.44.12 -e WORKER_JOB_ID=220 airbyte/source-github:1.0.1 check --config source_config.json
    2023-06-24 23:16:09 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$6):231 - Running sync worker cancellation...
    2023-06-24 23:16:09 WARN i.a.w.g.DefaultCheckConnectionWorker(run):108 - Check connection job subprocess finished with exit code 143
    2023-06-24 23:16:09 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$6):235 - Interrupting worker thread...
    2023-06-24 23:16:09 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$6):238 - Cancelling completable future...
    2023-06-24 23:16:09 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):125 - Unexpected error while checking connection:
    java.io.IOException: Stream closed
    We executed the check operation locally and it takes ~12 minutes to complete for all repos, so we raised the timeouts in Airbyte to 15 minutes (ACTIVITY_MAX_TIMEOUT_SECOND=900, ACTIVITY_CHECK_TIMEOUT=15) and restarted the sync, but the check operation didn't get invoked. I can't work out why the GitHub check connection operation is executed in some sync jobs and not in others. What conditions trigger the check operation when running a sync? Is there any solution for this issue?
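    To see how long the check realistically needs, you can time a bare listing of the org's repositories, which is roughly the work ORG/* forces the check to do. A minimal sketch, assuming the requests library; the token and org name are placeholders:

    import time
    import requests  # pip install requests

    token, org = "<GITHUB_TOKEN>", "ORG"  # placeholders
    start, count = time.time(), 0
    url = f"https://api.github.com/orgs/{org}/repos?per_page=100"
    while url:
        resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
        resp.raise_for_status()
        count += len(resp.json())
        url = resp.links.get("next", {}).get("url")  # follow Link-header pagination
    print(f"listed {count} repos in {time.time() - start:.0f}s")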

    Octavia Squidington III

    06/26/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, join us on Zoom!

    Carlos Salido

    06/26/2023, 9:17 PM
    Hello people, I'm new and just getting started with Airbyte. I'm having issues connecting a MySQL source and get this error. I am aware it is an error on the MySQL server side; I followed the documentation with no luck. Any advice?

    Joey Taleño

    06/27/2023, 6:15 AM
    Hi Team, How do you connect a PostgresDB in an AWS VPC to Airbyte Cloud? Thanks in advance! 🙏

    Elad Rabinovich

    06/27/2023, 7:21 AM
    Hi Everyone! We are trying to sync 2 TB of data (full + CDC) from MySQL to Redshift. We are running Airbyte on an EC2 m5.xlarge (4 vCPU, 16GB). The performance is relatively low: we are ingesting around 5GB in 10 minutes, which by rough calculation means around 68 hours for 2TB. Our goal is to cut the full-refresh time at least in half (or more). We can see that the MySQL connector is fetching 5,000 rows per fetch (attached screenshot 1). We are trying to figure out whether this is configurable or a derivative of the memory allocated to the worker container. We have tried increasing both the CPU and memory requests & limits (attached screenshot 2), but the 5,000-row fetch stays as is. We came across the following: https://discuss.airbyte.io/t/mysql-source-connector-performance/1092 https://docs.airbyte.com/operator-guides/scaling-airbyte#memory But we still couldn't figure out whether this is a limitation of the source connector or something else (configuration). Any advice will be much appreciated. Thanks, Elad Rabinovich

    Joey Taleño

    06/27/2023, 7:45 AM
    Hi Team, we are trying to connect to Zoom; however, it seems the JWT Token type was recently deprecated... Please help. Thanks!
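    Zoom retired JWT apps in favor of Server-to-Server OAuth. A minimal sketch of the replacement token flow, assuming the requests library; the account ID, client ID, and secret come from a Server-to-Server OAuth app in the Zoom Marketplace (whether a given connector version already supports this scheme is a separate question):

    import requests  # pip install requests

    resp = requests.post(
        "https://zoom.us/oauth/token",
        params={"grant_type": "account_credentials",
                "account_id": "<ACCOUNT_ID>"},     # placeholder
        auth=("<CLIENT_ID>", "<CLIENT_SECRET>"),   # placeholders
    )
    resp.raise_for_status()
    print(resp.json()["access_token"])  # short-lived bearer token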