# workers-discussions
  • HardAtWork

    04/18/2023, 12:43 PM
    You would have to store the Server Socket somewhere, maybe in a Durable Object?
  • Skye

    04/18/2023, 1:16 PM
    Yeah a DO would have to be what you use there
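    (A minimal sketch of the idea above, assuming the Workers WebSocket API: a Durable Object that accepts a WebSocket and keeps the server end in instance memory. The class and message names are illustrative.)

    ```javascript
    // Illustrative sketch: a Durable Object that holds a server WebSocket in
    // memory. Sockets can't be written to durable storage, so the socket only
    // survives as long as this object instance stays alive in memory.
    // (Export this class from your Worker module and bind it in wrangler.toml.)
    class SocketHolder {
      constructor(state, env) {
        this.socket = null; // server end of the accepted WebSocket, if any
      }

      async fetch(request) {
        if (request.headers.get("Upgrade") === "websocket") {
          const pair = new WebSocketPair();
          const [client, server] = Object.values(pair);
          server.accept();
          this.socket = server; // stash the server end for later requests
          return new Response(null, { status: 101, webSocket: client });
        }
        // Later requests routed to this same object can reuse the stored socket.
        if (this.socket) this.socket.send("message from a later request");
        return new Response("ok");
      }
    }
    ```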
  • bret_pat

    04/18/2023, 1:20 PM
    Bounded workers may be evicted after waiting for 30 seconds. Is there a way to know if/when you're about to be evicted?
  • kian

    04/18/2023, 1:25 PM
    Nope
  • sathoro

    04/18/2023, 1:29 PM
    ```
    TypeError: fetch failed
        at Object.processResponse (C:\---\node_modules\undici\lib\fetch\index.js:199:23)
        at C:\---\node_modules\undici\lib\fetch\index.js:928:38
        at node:internal/process/task_queues:140:7
        at AsyncResource.runInAsyncScope (node:async_hooks:204:9)
        at AsyncResource.runMicrotask (node:internal/process/task_queues:137:8)
    ```
    this is happening when running wrangler with --local when it tries to respond to a request after doing some background stuff for a couple minutes. is this expected? I can't really tell where the code is throwing
  • sathoro

    04/18/2023, 1:32 PM
    actually I'm thinking this is likely due to a fetch timeout. we are requesting something that can take a few minutes to respond
  • sathoro

    04/18/2023, 1:32 PM
    but I thought fetch didn't have a default timeout
  • sathoro

    04/18/2023, 1:32 PM
    does wrangler have some sort of fetch timeout?
  • sathoro

    04/18/2023, 1:35 PM
    I think it is timing out at 120 seconds. any thoughts?
  • kian

    04/18/2023, 1:37 PM
    That's the default NodeJS timeout
  • sathoro

    04/18/2023, 1:37 PM
    oh! I didn't realize there was one built-in
  • sathoro

    04/18/2023, 1:38 PM
    do you know how to increase it?
  • kian

    04/18/2023, 1:38 PM
    Nothing user-facing that I'm aware of, that's set within Undici (https://github.com/nodejs/undici/issues/1373#issuecomment-1110260144)
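    (Since that limit isn't user-configurable, a workaround sketch: make your own deadline explicit with the standard `AbortSignal.timeout`, so a slow upstream fails predictably with a clear error instead of an opaque "fetch failed". This wraps the built-in fetch; it does not change undici's internal timeout.)

    ```javascript
    // Sketch: an explicit, self-chosen deadline on fetch. If the request runs
    // past `ms` milliseconds, the signal aborts it and we rethrow with a
    // message that names the URL and the deadline.
    async function fetchWithDeadline(url, ms, options = {}) {
      try {
        return await fetch(url, { ...options, signal: AbortSignal.timeout(ms) });
      } catch (err) {
        if (err.name === "TimeoutError" || err.name === "AbortError") {
          throw new Error(`request to ${url} exceeded ${ms} ms`);
        }
        throw err; // some other failure; pass it through unchanged
      }
    }
    ```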
  • kian

    04/18/2023, 1:39 PM
    Post in #891052295410835476 as that's what --local uses, they can take a look at the issue & fix it if needed
  • sathoro

    04/18/2023, 1:40 PM
    do production workers use different values? I could live with it for now if it will work on a real worker
  • kian

    04/18/2023, 1:41 PM
    Cloudflare uses 100 seconds
  • kian

    04/18/2023, 1:41 PM
    You definitely don't want a connection held open for that long
  • sathoro

    04/18/2023, 1:42 PM
    hmm why not? the alternative is using server sent events which isn't necessarily more reliable I think
  • sathoro

    04/18/2023, 1:42 PM
    we are calling the OpenAI API
  • sathoro

    04/18/2023, 1:42 PM
    GPT-4 + token heavy responses = long response time
  • sathoro

    04/18/2023, 1:44 PM
    I suppose I'll just switch this code to using the SSE version. there's just extra overhead, so I avoid it when we don't need to stream the response back. this is returning JSON so we gotta parse the full response anyways
  • sathoro

    04/18/2023, 2:53 PM
    alright switched it to using server sent events and it is working now... the connection is still open for the same amount of time 😛
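    (A sketch of the streaming read that makes this work: consuming a server-sent-events body incrementally, so bytes keep flowing on the connection while the upstream generates. The event framing here is generic SSE, not OpenAI's exact payload shape.)

    ```javascript
    // Sketch: read an SSE response body chunk by chunk, buffering partial
    // events and invoking a callback with each complete `data:` payload.
    // Events are separated by a blank line ("\n\n") per the SSE format.
    async function readSSE(response, onEvent) {
      const decoder = new TextDecoder();
      let buffer = "";
      for await (const chunk of response.body) {
        buffer += decoder.decode(chunk, { stream: true });
        let idx;
        while ((idx = buffer.indexOf("\n\n")) !== -1) {
          const rawEvent = buffer.slice(0, idx);
          buffer = buffer.slice(idx + 2);
          for (const line of rawEvent.split("\n")) {
            if (line.startsWith("data: ")) onEvent(line.slice(6));
          }
        }
      }
    }
    ```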
  • dave

    04/18/2023, 5:58 PM
    @Erisa | Support Engineer can you paste the PoC you have written so I can modify it to trigger the bug? 🙂
  • Erisa | Support Engineer

    04/18/2023, 5:59 PM
    Uh the one in the ticket?
  • dave

    04/18/2023, 5:59 PM
    ya.
  • dave

    04/18/2023, 5:59 PM
    also I thought Ms. Erisa was funny
  • Erisa | Support Engineer

    04/18/2023, 5:59 PM
    you should be able to copy it out of there
  • dave

    04/18/2023, 6:00 PM
    fudge
  • dave

    04/18/2023, 6:00 PM
    I missed it
  • dave

    04/18/2023, 6:32 PM
    I blame using an Apple Watch again for reading emails