# workers-discussions
  • t

    Tom Sherman

    05/03/2023, 1:47 PM
    a rust+serde worker might be faster...
  • b

    bret_pat

    05/03/2023, 1:47 PM
    Right, forgot to mention that i'm using `lossless-json` to parse since I need to not lose accuracy when dealing with small numbers or really big ones. i.e. convert a big number to a string instead of losing accuracy.
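
The precision problem bret_pat is working around can be sketched in a few lines. This is *not* the `lossless-json` API — the helper below is a hand-rolled illustration of the same idea (keep big integers as strings instead of letting them round to doubles), and it assumes numeric literals never appear inside string values:

```typescript
// JSON.parse stores numbers as IEEE-754 doubles, so integers above
// 2^53 silently lose precision:
const raw = '{"id": 9007199254740993, "qty": 2}';
const lossy = JSON.parse(raw);
console.log(lossy.id); // 9007199254740992 -- off by one

// Crude workaround in the spirit of lossless-json: quote any integer
// literal of 16+ digits before parsing, so it survives as a string.
// (Assumes digit runs like this never occur inside string values.)
function parsePreservingBigInts(text: string): unknown {
  const quoted = text.replace(/(?<=[:\[,]\s*)(-?\d{16,})(?=\s*[,}\]])/g, '"$1"');
  return JSON.parse(quoted);
}

const safe = parsePreservingBigInts(raw) as { id: string; qty: number };
console.log(safe.id); // "9007199254740993" -- exact, as a string
```

A real library handles nested strings, decimals, and round-tripping; this only shows why the string representation matters.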
  • t

    Tom Sherman

    05/03/2023, 1:47 PM
    ah 😅
  • t

    Tom Sherman

    05/03/2023, 1:48 PM
    rust+serde will definitely be faster then haha
  • b

    bret_pat

    05/03/2023, 1:48 PM
    Can you use rust on workers? Wouldn't you need to convert it to wasm which has a host of other issues?
  • t

    Tom Sherman

    05/03/2023, 1:48 PM
    you have to build to wasm, yes
  • t

    Tom Sherman

    05/03/2023, 1:49 PM
    https://github.com/cloudflare/workers-rs
  • k

    kian

    05/03/2023, 1:53 PM
    🦀
  • b

    bret_pat

    05/03/2023, 1:57 PM
    Have you built workers with rust? Do you have an opinion on the state of affairs? i.e. is it production ready assuming simple use cases
  • b

    bret_pat

    05/03/2023, 1:58 PM
    Oh there is a separate channel!
  • k

    kian

    05/03/2023, 1:58 PM
    It's plenty usable, but there'll be rough edges here or there
  • k

    kian

    05/03/2023, 1:59 PM
    Lots of the Rust ecosystem has WASM support, I can't say I have run into any crates I wanted to use that didn't work or didn't have a viable alternative
  • k

    kian

    05/03/2023, 1:59 PM
    Observability is so-so
  • t

    Tom Sherman

    05/03/2023, 1:59 PM
    hardest part is probably finding an off the shelf rust library to parse big numbers
  • t

    Tom Sherman

    05/03/2023, 2:00 PM
    dunno if you can plug that into serde
  • k

    kian

    05/03/2023, 2:00 PM
    `num-bigint` seems to have a `serde` feature
  • k

    kian

    05/03/2023, 2:01 PM
    Last publish was Nov. 2021 so there might be a newer alternative, but this one could also just be feature complete
  • c

    comagoosie

    05/03/2023, 2:01 PM
    afaik serde has arbitrary precision as a builtin feature: https://github.com/serde-rs/json/blob/master/Cargo.toml#L78
  • s

    sathoro

    05/03/2023, 2:14 PM
    our Workers last much longer than 30 seconds when streaming responses, but it seems like when immediately returning a response and trying to do a long running task in `waitUntil` they are getting killed after ~30 seconds. I didn't realize this would happen - is this expected and is there a workaround?
  • s

    sathoro

    05/03/2023, 2:14 PM
    unfortunately this doesn't happen when running locally, so I didn't realize this limitation until testing in staging
  • k

    kian

    05/03/2023, 2:14 PM
    it's a bug with waitUntil
  • k

    kian

    05/03/2023, 2:15 PM
    it should be 30 seconds after the response has finished streaming but it's currently 30 seconds after the response headers are sent
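
The pattern being discussed looks roughly like this. The `ExecutionContext` interface here is hand-typed for the sketch (in a real Worker it comes from the runtime); `ctx.waitUntil` is the real API, and per kian's note, the background work is currently cut off ~30 seconds after the response *headers* are sent rather than after the body finishes:

```typescript
// Minimal Worker-style handler typed by hand for illustration.
interface ExecutionContext {
  waitUntil(promise: Promise<unknown>): void;
}

async function slowTask(): Promise<string> {
  // stand-in for the long-running job
  return "done";
}

const worker = {
  async fetch(_req: Request, _env: unknown, ctx: ExecutionContext): Promise<Response> {
    // Respond immediately; ask the runtime to keep the Worker alive
    // for the background task (subject to the time limit above).
    ctx.waitUntil(slowTask());
    return new Response("accepted", { status: 202 });
  },
};
```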
  • s

    sathoro

    05/03/2023, 2:15 PM
    ahh shoot
  • s

    sathoro

    05/03/2023, 2:15 PM
    neither will work for our use case because this particular response we are not streaming, just immediately returning a 200 and kicking off a long running task
  • s

    sathoro

    05/03/2023, 2:16 PM
    😦
  • k

    kian

    05/03/2023, 2:16 PM
    Sounds like Queues might be a better option?
  • s

    sathoro

    05/03/2023, 2:17 PM
    yeah that was what I was looking at, will probably switch to it
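
Switching to Queues might look roughly like this: the producer Worker enqueues the job and returns immediately, and a consumer handler processes it later with its own time budget. The binding name `JOB_QUEUE` and all the interfaces below are hand-written assumptions for the sketch, not the platform's actual type definitions:

```typescript
// Hand-typed stand-ins for the Queues binding shapes.
interface Queue<T> {
  send(message: T): Promise<void>;
}
interface Message<T> {
  body: T;
  ack(): void;
}
interface MessageBatch<T> {
  messages: Message<T>[];
}

type Job = { userId: string };

interface Env {
  JOB_QUEUE: Queue<Job>; // assumed binding name
}

const worker = {
  // Producer: enqueue the job and return right away.
  async fetch(_req: Request, env: Env): Promise<Response> {
    await env.JOB_QUEUE.send({ userId: "u123" });
    return new Response("queued", { status: 202 });
  },

  // Consumer: invoked by the platform with a batch of messages.
  async queue(batch: MessageBatch<Job>, _env: Env): Promise<void> {
    for (const msg of batch.messages) {
      // ...long-running work for msg.body goes here...
      msg.ack();
    }
  },
};
```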
  • s

    sathoro

    05/03/2023, 2:18 PM
    I've gotten bitten by these small inconsistencies several times now, a couple times they even made it to production 😦 for example, if you don't `await` a DO request in a Worker before returning a response, locally it still goes through, but when pushed live the request doesn't go through
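
The Durable Object pitfall above can be sketched as follows. The `DurableObjectStub` interface is hand-written for the sketch, and the URL is a placeholder; the point is only that the subrequest must be awaited (or handed to `ctx.waitUntil`) before the Worker returns its Response:

```typescript
// Hand-typed stand-in for a Durable Object stub.
interface DurableObjectStub {
  fetch(req: Request): Promise<Response>;
}

async function handle(stub: DurableObjectStub): Promise<Response> {
  // BAD (appears to work locally, can be dropped in production):
  // stub.fetch(new Request("https://do/record"));

  // GOOD: await the subrequest so it completes before we return.
  await stub.fetch(new Request("https://do/record"));
  return new Response("ok");
}
```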
  • s

    sathoro

    05/03/2023, 2:20 PM
    > To send and receive messages, you must use a Worker. @kian can I send to queue from DO directly?
  • a

    anthony.potappel

    05/03/2023, 2:21 PM
    I read on this forum that workers-ai is no longer a thing. But I am wondering, now that WebGPU support in the browser is enabled, is there work being done on supporting it in Cloudflare Workers (given it's also V8-based)? Hopefully someone can shine a light on this. If it's not coming anytime soon I need to figure out an alternate solution.