# durable-objects
  • Unsmart | Tech debt

    01/01/2023, 4:29 AM
    And on the free plan it definitely is pretty limited in customization of how it's defined
  • p0

    01/01/2023, 4:29 AM
    would I even need one tho?
  • sks

    01/01/2023, 4:30 AM
    Yes. But you would still have to run the Worker: the requests will still come through and the Worker will still execute, even if only to check that the account's balance has expired, so it would still cost you. And if someone is spamming the API at, say, 100 req/s, then your Worker and all of that limit-checking code still runs for every request, so you'll be billed for all of those requests.
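The cost point above can be made concrete. Below is a minimal sketch (plain JavaScript, with a hypothetical `makeRateLimiter` helper and an in-memory counter; a real Worker would need a Durable Object or an external store so the count is consistent across isolates) showing that the limit check itself executes on every request, including the rejected ones, and that execution is what gets billed:

```javascript
// Hypothetical fixed-window rate limiter. The key point: this function
// body runs for EVERY incoming request, whether or not the request is
// ultimately allowed, so every request still costs a Worker invocation.
function makeRateLimiter(limit, windowMs) {
  let windowStart = 0; // start of the current window (ms)
  let count = 0;       // requests seen in the current window

  return function allow(nowMs) {
    if (nowMs - windowStart >= windowMs) {
      windowStart = nowMs; // window expired: start a new one
      count = 0;
    }
    count += 1;            // executes even for rejected requests...
    return count <= limit; // ...which is exactly what gets billed
  };
}
```

For example, with `makeRateLimiter(2, 10000)`, the first two calls in a window return `true` and every further call returns `false` until 10 seconds have passed, yet all of them ran the code above.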
  • Unsmart | Tech debt

    01/01/2023, 4:30 AM
    Adding one wouldn't hurt
  • Deleted User

    01/01/2023, 4:30 AM
    Just to prevent spam burst requests
  • Deleted User

    01/01/2023, 4:30 AM
    i have mine set at like 100 per 10 seconds (or maybe 10 i forgot)
  • p0

    01/01/2023, 4:30 AM
    I see
  • p0

    01/01/2023, 4:31 AM
    that's fine, I don't need user-level rate limit customization.
  • Unsmart | Tech debt

    01/01/2023, 4:31 AM
    $1.40/million requests seems really nice considering my work uses AWS where that's the minimum basically
  • sks

    01/01/2023, 4:31 AM
    Is your API highly dynamic?
  • p0

    01/01/2023, 4:32 AM
    wdym?
  • sks

    01/01/2023, 4:33 AM
    Since you said your API just fetches data from another website and presents it in a nice format, what I'm asking is: is that content very dynamic? Could it change drastically per request, or is there an interval after which the content updates? Because if that's the case, you can use a cache and reduce requests a lot.
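The caching idea can be sketched in a few lines. This is a plain in-memory TTL cache for illustration only (hypothetical `makeTtlCache` helper); inside a Worker you would more likely use the Cache API or an edge cache TTL on the upstream `fetch`, but the logic is the same: if the upstream only changes on an interval, serve repeats from the cache for that interval.

```javascript
// Hypothetical TTL cache: entries expire ttlMs after being set, so
// repeat requests within the interval skip the upstream fetch entirely.
function makeTtlCache(ttlMs) {
  const store = new Map(); // key -> { value, expiresAt }
  return {
    get(key, nowMs) {
      const entry = store.get(key);
      if (!entry || nowMs >= entry.expiresAt) return undefined; // miss or expired
      return entry.value;
    },
    set(key, value, nowMs) {
      store.set(key, { value, expiresAt: nowMs + ttlMs });
    },
  };
}
```

As p0 notes below, this only helps when users actually repeat the same request within the TTL.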
  • p0

    01/01/2023, 4:35 AM
    not likely to change too often - I'm already using cache, but it's unlikely to reduce requests too much because it's not very often that a user will repeat the same request
  • sks

    01/01/2023, 4:37 AM
    Is the content per customer? Or can the same content be served to multiple customers?
  • p0

    01/01/2023, 4:38 AM
    multiple customers
  • sks

    01/01/2023, 4:47 AM
    Would the customers use the API directly from the frontend, or from their own servers? Or does it depend on the customer and is basically unpredictable?
  • p0

    01/01/2023, 4:51 AM
    from their own servers
  • sks

    01/01/2023, 4:52 AM
    So, it would never be frontend?
  • p0

    01/01/2023, 4:52 AM
    never
  • p0

    01/01/2023, 4:52 AM
    yep
  • sks

    01/01/2023, 4:53 AM
    Okay,
  • sks

    01/01/2023, 4:56 AM
    Then @p0 I believe you would benefit a lot from an external service in your Worker such as PlanetScale. $1 per billion reads is quite cheap, and if you design your SQL efficiently, you can get it down to one read per execution of your Worker. That way you don't have to use either KV or DO. And PlanetScale is atomic by nature, so you don't have to worry about that.
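One way to read "one read per execution" is a single SELECT that returns everything the Worker needs to authorize the request. A hypothetical sketch below; the query string, table (`api_keys`), and columns are all invented for illustration, and the decision logic is split into a pure function so it can be tested without a database:

```javascript
// Hypothetical single-read authorization query: one row answers
// "is this key valid and does it still have quota?" in one read.
const AUTH_QUERY = `
  SELECT k.customer_id, k.active, k.quota_remaining
  FROM api_keys AS k
  WHERE k.key_hash = ?
  LIMIT 1`;

// Pure decision logic applied to the (at most one) row returned.
function authorize(row) {
  if (!row) return { ok: false, reason: "unknown key" };
  if (!row.active) return { ok: false, reason: "key disabled" };
  if (row.quota_remaining <= 0) return { ok: false, reason: "quota exhausted" };
  return { ok: true };
}
```

Keeping the query to one indexed-lookup row is what makes the per-read pricing work out to roughly one read per Worker execution.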
  • p0

    01/01/2023, 4:57 AM
    I'll look into that
  • p0

    01/01/2023, 4:57 AM
    thanks for the help
  • sks

    01/01/2023, 4:57 AM
    You're welcome.
  • sks

    01/01/2023, 4:59 AM
    Check it out here. https://planetscale.com/.
  • p0

    01/01/2023, 4:59 AM
    alr
  • p0

    01/01/2023, 5:03 AM
    but wouldn't it still have to write to increase the value?
  • sks

    01/01/2023, 5:05 AM
    Yes, but writes are priced comparably. And once you have hit the limit, you are just doing reads thereafter, so no more writes. Something like this:

        x <- read
        if x > 0 then
            do work
            write (decrement)
        end if
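sks's read-then-maybe-write pattern, as a JavaScript sketch against a hypothetical store exposing `read()`/`write()` (e.g. a row in a PlanetScale table). Note the asymmetry the pricing argument relies on: while quota remains, each request costs one read and one write; once the quota hits zero, every further request is a single read and no write happens at all.

```javascript
// Hypothetical quota check: one read per execution, and a write only
// while the caller is still under the limit.
async function handle(store, doWork) {
  const remaining = await store.read();   // 1 read per execution
  if (remaining > 0) {
    const result = await doWork();
    await store.write(remaining - 1);     // write only while under limit
    return result;
  }
  return "rate limited";                  // read-only path after the limit
}
```

(Production code would also need to guard the read-decrement-write against concurrent requests, e.g. with a transaction or an atomic `UPDATE ... SET quota = quota - 1`.)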
  • p0

    01/01/2023, 5:05 AM
    writes are $1.50 per million