# r2
  • p

    Plotzes

    04/22/2023, 1:46 PM
    rclone is set up correctly, i can do
    rclone tree r2:mc-players
    just fine
  • m

    Mattèo

    04/22/2023, 3:57 PM
    Thanks for your reply, but I still have the issue with the AllowedOrigins set to ["http://localhost:3000"]
  • m

    Mattèo

    04/22/2023, 4:12 PM
    Resolved my issue: the authorization step was missing. But I don't understand why a CORS HTTP 400 error is returned.
  • c

    chientrm

    04/23/2023, 7:03 AM
    How do I quickly clone an R2 bucket? I want to test migrations on R2, so I need to back up the whole bucket.
  • h

    HardAtWork

    04/23/2023, 7:33 AM
    rclone
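If you'd rather script the clone than use rclone (something like rclone sync r2:source-bucket r2:backup-bucket also works between two buckets on a configured remote), here is a rough boto3 sketch of the same idea; the endpoint, credential placeholders, and bucket names are illustrative, not values from this thread:

```python
import boto3

# Placeholder R2 S3-API endpoint and credentials.
s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

# Walk the source bucket and server-side copy each object into the backup
# bucket, so nothing has to be downloaded locally.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="source-bucket"):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket="backup-bucket",
            Key=obj["Key"],
            CopySource={"Bucket": "source-bucket", "Key": obj["Key"]},
        )
```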
  • i

    itsme

    04/23/2023, 5:20 PM
    Access to XMLHttpRequest at 'xxxxxxxxxx' from origin 'https://domain.com' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. What should I do? I have set the CORS config to [ { "AllowedOrigins": [ "https://domain.com" ], "AllowedMethods": [ "GET", "PUT", "POST" ] } ]
  • s

    Sid | R2

    04/23/2023, 5:47 PM
    Try getting rid of the trailing / in your origin (I’ll see if I can get the API to do this automatically). Make sure you also include any extra headers you may be sending in an AllowedHeaders key.
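For anyone following along, a minimal sketch of what that configuration might look like when applied over the S3 API with boto3; the origin, the Content-Type header, and the endpoint/credential placeholders are assumptions for illustration:

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

s3.put_bucket_cors(
    Bucket="my-bucket",
    CORSConfiguration={
        "CORSRules": [
            {
                "AllowedOrigins": ["https://domain.com"],  # no trailing slash
                "AllowedMethods": ["GET", "PUT", "POST"],
                # Any extra header the browser sends (e.g. Content-Type) must
                # be listed here, or the preflight check will fail.
                "AllowedHeaders": ["Content-Type"],
                "MaxAgeSeconds": 3600,
            }
        ]
    },
)
```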
  • i

    itsme

    04/23/2023, 6:15 PM
    I typed it wrong here; there is no trailing slash in my original setup. Actually, I have a Django backend and I'm able to generate the presigned URL, but the error comes when I upload the file.
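The linked thread isn't shown here, but a hedged sketch of the usual boto3 presigned-PUT flow this Django setup implies; bucket, key, and content type are placeholders. The relevant detail is that whatever headers the browser sends with the PUT (such as Content-Type) also have to appear in the bucket's AllowedHeaders:

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

# Sign a PUT for a specific key and content type; the browser must upload
# with the same Content-Type header the URL was signed for.
upload_url = s3.generate_presigned_url(
    "put_object",
    Params={
        "Bucket": "my-bucket",
        "Key": "uploads/example.png",
        "ContentType": "image/png",
    },
    ExpiresIn=3600,
)
# The frontend then does: PUT upload_url with header Content-Type: image/png.
```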
  • e

    emd

    04/23/2023, 7:49 PM
    hey there folks, I'm trying to use Livewire's tmp upload feature and getting a 403 CORS error on the OPTIONS call. I do have a CORS policy set on my public bucket. Is this supported right now in R2?
  • e

    emd

    04/23/2023, 7:50 PM
    what information do you need to help me debug this?
  • e

    emd

    04/23/2023, 7:51 PM
  • e

    emd

    04/23/2023, 7:52 PM
    I believe that is a pre-signed url
  • b

    biscuit

    04/23/2023, 7:56 PM
    Hello, what would be the best way to upload 140,000 different 120-byte JSON files to an R2 bucket? Is rclone the best choice, or should I use FUSE, or is there some Cloudflare API secret sauce?
  • h

    HardAtWork

    04/23/2023, 8:06 PM
    If you have them locally, then rclone would probably be the best option
  • h

    HardAtWork

    04/23/2023, 8:07 PM
    You can crank up the concurrency to run more files at once
  • b

    biscuit

    04/23/2023, 8:08 PM
    I also have the full 13 MB JSON file, but it seemed infeasible/wasteful to split them into their respective files with Workers
  • b

    biscuit

    04/23/2023, 8:08 PM
    after uploading to r2*
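A rough sketch of one way to do that split from the combined file and upload the pieces concurrently with boto3, outside of Workers; the file name, key layout, and thread count are assumptions. Raising the worker count plays the same role as raising rclone's transfer concurrency:

```python
import json
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

# Assumed shape: the combined file is a JSON object mapping an id to each
# ~120-byte record.
with open("combined.json") as f:
    records = json.load(f)

def upload(item):
    key, record = item
    s3.put_object(
        Bucket="my-bucket",
        Key=f"records/{key}.json",
        Body=json.dumps(record).encode("utf-8"),
        ContentType="application/json",
    )

# Many small PUTs in flight at once; tune max_workers to taste.
with ThreadPoolExecutor(max_workers=32) as pool:
    list(pool.map(upload, records.items()))
```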
  • 👋 I know folders are not a thing but
    v

    vvo

    04/24/2023, 5:52 AM
    👋 I know folders are not a thing, but still, what would be the best way to delete by prefix? Use case: we will use folders to store "per customer" data, and ideally, we'd like to delete all files from a prefix quickly.
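There is no single "delete prefix" call in the S3 API, so the usual pattern is to page through ListObjectsV2 with the prefix and batch the keys into DeleteObjects requests. A minimal boto3 sketch with placeholder bucket, prefix, and credentials:

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

# Each ListObjectsV2 page holds at most 1,000 keys, which is also the
# per-request limit for DeleteObjects, so one delete per page works.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-bucket", Prefix="customers/1234/"):
    keys = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
    if keys:
        s3.delete_objects(Bucket="my-bucket", Delete={"Objects": keys})
```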
  • i

    itsme

    04/24/2023, 2:53 PM
    hello Sid, please reply in the CORS thread. I really need your help.
  • d

    dAn

    04/24/2023, 4:28 PM
    hi there. I've created an IAM policy with the policy provided in the R2 migration docs, assigned it to a user, and given it some credentials. I put the credentials into the R2 migration wizard and it's saying that access to the bucket was denied. I have tried using these credentials to list objects using the AWS CLI, and they work fine. The bucket name definitely matches up between the IAM policy and the R2 wizard. Have I done something wrong here?
  • j

    jeromes

    04/24/2023, 5:00 PM
    @dAn no, this setup looks fine based on what I have access to. Please do:
    * check the S3 credentials (Access Key ID, Secret Access Key) you're passing in the Migration UI; I'm sure you've done it a couple of times, but maybe it could be that; the S3 Access Key ID typically starts with `AKIA`; check that there are no leading or trailing spaces if you're copy/pasting them
    * same checks for the bucket name
    If that does not work, try different credentials for the same user. If that still does not work, something not obvious is malfunctioning and we need engineering to investigate, so please let us know.
  • d

    dAn

    04/24/2023, 6:10 PM
    - double-checked the creds, 100% correct, no whitespace (checked what is being passed to the API in devtools and it's all fine)
    - bucket name is 100% correct, copy-pasted it from the AWS UI, no whitespace
    - new creds for the same user, still not working; these new creds do work in the AWS CLI
    - new user, new policy granting every S3 permission, still doesn't work (I know this is bad practice, just testing it)
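For completeness, this mirrors the AWS CLI check already described, as a boto3 snippet that exercises the exact credential strings being pasted into the wizard; the bucket name, region, and key values are placeholders:

```python
import boto3

# Use the same Access Key ID / Secret Access Key as in the migration UI.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="<SECRET_ACCESS_KEY>",
    region_name="us-east-1",  # region of the source bucket
)

# Both calls should succeed if the credentials and bucket name are right.
s3.head_bucket(Bucket="source-bucket")
resp = s3.list_objects_v2(Bucket="source-bucket", MaxKeys=1)
print(resp.get("KeyCount", 0))
```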
  • Multipart uploads
    s

    Siclude

    04/24/2023, 8:41 PM
    Hi, I did a search but was not able to find a proper answer. What is the proper programmatic way to upload large (multi-gig) files? I need large video calls uploaded to R2 for AI purposes. I am guessing that if I use Workers, they will time out (right?)
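For uploads driven from a server rather than a Worker, the usual answer for multi-gigabyte files is a multipart upload. A sketch using boto3's managed transfer, which switches to multipart automatically above a size threshold; the thresholds, file names, and endpoint are illustrative assumptions:

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MiB
    multipart_chunksize=64 * 1024 * 1024,  # upload in 64 MiB parts
    max_concurrency=8,                     # parts uploaded in parallel
)

s3.upload_file(
    "call-recording.mp4",        # local file
    "my-bucket",                 # destination bucket
    "calls/call-recording.mp4",  # destination key
    Config=config,
)
```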
  • z

    zendev

    04/25/2023, 1:43 AM
    Hi, this might be a basic question but I've been struggling to understand it. I want to use R2 to store audio and image files uploaded by users on my app. I thought I could then get the URL of the object and store it in a MySQL database so that it's easily accessible throughout the app via simple calls to my own database. The signed URLs expire, so that's obviously not what I want to store, but whenever I even try to visit the S3 URL provided in the Cloudflare R2 dashboard I get something like this: InvalidArgument: Invalid Argument: Authorization. Can anyone explain what's happening here, and if/how I can achieve what I'm looking for? Thanks in advance!
  • k

    kian

    04/25/2023, 1:44 AM
    The S3 URL requires auth, either at request time or by signing the URL ahead of time.
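As a small illustration of the "signing the URL ahead of time" option, a boto3 sketch that produces a time-limited GET URL; the key and expiry are placeholders. For objects that should stay readable permanently, the custom-domain route in the next message is the better fit:

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",
)

# Time-limited read URL for one object; expires after an hour.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket", "Key": "audio/clip.mp3"},
    ExpiresIn=3600,
)
print(url)
```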
  • k

    kian

    04/25/2023, 1:44 AM
    If you want a public read-only bucket, add a custom domain
  • z

    zendev

    04/25/2023, 2:39 AM
    Ok, that’s what I was thinking. I've just seen a lot of warnings about public buckets, but if it's read-only it shouldn't be a security concern, right?
  • c

    Chris M

    04/25/2023, 3:33 AM
    I haven't been able to find an answer, but is it possible to get versions of an R2 object? I don't see it listed in either the supported or not supported docs. Any help is greatly appreciated.
  • k

    Karew

    04/25/2023, 3:45 AM
    There is no versioning on R2 right now
  • c

    Chris M

    04/25/2023, 3:51 AM
    Thanks for the confirmation!