# r2
  •

    iway1

    05/05/2023, 3:55 AM
    on the bright side my CORS settings are correct it seems
  •

    thangnm

    05/05/2023, 4:03 PM
    can anyone help me answer this: https://discord.com/channels/595317990191398933/1104075421047336970
  •

    DanCodes

    05/05/2023, 5:13 PM
    Oh I'm having the same issue today lmao, except with a presigned PutObject
  •

    Tony!

    05/05/2023, 5:24 PM
    I want to congratulate the @User team for shipping the bare minimum S3 compatibility. Making their R2 service almost unusable.
  •

    Walshy | Pages

    05/05/2023, 5:25 PM
    None of us are on the R2 team :p
  •

    Walshy | Pages

    05/05/2023, 5:25 PM
    but seriously, that isn't how you report issues... tagging employees being rude won't solve any problems.
  •

    James

    05/05/2023, 5:27 PM
    If you have any useful and productive feedback, the R2 team actively follow this chat and would love to hear about your issues 🙂
  •

    DanCodes

    05/05/2023, 5:27 PM
    ^ just to confirm, I am using a PUT request on the frontend
  •

    DanCodes

    05/05/2023, 5:36 PM
    It seems that the Access-Control-Allow-Origin header isn’t being sent back, resulting in the issue; my CORS is set up correctly though
  •

    Sid | R2

    05/05/2023, 8:44 PM
    Are you actually using a PUT? POST uploads aren’t supported at the moment (but should be soon). The easiest way to debug CORS issues is to inspect your outgoing request (the presigned PUT in your case):
    1. Check if the Origin header of your outgoing PUT request matches one of the CORS origins you’ve set up.
    2. Another common source of problems is extra headers. If you see any extra headers in your outgoing requests, you’ll want to add them to your CORS policy as well. A quick way to debug is to allow “*” in AllowedHeaders and see if that fixes things. If it does, try replacing the wildcard with the explicit headers you’re sending. I wouldn’t recommend leaving the wildcard in there permanently.
  •

    Sid | R2

    05/05/2023, 8:48 PM
    Sorry, by “extra headers”, I mean additional headers you’re signing
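The two checks above map directly onto the bucket's CORS rules. A sketch of the rule shape, as it could be passed to boto3's `put_bucket_cors` against an R2 endpoint; the origin and header values here are placeholders, not taken from this thread:

```python
# Placeholder origin/headers; the shape matches the S3 CORS API that R2 implements.
cors_configuration = {
    "CORSRules": [
        {
            # Must match the Origin header of the browser's outgoing request.
            "AllowedOrigins": ["https://app.example.com"],
            # Presigned PUT uploads need PUT allowed.
            "AllowedMethods": ["PUT"],
            # Every extra header you sign must be listed here; "*" is fine
            # for debugging but shouldn't be left in place permanently.
            "AllowedHeaders": ["content-type"],
        }
    ]
}
# Applied with e.g.:
# s3.put_bucket_cors(Bucket="my-bucket", CORSConfiguration=cors_configuration)
```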
  •

    Haukur

    05/05/2023, 9:55 PM
    I've been trying to figure out if there's a way to create API tokens that are only valid for a specific bucket in my account, but in the API tokens page it only seems to allow you to select "Workers R2 Storage" and "Edit/Read", without any further granularity. Is that accurate?
  •

    James

    05/05/2023, 9:58 PM
    That is currently accurate, yes. Dane mentioned on Twitter that bucket scoped tokens would be coming this quarter, so hopefully we'll hear more about them soon.
  • Cannot copy file from source path to target path on R2 using PHP AWS SDK

    thangnm

    05/06/2023, 1:52 AM
    As the title says, I’m using the PHP AWS SDK package to connect to CF R2. The connection is successful, and I’ve tried uploading and deleting a file, both of which worked fine. However, when I try to copy a file from one location to another on R2, it doesn’t work, and there are no errors or logs available. I’m using Laravel, so below is my code. filesystem.php:
    'disks' => [
        's3' => [
            'driver' => 's3',
            'key' => env('R2_ACCESS_KEY_ID'),
            'secret' => env('R2_SECRET_ACCESS_KEY'),
            'region' => env('R2_DEFAULT_REGION'),
            'bucket' => env('R2_BUCKET'),
            'url' => env('R2_URL'),
            'endpoint' => env('R2_URL'),
        ],
    ],
    Code that tries to copy an object:
    try {
        $s3 = Storage::disk('s3');
        $client = $s3->getDriver()->getAdapter();
        $client->getClient()->getCommand('CopyObject', [
            'Bucket' => 'thangnm',
            'Key'    => 'tmp/2023/5/5/c48df6b206a58d0360c55f3932260fd6.png',
            'CopySource' => 'p/2251799814000360/c48df6b206a58d0360c55f3932260fd6.png'
        ]);
    } catch (Throwable $ex) {
        var_dump($ex->getMessage());
    }
    What could be causing this issue, and if someone has experienced this before, could you please guide me on how to resolve it? Thank you very much.
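A note on the snippet above: in the AWS SDK for PHP, getCommand() only builds a command object; nothing is sent until that command is passed to execute() (or the operation is invoked directly, e.g. $client->copyObject(...), or via Laravel's Storage::disk('s3')->copy($from, $to)), which would explain a silent no-op with no errors or logs. CopyObject also requires CopySource to identify the source bucket as well as the key. A minimal sketch of the parameter shape, shown in Python for illustration (boto3's copy_object accepts this same shape; the bucket/key names are taken from the question, with 'thangnm' assumed to be both source and destination bucket):

```python
def build_copy_params(src_bucket: str, src_key: str,
                      dst_bucket: str, dst_key: str) -> dict:
    """Parameters for an S3-style CopyObject call.

    CopySource must identify the *source bucket* as well as the source key;
    a bare key path is not enough.
    """
    return {
        "Bucket": dst_bucket,  # destination bucket
        "Key": dst_key,        # destination key
        "CopySource": {"Bucket": src_bucket, "Key": src_key},
    }

# Names from the question above:
params = build_copy_params(
    "thangnm",
    "p/2251799814000360/c48df6b206a58d0360c55f3932260fd6.png",
    "thangnm",
    "tmp/2023/5/5/c48df6b206a58d0360c55f3932260fd6.png",
)
# The call that actually sends the request would be e.g.:
# s3_client.copy_object(**params)
```

Building the parameters separately makes the shape easy to check before the request is actually sent.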
  •

    Jadugar_Jaggu

    05/06/2023, 5:03 PM
    Hello guys, I was confused about billing. If I never exceed 500GB of storage usage in a month, and I perform deletes and uploads while keeping total storage under the 500GB limit, what will the monthly cost be?
  •

    Jadugar_Jaggu

    05/06/2023, 5:03 PM
    500GB-month?
  •

    Jadugar_Jaggu

    05/06/2023, 5:04 PM
    Like, I will perform many upload and delete operations, but overall storage usage will stay under 500GB
  •

    Unsmart | Tech debt

    05/06/2023, 5:22 PM
    It uses the peak storage usage every day and averages that over the month
  •

    Unsmart | Tech debt

    05/06/2023, 5:23 PM
  •

    raghavTinker

    05/06/2023, 7:23 PM
    Please help out here. I have moved a lot of our data from S3 to R2. Everything is working: in Django I made a custom storage class, and some models are using it. There is one small issue though: when I try to get the file size, I get a 400 HeadObject error. This happens on the endpoint URL and not on the custom domain. How do I force botocore to use the custom domain? Everywhere else it uses the custom domain as it is... so why here?
    class R2Storage(S3Boto3Storage):
        ------
        endpoint_url = settings.R2_ENDPOINT_URL
        ------
  •

    Jadugar_Jaggu

    05/06/2023, 7:30 PM
    yes, I am saying if that peak never crosses 500 GB, then it means I pay for 500GB-month or below, right?
  •

    Unsmart | Tech debt

    05/06/2023, 7:30 PM
    yes
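The rule described above (each day's peak storage usage, averaged over the month) can be sketched numerically; the $0.015 per GB-month rate below is an illustrative assumption, not quoted in this thread, so check current R2 pricing:

```python
def billed_gb_months(daily_peaks_gb, days_in_month=30):
    """Storage billing sketch: average of each day's peak usage (in GB)."""
    return sum(daily_peaks_gb) / days_in_month

# Heavy upload/delete churn doesn't matter as long as the daily peak stays
# at or under 500 GB: the month bills at most 500 GB-months.
peaks = [500.0] * 30
storage_bill = billed_gb_months(peaks) * 0.015  # assumed $0.015/GB-month -> $7.50
```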
  •

    Jadugar_Jaggu

    05/06/2023, 7:33 PM
    Ok, thanks for the help. I have one more issue that is not related to R2 but to Cloudflare IPs: recently many of my Indian users reported that in their country Cloudflare-protected sites are taking too long to load and sometimes not working. What do you think about it?
  •

    Unsmart | Tech debt

    05/06/2023, 7:34 PM
    I think India sometimes bans Cloudflare IPs; that's not really anything anyone but the government can fix
  •

    Jadugar_Jaggu

    05/06/2023, 7:34 PM
    yea it can be possible
  •

    tricked

    05/06/2023, 9:29 PM
    is it normal for it to take 1.6 seconds to upload a 1.2MB file to R2? this is my code: https://github.com/ascellahost/ascellav3/blob/master/src/api.ts#L126-L252. I posted this before in https://discord.com/channels/595317990191398933/846453104382836766/1103035968296075394 but got no response, so I'm asking here too
  •

    kian

    05/06/2023, 9:40 PM
    Have you benchmarked just the R2 code?
  •

    kian

    05/06/2023, 9:40 PM
    It looks like there's a DB and KV involved here which will definitely add overhead
  •

    tricked

    05/06/2023, 9:42 PM
    yeah, small files still take 700ms
  •

    tricked

    05/06/2023, 9:46 PM
    the 2 DB queries are probably ~200ms each; not sure about the KV one though
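Following kian's suggestion of benchmarking just the R2 code: timing each stage separately shows where the 1.6s actually goes. A minimal sketch with hypothetical stand-in stage functions (replace them with the real DB/KV/R2 calls):

```python
import time
from contextlib import contextmanager

timings_ms: dict[str, float] = {}

@contextmanager
def timed(label: str):
    """Record the wall-clock duration of a stage in milliseconds."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings_ms[label] = (time.perf_counter() - start) * 1000

# Hypothetical stand-ins for the real calls in the upload handler.
def db_query():
    time.sleep(0.01)

def kv_read():
    time.sleep(0.01)

def r2_put():
    time.sleep(0.01)

with timed("db"):
    db_query()
with timed("kv"):
    kv_read()
with timed("r2_put"):
    r2_put()

slowest = max(timings_ms, key=timings_ms.get)
```

Printing `timings_ms` after one request makes it obvious whether the DB queries, the KV read, or the R2 put dominates, instead of guessing from the total.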