# r2
  • t

    thomasg

    03/12/2023, 1:31 AM
    Thanks James, I'm a bit new to all this (I've used other clouds), but it seems like wrangler dev/miniflare are more for Workers development. Can I emulate an R2 bucket without a Worker entry point? I'm trying to simulate an R2 bucket to develop a SvelteKit app that accesses it (using an S3 client in my SvelteKit app calling a local server/local address). That might not exist, but it's what I'm looking for.
  • t

    thomasg

    03/12/2023, 1:33 AM
    If that doesn't exist, I might just wait for r2 to be enabled on my account and develop against the remote r2
  • t

    thomasg

    03/12/2023, 1:34 AM
    I guess I might be able to use a local S3 emulator instead (since the API should be compatible)?
  • t

    thomasg

    03/12/2023, 2:06 AM
    Working now. Having a status in the dashboard while the payment is being validated would be nice to have!
  • i

    ian

    03/12/2023, 2:26 AM
    I'm trying to upload a 150GB file to R2 using awscli like `aws s3 cp --endpoint-url=... localfile.txt s3://bucket/`. It gets a few gigabytes into the upload but then fails with:
    ```
    An error occurred (InternalError) when calling the UploadPart operation (reached max retries: 4): We encountered an internal error. Please try again.
    ```
    This happens every time I run the upload. Any ideas for how to resolve it?
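For uploads this large, one thing worth double-checking is the multipart part-size arithmetic: R2 (like S3) caps a multipart upload at 10,000 parts, and awscli's default `multipart_chunksize` is 8 MiB, so a 150 GiB object would naively need 19,200 parts. Recent awscli/boto3 versions raise the chunk size automatically, so the retries here may just be transient errors, but pinning a larger chunk size (`aws configure set default.s3.multipart_chunksize 64MB`) is a cheap experiment. A quick sanity check of the arithmetic (the 10,000-part cap and 5 MiB minimum are the documented S3-style limits):

```python
import math

MAX_PARTS = 10_000       # S3-style cap on parts per multipart upload
MIN_PART = 5 * 1024**2   # 5 MiB minimum for every part except the last

def min_chunk_size(total_bytes: int) -> int:
    """Smallest part size (bytes) that fits the object in MAX_PARTS parts."""
    return max(MIN_PART, math.ceil(total_bytes / MAX_PARTS))

size_150gb = 150 * 1024**3
chunk = min_chunk_size(size_150gb)
print(chunk // 1024**2, "MiB minimum chunk size")  # ~15 MiB for 150 GiB
print(math.ceil(size_150gb / (8 * 1024**2)), "parts at awscli's 8 MiB default")  # 19200
```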
  • s

    skyadmin

    03/12/2023, 4:51 AM
    I'm hitting the same error. How did you resolve it?
  • s

    Sid | R2

    03/12/2023, 10:53 AM
    It looks like there’s a significant lag (about 30 mins) between buying R2 and it getting activated this weekend. If you just try again in a bit, things should work!
  • m

    minester16

    03/12/2023, 11:54 AM
    Hi, I'm having problems PUT'ing and GET'ing content to R2 from the Wrangler CLI. Commands are below. I uploaded a test file with this command through wrangler:
    ```
    npx wrangler r2 object put r2-test/100MB.bin --file=100MB.bin
    ```
    I got this response:
    ```
    Creating object "100MB.bin" in bucket "r2-test".
    Upload complete.
    ```
    Then I tried to download the file:
    ```
    npx wrangler r2 object get "r2-test/100MB.bin"
    ```
    and got this error:
    ```
    Downloading "100MB.bin" from "r2-test".

    X [ERROR] Failed to fetch /accounts/$JOEDOE/r2/buckets/r2-test/objects/100MB.bin - 404: Not Found
    ```
    I can see the file in the Cloudflare dashboard, by the way, and am able to download it there. What am I doing wrong? Do I need to set permissions even to reach it from wrangler?
  • m

    mattw

    03/12/2023, 9:42 PM
    Hi, I got the following error:
    ```
    completeMultipartUpload: Your proposed upload is smaller than the minimum allowed object size.
    ```
    What is the minimum size for a multipart upload? Is this in the documentation? (I can't seem to find anything.)
  • t

    TetraEtc

    03/12/2023, 9:59 PM
    Howdy all, is there a convenient supported way of connecting to R2 via FTP? Something like AWS Transfer?
  • s

    Sid | R2

    03/12/2023, 10:12 PM
    Is your file less than 5MB in size?
  • s

    Sid | R2

    03/12/2023, 10:14 PM
    Hm R2 doesn’t have an FTP endpoint yet. Do you specifically need FTP? Have you checked out rclone / UI tools like Cyberduck? These will work over the S3 API
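For anyone following along, a minimal rclone remote for R2 looks roughly like this (the account ID and keys are placeholders; the `Cloudflare` provider value requires a reasonably recent rclone):

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY_ID
secret_access_key = YOUR_SECRET_ACCESS_KEY
endpoint = https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
```

After that, something like `rclone copy localdir r2:my-bucket` works over the S3 API.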
  • m

    mattw

    03/12/2023, 10:16 PM
    yes
  • m

    mattw

    03/12/2023, 10:16 PM
    5MB is the minimum?
  • s

    Sid | R2

    03/12/2023, 10:17 PM
    Yeah. For files that small you should be perfectly fine with PutObject as well, no? Using multipart uploads costs three times as many operations.
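Sid's "three times as many operations" can be made concrete: a single-part upload is one write operation, while a multipart upload costs at least CreateMultipartUpload, plus one UploadPart per part, plus CompleteMultipartUpload. A small sketch of that accounting (the 5 MiB minimum-part rule is the documented constraint; retries are ignored):

```python
MIN_PART = 5 * 1024**2  # every part except the last must be >= 5 MiB

def multipart_ops(total_bytes: int, part_size: int = MIN_PART) -> int:
    """Write operations for a multipart upload: create + N parts + complete."""
    parts = max(1, -(-total_bytes // part_size))  # ceiling division
    return 1 + parts + 1

def put_object_ops(total_bytes: int) -> int:
    """A plain PutObject is a single operation regardless of size."""
    return 1

small = 4 * 1024**2  # a 4 MiB file
print(put_object_ops(small))  # 1
print(multipart_ops(small))   # 3 -- thrice as many, as noted above
```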
  • j

    Joselito

    03/13/2023, 12:41 AM
    Hello guys! Any guidance on how to use R2 with the Android S3 transfer utility?
  • t

    TetraEtc

    03/13/2023, 1:30 AM
    At this point, yes... using a tool that can only push to a URL or an (S)FTP(S) server.
  • m

    mattw

    03/13/2023, 3:07 AM
    Yeah, it was a situation where it needed to also support multipart upload across requests, but I ended up handling it differently for small files. Thanks for the help 🙏
  • s

    strxkeskit

    03/13/2023, 7:48 AM
    Is there a way I can trigger a Worker whenever a new file is added to R2? (Seems like no, from previous conversations.)
  • h

    HardAtWork

    03/13/2023, 7:53 AM
    Not unless you run all uploads through a Worker first
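At the time of this thread R2 had no event notifications, so the pattern is: clients upload to a Worker route, the Worker writes to the bucket binding, and then it does whatever "on new file" work is needed itself. Real Workers are JavaScript, but the control flow is simple enough to sketch with a stubbed bucket (everything here, including `StubBucket` and `on_new_object`, is illustrative, not a real API):

```python
class StubBucket:
    """Stand-in for an R2 bucket binding (the real binding is JS: env.MY_BUCKET.put)."""
    def __init__(self):
        self.objects: dict[str, bytes] = {}

    def put(self, key: str, body: bytes) -> None:
        self.objects[key] = body

def handle_upload(bucket, key: str, body: bytes, on_new_object) -> str:
    # Because the Worker is the only write path, it can fire the
    # "new object" event itself right after the put succeeds.
    bucket.put(key, body)
    on_new_object(key, len(body))
    return f"stored {key} ({len(body)} bytes)"

events = []
bucket = StubBucket()
msg = handle_upload(bucket, "avatars/a.png", b"\x89PNG...",
                    lambda key, size: events.append((key, size)))
print(msg)     # stored avatars/a.png (7 bytes)
print(events)  # [('avatars/a.png', 7)]
```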
  • a

    Angelo Bestetti

    03/14/2023, 11:21 AM
    Hi community, I was migrating my buckets from S3 to R2 and was having problems with the AWS SDK for .NET; it was not working at all, so I'm sharing my discoveries with the community. Here's the "correct" code:
    ```
    public static async Task ListBuckets()
    {
        try
        {
            ListBucketsRequest request = new ListBucketsRequest();
            ListBucketsResponse result = await s3Client.ListBucketsAsync();
        }
        catch (Exception e)
        {
            Console.WriteLine("Unknown encountered on server. Message:'{0}' when writing an object", e.Message);
        }
    }
    ```
    It turns out `await` throws an error; well, in my case it just stopped VS2022. After a few tests, `await` did not work at all. Here's the final code:
    ```
    public static async Task ListBuckets()
    {
        try
        {
            ListBucketsRequest request = new ListBucketsRequest();
            ListBucketsResponse result = s3Client.ListBucketsAsync().Result;
        }
        catch (Exception e)
        {
            Console.WriteLine("Unknown encountered on server. Message:'{0}' when writing an object", e.Message);
        }
    }
    ```
    So do NOT use `await`; use `.Result` at the end. Now all the functions in the AWS SDK for .NET work fine.
  • s

    Sid | R2

    03/14/2023, 12:07 PM
    Thanks for the PSA! I know exactly 0 .NET, but do the examples on https://developers.cloudflare.com/r2/examples/aws-sdk-net/ not work for you then (since they use await)?
  • e

    Eli

    03/14/2023, 8:18 PM
    Hello! I was wondering if someone knew how I could enforce the file type and size with pre-signed URLs? I'm currently able to generate the URL using the 'aws4fetch' package as shown in this example from the docs: https://developers.cloudflare.com/r2/data-access/s3-api/presigned-urls/ My question is: do I need to modify/add some URL search params or a header where this is generated? The use case is uploading profile pictures to an R2 bucket; I'd like to prevent uploads of anything other than png/jpg <= 5MB. Any help would be much appreciated!
    ```
    const r2 = new AwsClient({
      accessKeyId: config.r2.accessKey,
      secretAccessKey: config.r2.secretKey
    })

    const url =
      `https://${config.r2.bucketName}.${config.r2.accountId}.r2.cloudflarestorage.com` +
      '/profile/image-name.png' +
      '?' + 'X-Amz-Expires=3600' // expires in 1 hour

    const signed = await r2.sign(
      new Request(url, {
        method: 'PUT',
        headers: {
          'Content-Type': fileType
        }
      }),
      {
        aws: { signQuery: true }
      }
    )

    return signed.url
    ```
  • k

    kian

    03/14/2023, 8:37 PM
    You can’t enforce a size min or max, only an exact size
  • e

    Eli

    03/14/2023, 8:39 PM
    This is ok because I can send the size to the worker, but having trouble understanding where to configure that here
  • a

    albert

    03/14/2023, 8:45 PM
    You would enforce size by setting `Content-Length: <size>` in the signed request.
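The detail that matters here is that a header is only enforced if it is covered by the signature: in the SigV4 query flow, that means it must appear in `X-Amz-SignedHeaders`. A URL signed with only `host` constrains nothing else, so the store will accept any body. This is a schematic Python sketch of the query assembly only (not a real SigV4 signer; the signature value is fake, and you should check your signing library's options for including extra headers in a query signature):

```python
from urllib.parse import urlencode

def presign_query(signed_headers: list[str], expires: int = 3600) -> str:
    """Schematic presigned-URL query string. Only headers listed in
    X-Amz-SignedHeaders are covered by the signature, so only those
    can be enforced by the storage backend on the actual PUT."""
    params = {
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": ";".join(sorted(signed_headers)),
        "X-Amz-Signature": "<computed-by-real-signer>",
    }
    return urlencode(params)

# Only 'host' signed: any Content-Length / Content-Type is accepted.
print(presign_query(["host"]))
# Sign the length and type too, and the PUT must match them exactly.
print(presign_query(["host", "content-length", "content-type"]))
```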
  • e

    Eli

    03/14/2023, 8:56 PM
    I've tried a few different variations of this; not sure if it belongs in headers or URL search params:
    ```
    const url =
      `https://${config.r2.bucketName}.${config.r2.accountId}.r2.cloudflarestorage.com` +
      '/profile/' +
      `${nanoid(6)}.${fileType.split('/')[1]}` +
      '?' + `X-Amz-Expires=3600&Content-Length=${fileSize}&Content-Type=${fileType}`

    const signed = await r2.sign(
      new Request(url, {
        method: 'PUT',
        headers: {
          'Content-Type': fileType,
          'Content-Length': `${fileSize}`
        }
      }),
      {
        aws: { signQuery: true }
      }
    )
    ```
    After creating the URL I'm able to upload anything I want, of any size, using this and get a 200 OK:
    ```
    curl -X PUT \
      'https://someurl.r2.cloudflarestorage.com/profile/GBEoCP.png?X-Amz-Date=20230314T205007Z&X-Amz-Expires=86400&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=837367ca5b70597148acc8e7808fb6af%2F20230314%2Fauto%2Fs3%2Faws4_request&X-Amz-SignedHeaders=host&X-Amz-Signature=4c290f05240b76e1d1a9134eecc8464f55747660b9ff18a93c353469a51670ae' \
      --header 'Content-Type: image/png' \
      --data-binary '@/Users/eli/Downloads/16.4.zip'
    ```
  • b

    Brame

    03/15/2023, 2:36 AM
    Hi. I’ve created a bucket and added a domain, but I still don’t have public access. The domain states “Status : Active | Access to Bucket : Allowed” However “Public URL Access : Not Allowed” Am I missing something obvious?!
  • a

    Angelo Bestetti

    03/15/2023, 2:40 AM
    Hi Sid, exactly. All the examples use 'await', and for some reason it does not work at all.
  • b

    Brame

    03/15/2023, 2:41 AM
    Interestingly, it springs to life if I enable r2.dev, but breaks again as soon as I disable that