# r2
  • d

    dav1d

    04/02/2023, 2:06 PM
    see if that is faster
  • x

    xing

    04/02/2023, 10:38 PM
    Hello, I have a question about R2 object checksums. When doing a put or completing a multipart upload, if a checksum is supplied, does R2 verify the checksum (as opposed to simply recording it as metadata)?
  • i

    ickerio

    04/03/2023, 11:22 AM
    Genius UI design to maximise user retention
  • m

    Mike - Dev

    04/03/2023, 1:22 PM
    Hi, I need help getting access to the files stored on my R2. When accessing the bucket through the subdomain I get this:
  • m

    Mike - Dev

    04/03/2023, 1:22 PM
    And when accessing it through the URL given by R2 I get this:
  • m

    Mike - Dev

    04/03/2023, 1:22 PM
    I am developing in Unity (C#) and I need my files to be accessible through that
  • m

    Mike - Dev

    04/03/2023, 1:22 PM
    Can anyone help me?
  • m

    Mike - Dev

    04/03/2023, 1:23 PM
    Public URL Access: Allowed
  • z

    zegevlier

    04/03/2023, 2:43 PM
    You need to load a specific file, not just the subdomain itself. If you have a file called myimage.png in your bucket, you can access it at https://[your r2 subdomain].r2.dev/myimage.png.
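
Once the bucket's public access is enabled, the object behaves like any static HTTP resource, so Unity can pull it with UnityWebRequest. A minimal sketch (Unity 2020.2+; the URL and object key below are placeholders, not the actual bucket from this thread):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class R2Downloader : MonoBehaviour
{
    // Placeholder public URL; substitute your own r2.dev subdomain (or custom domain) and object key.
    private const string FileUrl = "https://pub-xxxxxxxx.r2.dev/myimage.png";

    IEnumerator Start()
    {
        using (var request = UnityWebRequestTexture.GetTexture(FileUrl))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError($"Download failed: {request.error}");
            }
            else
            {
                // The response body is the object itself, same as any static file host.
                Texture2D texture = DownloadHandlerTexture.GetContent(request);
                Debug.Log($"Downloaded {texture.width}x{texture.height} texture");
            }
        }
    }
}
```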
  • h

    HardAtWork

    04/03/2023, 3:10 PM
    What command are you running?
  • s

    Sid | R2

    04/03/2023, 3:18 PM
    It's got nothing to do with retention, we have no incentive to keep people on the dashboard. If anything, people spending time on the UI is more frustrating for them. The reason you cannot "empty-and-delete" a bucket is because deleting (potentially) millions of objects cannot be made instantaneous. This creates a bunch of non-trivial technical/product/billing problems. Instead of releasing something janky and unpredictable, we decided to let users control their objects (for now). There's a reason why even S3 has this limitation.
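
As a stopgap for the missing empty-and-delete button, the same cleanup can be scripted client-side over the S3 API. A rough sketch with the AWS SDK for .NET, assuming the bucket name and configured client are supplied by the caller and that batch deletes cap at 1,000 keys per request:

```csharp
using System.Linq;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static class BucketCleanup
{
    // Page through the bucket and delete objects in batches of up to 1,000 keys,
    // then delete the now-empty bucket. "bucket" is a placeholder name.
    public static async Task EmptyAndDeleteAsync(IAmazonS3 s3, string bucket)
    {
        var listRequest = new ListObjectsV2Request { BucketName = bucket };
        ListObjectsV2Response page;
        do
        {
            page = await s3.ListObjectsV2Async(listRequest);
            if (page.S3Objects.Count > 0)
            {
                await s3.DeleteObjectsAsync(new DeleteObjectsRequest
                {
                    BucketName = bucket,
                    Objects = page.S3Objects.Select(o => new KeyVersion { Key = o.Key }).ToList()
                });
            }
            listRequest.ContinuationToken = page.NextContinuationToken;
        } while (page.IsTruncated);

        await s3.DeleteBucketAsync(bucket);
    }
}
```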
  • s

    Sid | R2

    04/03/2023, 3:19 PM
    You won't be able to validate the checksum of the entire multipart upload when completing, but you should be able to verify checksums of individual parts as you upload them. Would that work for you?
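
A sketch of the per-part verification Sid suggests, using the low-level multipart calls in the AWS SDK for .NET and a Content-MD5 per part. Bucket, key, and part size are placeholders, and whether R2 actually rejects a part whose MD5Digest doesn't match should be confirmed against the R2 docs:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static class VerifiedMultipartUpload
{
    // Upload a file in parts, sending a base64 Content-MD5 for each part so the
    // server can check it on arrival. Bucket, key, and part size are placeholders.
    public static async Task UploadAsync(IAmazonS3 s3, string bucket, string key, string path)
    {
        const int partSize = 50 * 1024 * 1024; // 50 MB parts (every part but the last must be >= 5 MB)

        var init = await s3.InitiateMultipartUploadAsync(new InitiateMultipartUploadRequest
        {
            BucketName = bucket,
            Key = key
        });

        using var md5 = MD5.Create();
        using var file = File.OpenRead(path);
        var partETags = new List<PartETag>();
        var buffer = new byte[partSize];
        int partNumber = 1;
        int read;

        while ((read = ReadFully(file, buffer)) > 0)
        {
            // MD5 of just this part, so a corrupted part fails at upload time.
            string partMd5 = Convert.ToBase64String(md5.ComputeHash(buffer, 0, read));

            var response = await s3.UploadPartAsync(new UploadPartRequest
            {
                BucketName = bucket,
                Key = key,
                UploadId = init.UploadId,
                PartNumber = partNumber,
                InputStream = new MemoryStream(buffer, 0, read),
                PartSize = read,
                MD5Digest = partMd5
            });

            partETags.Add(new PartETag(partNumber, response.ETag));
            partNumber++;
        }

        await s3.CompleteMultipartUploadAsync(new CompleteMultipartUploadRequest
        {
            BucketName = bucket,
            Key = key,
            UploadId = init.UploadId,
            PartETags = partETags
        });
    }

    private static int ReadFully(Stream stream, byte[] buffer)
    {
        int total = 0;
        while (total < buffer.Length)
        {
            int n = stream.Read(buffer, total, buffer.Length - total);
            if (n == 0) break;
            total += n;
        }
        return total;
    }
}
```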
  • s

    Sid | R2

    04/03/2023, 3:20 PM
    Looks like you might be using the S3 API's URL (.r2.cloudflarestorage.com) expecting it to behave like a public bucket's URL (.r2.dev or a custom domain)?
  • m

    Mike - Dev

    04/03/2023, 3:20 PM
    Thanks this worked haha
  • t

    Towy

    04/03/2023, 4:48 PM
    I get this when enabling IP filtering: 2023/04/03 18:48:06 Failed to ls: InternalError: We encountered an internal error. Please try again. status code: 500, request id: , host id:
  • t

    Towy

    04/03/2023, 4:48 PM
    Yes my IP is correct, both IPv4 and IPv6
  • c

    Chaika

    04/03/2023, 5:07 PM
    You're the same guy as before. https://discord.com/channels/595317990191398933/940663374377783388/1088608444505923615 Did you ever get a chance to try --bind={your_ipv4} (or disabling IPv6 just to test)?
  • f

    FLIPNEUS

    04/03/2023, 5:13 PM
    I’m still struggling with this issue: https://community.cloudflare.com/t/cannot-delete-r2-custom-domain/421058/3 Would anyone be able to help getting rid of this record?
  • t

    Towy

    04/03/2023, 5:33 PM
    Right yes, sorry I forgot to try your command as I was in bed when I read it.
  • c

    Chaika

    04/03/2023, 5:33 PM
    No worries, just curious if my suspicions are correct and IP filtering only works over IPv4
  • t

    Towy

    04/03/2023, 5:35 PM
    Bind works!
  • t

    Towy

    04/03/2023, 6:11 PM
    Any way to make it default behaviour in the config file?
  • c

    Chaika

    04/03/2023, 6:16 PM
    Not that I know of. If it truly does not support IPv6 IP filtering over the S3 API, that's a bug that should be fixed imo. I haven't done enough testing to be 100% sure that it is, though; it could be something rclone-specific or something weird like that.
  • x

    xing

    04/03/2023, 7:39 PM
    In my case, the client includes the checksum for the entire content in its request to complete the upload. As such, just verifying the parts won't work for me. Does that mean I have to conduct the verification myself? It's a bit more code, not a big deal. But then is there any way to attach the checksum as metadata? (I notice the S3 API supports an x-amz-checksum-algorithm query param; is this supported in R2?)
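
If the whole-object check does end up being client-side work, one possible workaround (not an official R2 checksum feature, and x-amz-checksum-* support would still need confirming) is to hash the file locally and stash the digest in custom metadata so downloaders can re-verify after a GET. A sketch with hypothetical names:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static class ChecksumMetadataUpload
{
    // Hash the whole file locally and store the digest as custom object metadata,
    // which comes back as an "x-amz-meta-sha256" header on later GET/HEAD requests.
    // Bucket, key, and path are placeholders.
    public static async Task PutWithChecksumAsync(IAmazonS3 s3, string bucket, string key, string path)
    {
        string sha256;
        using (var hasher = SHA256.Create())
        using (var stream = File.OpenRead(path))
        {
            sha256 = Convert.ToHexString(hasher.ComputeHash(stream)).ToLowerInvariant();
        }

        var request = new PutObjectRequest
        {
            BucketName = bucket,
            Key = key,
            FilePath = path,
            DisablePayloadSigning = true // commonly needed when pointing the SDK at R2
        };
        request.Metadata.Add("sha256", sha256); // the SDK prefixes this with "x-amz-meta-"

        await s3.PutObjectAsync(request);
    }
}
```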
  • k

    Karbust

    04/03/2023, 8:04 PM
    Does anyone have an update on this? Not having any kind of integrity check is not good... https://community.cloudflare.com/t/does-r2-support-checksum-when-uploading-doesnt-seem-so/488674 I have a few big files (100MB+) that keep getting uploaded incorrectly...
  • k

    Karbust

    04/03/2023, 8:07 PM
    I'm uploading the files like this:
```csharp
var fileTransferUtility = new TransferUtility(s3client);

var uploadRequest = new TransferUtilityUploadRequest
{
    Key = fileKey,
    FilePath = $"{filePath}",
    BucketName = settings.CloudflareR2BucketName,
    DisablePayloadSigning = true,
    ContentType = "application/octet-stream",
    DisableMD5Stream = true,
    Headers =
    {
        ["if-none-match"] = $"\"{md5}\"",
    }
};

Console.WriteLine("\nUploading file {0} ({1:N0} bytes)", file.FileName, new FileInfo($"{filePath}").Length);

using var progress = new ProgressBar();

uploadRequest.UploadProgressEvent +=
    (_, e) => progress.Report(e.TransferredBytes / (double) e.TotalBytes);

await fileTransferUtility.UploadAsync(uploadRequest);
```
    It only happens with files above 100MB; the uploaded MD5 doesn't match the local one...
  • k

    kian

    04/03/2023, 8:09 PM
    Sounds like it's using multipart uploads above 100MB
  • k

    kian

    04/03/2023, 8:10 PM
    https://discord.com/channels/595317990191398933/940663374377783388/971797656810688673
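
For context on why the MD5s stop matching above the multipart threshold: a multipart upload's ETag is conventionally the MD5 of the concatenated binary per-part MD5 digests, suffixed with -&lt;part count&gt;, not the MD5 of the whole file. Assuming R2 follows that convention and you know the part size the uploader used, the expected value can be reproduced locally and compared against the ETag from HeadObject (with quotes stripped):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

public static class MultipartETag
{
    // Reproduce the S3-style multipart ETag: MD5 of the concatenated per-part MD5
    // digests, suffixed with "-<part count>". partSize must match what the uploader
    // actually used; the value passed in here is only a placeholder.
    public static string Compute(string path, int partSize)
    {
        using var md5 = MD5.Create();
        using var stream = File.OpenRead(path);

        var partDigests = new List<byte[]>();
        var buffer = new byte[partSize];
        int read;
        while ((read = ReadFully(stream, buffer)) > 0)
        {
            partDigests.Add(md5.ComputeHash(buffer, 0, read));
        }

        if (partDigests.Count == 1)
        {
            // Single-part uploads keep the plain MD5 of the object as their ETag.
            return Convert.ToHexString(partDigests[0]).ToLowerInvariant();
        }

        byte[] combined = partDigests.SelectMany(d => d).ToArray();
        string hash = Convert.ToHexString(md5.ComputeHash(combined)).ToLowerInvariant();
        return $"{hash}-{partDigests.Count}";
    }

    private static int ReadFully(Stream stream, byte[] buffer)
    {
        int total = 0;
        while (total < buffer.Length)
        {
            int n = stream.Read(buffer, total, buffer.Length - total);
            if (n == 0) break;
            total += n;
        }
        return total;
    }
}
```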
  • k

    Karbust

    04/03/2023, 8:14 PM
    and doesn't cloudflare support multipart uploads?
  • x

    xing

    04/03/2023, 8:15 PM
    It does, but it's not clear how it handles checksums. I have a similar question above.