# orm-help
m
Hello everyone, I have a scenario in a project in which I have to upload a large file to a GCP bucket, but I don't know how to solve it. My backend is on Node.js and my frontend is on React.js. How can I send a file in chunks and upload it to the GCP bucket chunk by chunk, and get a URL for the whole file once every chunk has uploaded successfully? I also want resumable functionality so that if the internet connection is lost or something else goes wrong, the upload process resumes. I've been searching for 3 days but haven't been able to figure out a solution yet, please help.
m
I'm trying to understand it, but the info is so vague.
I can't figure out how to make it resumable and how to upload the file in chunks from the frontend after generating the URL.
n
Glancing through the docs, the process should look like this:
• Initiating a resumable upload session: https://cloud.google.com/storage/docs/performing-resumable-uploads#initiate-session
• Creating multiple chunks and uploading them: https://cloud.google.com/storage/docs/performing-resumable-uploads#chunked-upload
These links are also helpful:
• Checking the status of an upload: https://cloud.google.com/storage/docs/performing-resumable-uploads#status-check
• Resuming an interrupted upload: https://cloud.google.com/storage/docs/performing-resumable-uploads#resume-upload
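A minimal sketch of the backend half of that flow, assuming an Express server and the `@google-cloud/storage` client. The bucket name, the `/uploads/session` route, and the `publicUrl` construction are placeholders, not anything prescribed by the docs:

```ts
// Express endpoint that starts a resumable upload session and returns the
// session URI to the browser; the browser then uploads chunks directly to GCS.
import express from "express";
import { Storage } from "@google-cloud/storage";

const app = express();
app.use(express.json());

const storage = new Storage(); // assumes GOOGLE_APPLICATION_CREDENTIALS is set
const bucket = storage.bucket("my-bucket"); // placeholder bucket name

app.post("/uploads/session", async (req, res) => {
  const { fileName, contentType } = req.body as {
    fileName: string;
    contentType: string;
  };

  // Ask GCS for a resumable session URI; no file data passes through Node.
  const [sessionUri] = await bucket.file(fileName).createResumableUpload({
    origin: req.get("origin"), // lets GCS set CORS headers for the browser
    metadata: { contentType },
  });

  // Once the last chunk is accepted, the object should be reachable here
  // (assuming it is publicly readable; otherwise sign a read URL instead).
  const publicUrl = `https://storage.googleapis.com/${bucket.name}/${encodeURIComponent(fileName)}`;

  res.json({ sessionUri, publicUrl });
});

app.listen(3000);
```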
m
I have seen this, but I want a frontend example: after getting the signed URL from the Node.js backend, what do we have to do on the frontend? That's the example I'm looking for.
n
I wasn't able to find an exact frontend example, but I came across this fairly popular npm library which you might want to have a look at: https://www.npmjs.com/package/gcs-resumable-upload
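For the browser side, here is a hedged sketch of the chunked-upload and resume logic against the session URI returned by a backend like the one above; `uploadResumable`, `getUploadedBytes`, and the 2 MiB chunk size are illustrative choices, not part of any library. Per the GCS docs, every chunk except the last must be a multiple of 256 KiB, and reading the `Range` response header from the browser generally requires the session to have been created with the page's origin so GCS returns the right CORS headers:

```ts
const CHUNK_SIZE = 8 * 256 * 1024; // 2 MiB, a multiple of 256 KiB

async function uploadResumable(file: File, sessionUri: string): Promise<void> {
  // Resume point: ask GCS how many bytes it already has for this session.
  let offset = await getUploadedBytes(sessionUri, file.size);

  while (offset < file.size) {
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    const end = offset + chunk.size - 1;

    const res = await fetch(sessionUri, {
      method: "PUT",
      headers: {
        // Inclusive byte range of this chunk plus the total file size.
        "Content-Range": `bytes ${offset}-${end}/${file.size}`,
      },
      body: chunk,
    });

    if (res.status === 308) {
      // 308 "Resume Incomplete": the Range header says how much GCS stored.
      const persistedEnd = parseRangeEnd(res.headers.get("Range"));
      offset = persistedEnd === null ? offset + chunk.size : persistedEnd + 1;
    } else if (res.ok) {
      return; // 200/201: the whole object is uploaded
    } else {
      throw new Error(`Chunk upload failed: ${res.status}`);
    }
  }
}

// Status check (also how you resume after a dropped connection): an empty PUT
// with "bytes */TOTAL" returns 308 plus a Range header with the stored bytes.
async function getUploadedBytes(sessionUri: string, total: number): Promise<number> {
  const res = await fetch(sessionUri, {
    method: "PUT",
    headers: { "Content-Range": `bytes */${total}` },
  });
  if (res.ok) return total;          // already finished
  if (res.status !== 308) return 0;  // fresh session or nothing persisted yet
  const end = parseRangeEnd(res.headers.get("Range"));
  return end === null ? 0 : end + 1;
}

function parseRangeEnd(range: string | null): number | null {
  const match = range?.match(/bytes=0-(\d+)/);
  return match ? Number(match[1]) : null;
}
```

If the connection drops mid-upload, you would persist the `sessionUri` (e.g. in localStorage) and call `uploadResumable` again with the same file; the initial status check makes it pick up from the last byte GCS confirmed.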
m
@Nurul can we use it on the frontend?