How to Upload a Web-Based Zip File to a GCP Bucket

Including App Engine with Firebase Resumable Uploads

For this article I will break down a few different ways to interact with Google Cloud Storage (GCS). The GCP docs describe the following ways to upload your data: via the UI, via the gsutil CLI tool, or via the JSON API in various languages. I'll go over a few specific use cases and approaches for the ingress options to GCS below.

1. Upload form on Google App Engine (GAE) using the JSON API
Use case: public upload portal (small files)
2. Upload form with Firebase on GAE using the JSON API
Use case: public upload portal (large files), uploads within mobile apps
3. gsutil command line integrated with scripts or schedulers like cron
Use case: backups/archives, integration with scripts, migrations
4. S3/GCP-compatible file management programs such as Cyberduck
Use case: cloud storage management via desktop, migrations
5. Cloud Function (GCF)
Use case: integration, changes in buckets, HTTP requests
6. Cloud console
Use case: cloud storage management via desktop, migrations

1. App Engine nodejs with JSON API for smaller files

You can launch a small nodejs app on GAE for accepting smaller files (up to ~20MB) directly to GCS pretty easily. I started with the nodejs GCS sample for GAE on the GCP github account here.

This is a nice solution for integrating uploads of around 20MB. Just remember that the nginx servers behind GAE have a file upload limit, so if you try to upload something around, say, 50MB, you'll receive an nginx error: ☹️

Nginx file upload error

You can try to raise the file size limit in the js file, but the web servers behind GAE will still enforce a limit for file uploads. So, if you plan to create an upload form on App Engine, be sure to build a file size limit into your UI.
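For illustration, here is a minimal sketch of what such a handler can look like, assuming Express, Multer for multipart parsing, and the current @google-cloud/storage client. The bucket name, form field name, and size cap are placeholders, not the sample's exact code:

const express = require('express');
const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const app = express();
const bucket = new Storage().bucket('my-upload-bucket'); // hypothetical bucket

// Enforce the size cap server-side too; GAE's front ends reject large bodies anyway.
const multer = Multer({
  storage: Multer.memoryStorage(),
  limits: { fileSize: 20 * 1024 * 1024 }, // ~20MB
});

app.post('/upload', multer.single('file'), (req, res, next) => {
  if (!req.file) return res.status(400).send('No file uploaded.');
  const blob = bucket.file(req.file.originalname);
  const stream = blob.createWriteStream({ resumable: false });
  stream.on('error', next);
  stream.on('finish', () => res.status(200).send(`Uploaded ${blob.name}`));
  stream.end(req.file.buffer); // write the in-memory buffer to GCS
});

app.listen(process.env.PORT || 8080);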

Nodejs upload form for small files (I'll likely take this app down at some point).

2. App Engine nodejs Firebase with JSON API and resumable uploads for large files

Since the previous example only works for smaller files, I wondered how we could solve uploading larger files, say 100MB or 1GB. I started with the nodejs app engine storage example here.

After attempting to use resumable uploads in the GCS API with TUS and failing, I enlisted help from my friend Nathan @ www.incline.digital for another approach.

With Nathan's help we integrated resumable uploads with the Firebase SDK. The code can be found here:
https://github.com/mkahn5/gcloud-resumable-uploads

Reference: https://firebase.google.com/docs/storage/web/upload-files

User interaction with the Firebase-powered GAE upload form

While not very elegant, with no status bar or anything fancy, this solution does work for uploading large files from the web. 🙌🏻
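The core of the approach follows the Firebase docs linked above: put() on a storage reference runs a resumable upload and hands back a task you can watch. A minimal sketch using the v8-era Firebase JS SDK; the element ID and upload path are placeholders, and initializeApp() is assumed to have been called with your project's config:

// Assumes the firebase-app and firebase-storage scripts are loaded.
const input = document.querySelector('#file'); // hypothetical file input

input.addEventListener('change', () => {
  const file = input.files[0];
  const ref = firebase.storage().ref().child('uploads/' + file.name);

  // put() uploads in chunks and can resume after interruptions for large files.
  const task = ref.put(file);
  task.on('state_changed',
    (snap) => console.log(`${snap.bytesTransferred} of ${snap.totalBytes} bytes`),
    (err) => console.error('Upload failed:', err),
    () => console.log('Upload complete'));
});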

Resumable file upload form on GAE (I'll likely take this app down at some point).

GCS in the Firebase UI

3. gsutil from local or remote

gsutil makes it easy to copy files to and from cloud storage buckets.

Just make sure you have the Google Cloud SDK on your workstation or remote server (https://cloud.google.com/sdk/downloads), set your project, authenticate, and that's it.
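For example (the project ID here is a placeholder):

gcloud auth login
gcloud config set project your-project-id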

mkahnucf@meanstack-3-vm:~$ gsutil ls
gs://artifacts.testing-31337.appspot.com/
gs://staging.testing-31337.appspot.com/
gs://testing-31337-public/
gs://testing-31337.appspot.com/
gs://us.artifacts.testing-31337.appspot.com/
gs://vm-config.testing-31337.appspot.com/
gs://vm-containers.testing-31337.appspot.com/

mkahnucf@meanstack-3-vm:~/nodejs-docs-samples/appengine/storage$ gsutil cp app.js gs://testing-31337-public
Copying file://app.js [Content-Type=application/javascript]...
/ [1 files][  2.7 KiB/  2.7 KiB]
Operation completed over 1 objects/2.7 KiB.

More details here.

gsutil makes it easy to automate backups of directories, sync changes in directories, back up database dumps, and integrate with apps or schedulers for scripted file uploads to GCS.

Below is the rsync cron I have for my cloud storage bucket and the html files on my blog. This way I have consistency between my GCS bucket and my GCE instances if I decide to upload a file via www or via the GCS UI.

Using gsutil on GCP to back up and sync GCS files

root@mkahncom-instance-group-multizone-kr5q:~# crontab -l
*/2 * * * * gsutil rsync -r /var/www/html gs://mkahnarchive/mkahncombackup
*/2 * * * * gsutil rsync -r gs://mkahnarchive/mkahncombackup /var/www/html

4. Cyberduck (macOS) or any application with an S3 interface

Enjoy an FTP-client-type experience with Cyberduck on macOS for GCS.

Cyberduck has very nice OAuth integration for connecting to the GCS API built into the interface.

After authenticating with OAuth you can browse all of your buckets and upload to them via the Cyberduck app. A nice option to have for moving many directories or folders into multiple buckets.

More info on CyberDuck here.

5. Cloud Function

You can also configure a Google Cloud Function (GCF) to upload files to GCS from a remote or local location. The tutorial below is just for uploading files in a directory to GCS: run the deploy and it zips the files in a local directory and puts the zip into the GCS stage bucket.

Try the tutorial:
https://cloud.google.com/functions/docs/tutorials/storage
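The deployed function itself is roughly the tutorial's sample, sketched here in the 2017-era Node.js background-function style; the log strings line up with the function logs shown further below:

// Triggered by object changes in the bucket passed as --trigger-bucket.
exports.helloGCS = (event, callback) => {
  const file = event.data; // the affected Cloud Storage object

  if (file.resourceState === 'not_exists') {
    console.log(`File ${file.name} deleted.`);
  } else if (file.metageneration === '1') {
    // metageneration is 1 only on first creation of the object.
    console.log(`File ${file.name} uploaded.`);
  } else {
    console.log(`File ${file.name} metadata updated.`);
  }
  callback();
};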

Michaels-iMac:gcf_gcs mkahnimac$ gcloud beta functions deploy helloGCS --stage-bucket mike-kahn-functions --trigger-bucket mikekahn-public-upload
Copying file:///var/folders/kq/5kq2pt090nx3ghp667nwygz80000gn/T/tmp6PXJmJ/fun.zip [Content-Type=application/zip]...
- [1 files][ 634.0 B/ 634.0 B]
Operation completed over 1 objects/634.0 B.
Deploying function (may take a while - up to 2 minutes)...done.
availableMemoryMb: 256
entryPoint: helloGCS
eventTrigger:
  eventType: providers/cloud.storage/eventTypes/object.change
  resource: projects/_/buckets/mikekahn-public-upload
latestOperation: operations/bWlrZS1rYWhuLXBlcnNvbmFsL3VzLWNlbnRyYWwxL2hlbGxvR0NTL1VFNmhlY1RZQV9j
name: projects/mike-kahn-personal/locations/us-central1/functions/helloGCS
serviceAccount: mike-kahn-personal@appspot.gserviceaccount.com
sourceArchiveUrl: gs://mike-kahn-functions/us-central1-helloGCS-wghzlmkeemix.zip
status: READY
timeout: 60s
updateTime: '2017-05-31T03:08:05Z'

You can also use cloud function logs to view bucket activity. Below shows a file uploaded via my public upload form and then deleted via the console UI. This could be handy for pub/sub notifications or for reporting.

Michaels-iMac:gcf_gcs mkahnimac$ gcloud beta functions logs read helloGCS
LEVEL  NAME      EXECUTION_ID     TIME_UTC                 LOG
D      helloGCS  127516914299587  2017-05-31 03:46:19.412  Function execution started
I      helloGCS  127516914299587  2017-05-31 03:46:19.502  File FLIGHTS BANGKOK.xlsx metadata updated.
D      helloGCS  127516914299587  2017-05-31 03:46:19.523  Function execution took 113 ms, finished with status: 'ok'
D      helloGCS  127581619801475  2017-05-31 18:31:00.156  Function execution started
I      helloGCS  127581619801475  2017-05-31 18:31:00.379  File FLIGHTS BANGKOK.xlsx deleted.
D      helloGCS  127581619801475  2017-05-31 18:31:00.478  Function execution took 323 ms, finished with status: 'ok'

Cloud Functions can come in handy for background tasks like regular maintenance driven by events on your GCP infrastructure or by activity in HTTP applications. Check out the how-to guides for writing and deploying cloud functions here.

6. Cloud Console UI

The UI works well for GCS administration. GCP even has a transfer service for files on S3 buckets on AWS or other S3 buckets elsewhere. One thing that is lacking in the portal currently is object lifecycle management, which is nice for automated archiving to cheaper coldline object storage for infrequently accessed files or files over a certain age in buckets. For now you can only modify object lifecycle via gsutil or via the API. Like most GCP features they start at the function/API level and then make their way into the portal (the way it should be IMO), and I'm fine with that. I expect object lifecycle rules to be implemented in the GCP portal at some point in the future. 😃

GCS UI
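As an example of the gsutil route, a lifecycle config is just a JSON rule set applied to a bucket. This sketch (the bucket name and 365-day age threshold are illustrative) moves objects older than a year to coldline storage:

lifecycle.json:
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 365}
    }
  ]
}

Apply it with:
gsutil lifecycle set lifecycle.json gs://my-bucket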

In summary, I've used a few of the available GCP samples and tutorials to show different ways to get files onto GCS. GCS is flexible, with many ingress options that can be integrated into systems or applications quite easily! In 2017 the use cases for object storage are abundant, and GCP makes it easy to send and receive files in GCS.

Leave a comment with any interesting use cases for GCS that I may have missed or that we should explore. Cheers!

Check my blog for more updates.


Source: https://medium.com/google-cloud/use-cases-and-a-few-different-ways-to-get-files-into-google-cloud-storage-c8dce8f4f25a
