File Storage

This page describes how to configure File Storage. For more details on how ScaffoldHub secures file uploads and downloads, please read Architecture > File Storage.

Localhost

By default, ScaffoldHub saves uploaded files in the server's temp directory.

If you want to continue using the localhost strategy, change the location to a persisted folder.

To do that, go to backend/src/services/file/localhostFileStorage.ts and replace this variable:

/**
 * The directory where the files should be uploaded.
 * Change this to a persisted folder.
 */
const UPLOAD_DIR = os.tmpdir();
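
For example, here is a minimal sketch of a persisted location, assuming an uploads folder in the project root; the folder name is only an illustration, and the process must have write permission to it:

import path from 'path';

/**
 * The directory where the files should be uploaded.
 * Example of a persisted folder: an "uploads" directory
 * in the project root instead of the OS temp directory.
 */
const UPLOAD_DIR = path.join(process.cwd(), 'uploads');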

Cloud

If you plan on hosting your server in the cloud, it is better to use a cloud file storage provider.

ScaffoldHub has two built-in cloud strategies: Google Cloud Storage and Amazon S3.

Google Cloud Storage

To use Google Cloud Storage, change the FILE_STORAGE_PROVIDER variable in the backend/.env file to gcp.

# File Storage Provider
# You must add the credentials of the provider at the
# keys/storage directory.   
# localhost
# gcp (Google Cloud Platform)
# aws (Amazon Web Services)
FILE_STORAGE_PROVIDER = "gcp"

Project and Bucket

Go to https://cloud.google.com/ and create an account.

Create a new project for the development environment.

Go to Storage > Browser and create a new Bucket.

Save the bucket name in the FILE_STORAGE_BUCKET variable. The value must be your bucket name.

In this example, it is scaffoldhub-doc-file-storage.appspot.com.

# Bucket used for file storage
# Only for GCP and AWS
FILE_STORAGE_BUCKET="scaffoldhub-doc-file-storage.appspot.com"

The service key

Now the app needs a service account key to be able to access the bucket.

Create a service account key that has permission to manage Google Cloud Storage buckets. Follow this: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys.
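
If you prefer the command line, the Cloud SDK can also generate the key file, for example with gcloud iam service-accounts keys create key.json --iam-account=<service-account-email> (the service account email here is a placeholder for your own).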

After you download the JSON file, you will see that it looks like this:

{
  "type": "service_account",
  "project_id": "...",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "...",
  "client_id": "...",
  "auth_uri": "...",
  "token_uri": "...",
  "auth_provider_x509_cert_url": "...",
  "client_x509_cert_url": "..."
}

Now place the entire contents of this file into a single environment variable, on a single line.

If you use VSCode, you can use the Join Lines command to collapse the JSON into one line.

Now place this line in the GOOGLE_CLOUD_PLATFORM_CREDENTIALS variable:

GOOGLE_CLOUD_PLATFORM_CREDENTIALS = { "type": "service_account", "project_id": "...", "private_key_id": "...", "private_key": "...", "client_email": "...", "client_id": "...", "auth_uri": "...", "token_uri": "...", "auth_provider_x509_cert_url": "...", "client_x509_cert_url": "..." }
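
For reference, credentials stored as a single line like this can be read back in Node.js with JSON.parse. The sketch below only illustrates that pattern; the variable name matches the one above, but the way ScaffoldHub consumes it internally may differ:

// Sketch only: read the single-line JSON credentials from the
// environment and parse them back into an object.
const raw = process.env.GOOGLE_CLOUD_PLATFORM_CREDENTIALS || '{}';
const credentials = JSON.parse(raw);

// The parsed object exposes the same fields as the key file,
// e.g. project_id and client_email.
console.log(credentials.project_id, credentials.client_email);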

CORS

Install the Google Cloud SDK: https://cloud.google.com/sdk.

Sign in to your account by running gcloud auth login.

Create a gcp-cors.json file in the folder you are currently in at the command line.

[
  {
    "maxAgeSeconds": 3600,
    "method": ["GET", "HEAD", "POST", "PUT"],
    "origin": ["*"],
    "responseHeader": [
      "Content-Type",
      "Access-Control-Allow-Origin"
    ]
  }
]

Run this command. Make sure you replace your-bucket-name with the name of the bucket you created!

gsutil cors set gcp-cors.json gs://your-bucket-name
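
To confirm that the configuration was applied, you can run gsutil cors get gs://your-bucket-name (again replacing your-bucket-name); it should print back the JSON you just set.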

Done! You are now ready to use Google Cloud Storage in your project.

Amazon S3

To use Amazon S3, change the FILE_STORAGE_PROVIDER variable in the backend/.env file to aws.

# File Storage Provider
# You must add the credentials of the provider at the
# keys/storage directory.   
# localhost
# gcp (Google Cloud Platform)
# aws (Amazon Web Services)
FILE_STORAGE_PROVIDER = "aws"

Project and Bucket

Go to https://aws.amazon.com/console/ and create an account.

Go to Services > S3 and create a new bucket.

Make sure you do not block public access to files. Files such as the workspace background image, workspace logo, and user avatar are stored with public permissions for performance reasons. All other files are private by default.

Save the bucket name in the FILE_STORAGE_BUCKET variable.

# Bucket used for file storage
# Only for GCP and AWS
FILE_STORAGE_BUCKET="scaffoldhub-doc-file-storage"

The AWS credentials

Create your AWS credentials by following this tutorial: https://aws.amazon.com/blogs/security/wheres-my-secret-access-key/.

Save the keys in the backend/.env file.

# Only needed if using aws as the File storage provider
AWS_ACCESS_KEY_ID=""
AWS_SECRET_ACCESS_KEY=""
AWS_REGION="

CORS

  • Go to the Amazon S3 console.

  • Open your bucket and go to Permissions > CORS configuration.

  • Add this configuration:

[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "PUT",
            "POST",
            "DELETE",
            "GET"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": []
    }
]

Done! You are now ready to use Amazon S3 storage in your project.
