- Navigate to the Cloud Storage Console
- Click "Create" to create a new bucket
- Give your bucket a name and note it down for later
- Choose "Region" as the location type to store the data at low cost (europe-west1 (Belgium) is preferred)
- Leave the storage class at its default
- Uncheck the box for "Enforce public access prevention on this bucket"
- Uncheck "Soft delete policy"
- Click "Create". Once the bucket is created, navigate to the "Permissions" tab
- Under Permissions, click "Grant access", add allUsers as a new principal, assign the Storage Object Viewer role, and click "Save"
- A popup will appear; click "Allow Public Access"
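The console steps above can also be done from the command line. A rough equivalent with the gcloud CLI is sketched below (this is an illustration, not part of the guide; `my-bucket` is a placeholder, and the commands require an authenticated gcloud session with the right project selected):

```shell
# Create the bucket in europe-west1 with the default storage class
gcloud storage buckets create gs://my-bucket --location=europe-west1

# Disable public access prevention so objects can be made public
gcloud storage buckets update gs://my-bucket --no-public-access-prevention

# Grant allUsers the Storage Object Viewer role (makes objects publicly readable)
gcloud storage buckets add-iam-policy-binding gs://my-bucket \
    --member=allUsers --role=roles/storage.objectViewer
```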
- Navigate to Service Accounts (from the GCP Console, choose "IAM & Admin", then select the "Service Accounts" tab)
- Select your project if it's not already selected, then click "Create Service Account" and give it a name
- IMPORTANT! Grant three roles to the service account:
  - Storage Admin
  - Storage Object Admin
  - Storage Object Creator
- Leave the "Grant User Access" section empty and click "Done"
- From the Service Accounts tab, select your service account, choose "Actions", then click "Manage Keys"
- Click "Add Key", then "Create new key", choose JSON, and click "Create"
- A JSON file with your credentials will be downloaded; store it somewhere safe
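As a quick sanity check on the downloaded key, you can confirm it looks like a service-account key before using it. A minimal sketch (this helper is not part of the guide; the field names checked are the standard ones found in Google service-account JSON keys):

```python
import json

def check_service_account_key(path):
    """Return a sorted list of required key fields missing from the JSON file."""
    with open(path) as f:
        info = json.load(f)
    # Standard fields present in a Google service-account key file
    required = {"type", "project_id", "private_key", "client_email"}
    return sorted(required - info.keys())
```

`check_service_account_key("key.json")` returning an empty list means all required fields are present.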
These steps assume the Google Cloud SQL database has been set up and restored following the guide.
- Connect to the database instance:
psql -h <hostname> -U postgres -d ovatify
- Extract the song IDs and image URLs to a CSV file:
psql -h <hostname> -U postgres -d ovatify -c "COPY (SELECT id, img_url FROM songs_song) TO STDOUT WITH (FORMAT CSV, HEADER);" > "\path\to\file\song_data.csv"
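The exported CSV has two columns, id and img_url, with a header row. A small sketch for reading it back, useful to verify the export before moving on (the helper name is an illustration, not part of the guide):

```python
import csv

def read_song_data(csv_path):
    """Read the (id, img_url) rows exported by COPY ... WITH (FORMAT CSV, HEADER)."""
    with open(csv_path, newline="") as f:
        return [(row["id"], row["img_url"]) for row in csv.DictReader(f)]
```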
- Make sure you have your CSV file with the song data and your JSON credentials file
- From the GCP Console, note the name of the bucket you created and the ID of the project you are working on
- Provide the project ID and bucket name to the Python script in this directory by changing the corresponding variables in image-bucket-upload.py
- Make sure the script can find your JSON and CSV files (put them in the same folder as the Python script)
- Run the script. It uploads the images under a folder named "images" to the bucket you provided. Change destination_blob_name to change or remove the directory in Cloud Storage.
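For reference, the upload step presumably looks roughly like the sketch below; image-bucket-upload.py itself is the authoritative version, and the function name, its parameters, and the local images/ layout of <id>.jpg files are assumptions here. In practice, client would be google.cloud.storage.Client.from_service_account_json("key.json"):

```python
import csv
import os

def upload_images(client, bucket_name, csv_path, images_dir="images"):
    """Upload images/<id>.jpg for every song id in the CSV to the bucket.

    The destination path mirrors destination_blob_name in the real script:
    objects land under an "images/" prefix in Cloud Storage.
    """
    bucket = client.bucket(bucket_name)
    uploaded = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            local_path = os.path.join(images_dir, f"{row['id']}.jpg")
            blob = bucket.blob(f"images/{row['id']}.jpg")
            blob.upload_from_filename(local_path)
            uploaded.append(blob.name)
    return uploaded
```

Passing the client in as a parameter keeps the upload logic testable without real credentials.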
- Connect to your database:
psql -h <hostname> -U postgres -d ovatify
- Run the following query. IMPORTANT! Replace the bucket-name part of the URL with your actual bucket name. The query also assumes your images are under the folder named "images"; if you changed the directory when running the Python script, adjust the path to match your bucket config.
UPDATE songs_song SET img_url = 'https://storage.cloud.google.com/bucket-name/images/' || id || '.jpg';
This updates the img_url field of every song to point at your bucket.
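The URL written by the UPDATE query above follows a fixed pattern. A tiny helper reproducing it, handy for spot-checking a few rows after the update (the helper is an illustration; substitute your real bucket name):

```python
def song_image_url(bucket_name, song_id):
    """Build the public URL the UPDATE query writes into img_url."""
    return f"https://storage.cloud.google.com/{bucket_name}/images/{song_id}.jpg"
```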