How to use Google Secret Manager to improve data security

Learn how to use Google Secret Manager to create secure environment variables that hold your sensitive credentials, so you can avoid storing them in your scripts.

Picture by James Sutton, Unsplash.

Google Cloud Functions make it easy to deploy Python data science applications and models in the cloud as serverless applications. Since it’s inevitable that these applications need to access sensitive project credentials, such as server locations, usernames, and passwords, which may change from time to time, you’ll need a way to store these values securely. This is where Google Secret Manager comes in.

Google Secret Manager is a Google Cloud Platform application that allows you to store secret or sensitive data, such as usernames, passwords, API keys, or encryption keys, and access them from your GCP projects, such as Google Cloud Functions.

Google Secret Manager allows you to create and store secrets, add labels, monitor their usage, and edit the details and update them seamlessly across your projects. Here’s how you can get started using it in your data science projects.

Enable Secret Manager in the GCP console

Like other applications in GCP, there are various ways to set up and administer Secret Manager. You can enable it from the Google Cloud Console front end, or you can do it from the Cloud Shell. Open Cloud Shell and enter the command below to enable Secret Manager.

gcloud services enable secretmanager.googleapis.com

Create your secrets

As a simple example, we’ll use the Cloud Shell to create some secrets to store our MySQL username and password. We’ll also attach a platform=mysql label as a key=value pair to each secret to help keep them organised. Run the commands below, then go to Secret Manager in the Google Cloud Console and you should see the credentials stored in your project.

printf "root" | gcloud secrets create mysql_username \
    --data-file=- \
    --replication-policy=automatic \
    --labels=platform=mysql

printf "SecretPasswordHere" | gcloud secrets create mysql_password \
    --data-file=- \
    --replication-policy=automatic \
    --labels=platform=mysql

You can check that your secrets have been stored correctly using the commands below in Cloud Shell.

gcloud secrets versions access 1 --secret="mysql_username"
gcloud secrets versions access 1 --secret="mysql_password"

Grant permissions to your Service Account

Before you can deploy your function, you’ll need to grant its Service Account the appropriate permissions to access your secrets. To do this you’ll need the email address for the project’s Service Account, which you can find in the Cloud Console under IAM & Admin > Service Accounts. Run the two commands below, substituting your Service Account’s email address, to set up access.

gcloud secrets add-iam-policy-binding mysql_username \
    --member "serviceAccount:your-service-account@your-project.iam.gserviceaccount.com" \
    --role roles/secretmanager.secretAccessor

gcloud secrets add-iam-policy-binding mysql_password \
    --member "serviceAccount:your-service-account@your-project.iam.gserviceaccount.com" \
    --role roles/secretmanager.secretAccessor

Create a Google Cloud Function

Finally, we’re going to create a Google Cloud Function Python script which is going to access the variables stored in Google Secret Manager. Go to Cloud Functions in the Google Cloud Console and create a new function:

  • Function name: secret_example
  • Region: europe-west2
  • Trigger type: HTTP
  • Authentication: Allow unauthenticated invocations
  • Runtime: Python 3.8
  • Entry point: get_secret

In your function’s code you need to import the secretmanager package from google.cloud, then create a client object using client = secretmanager.SecretManagerServiceClient(). We’ll then pass a dictionary containing the project and secret details to the service client, decode the payload and assign it to a variable. This simple dummy script just returns the username.

from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()

project_id = "api-project-25998202202"

def get_secret(request):
    response = client.access_secret_version(
        {"name": "projects/" + project_id + "/secrets/mysql_username/versions/latest"}
    )
    mysql_username = response.payload.data.decode("UTF-8")
    return mysql_username
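The name passed to access_secret_version always follows the same projects/…/secrets/…/versions/… pattern, so it can be convenient to build it with a small helper. This is a hypothetical convenience function, not part of the script above:

```python
def secret_version_name(project_id, secret_id, version="latest"):
    """Build the fully qualified resource name for a secret version."""
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

print(secret_version_name("api-project-25998202202", "mysql_username"))
# prints projects/api-project-25998202202/secrets/mysql_username/versions/latest
```

Using "latest" always fetches the most recent version of the secret, while passing a version number such as "1" pins the function to a specific value.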

Since we’re using the Python package google-cloud-secret-manager, we need to add it to the requirements.txt file so the package is installed when the function is deployed.
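For this example, the requirements.txt file only needs a single line (you could also pin a specific version if you want reproducible deployments):

```
google-cloud-secret-manager
```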


Run the Cloud Function

You can test your function either by accessing its URL to trigger it, or by triggering it from the Cloud Shell using the command below. You’ll usually need to specify the region in which you created your function, otherwise you’ll receive an error telling you the function does not exist in the default region.

gcloud functions call secret_example --region europe-west2

Matt Clarke, Thursday, March 04, 2021

Matt Clarke is an Ecommerce and Marketing Director who uses data science to help in his work. Matt has a Master's degree in Internet Retailing (plus two other Master's degrees in different fields) and specialises in the technical side of ecommerce and marketing.