Automating Creation of a Google Cloud Storage Bucket, Object, and Cloud Function in GCP with Terraform and GitLab CI/CD
Streamlining Cloud Workflows for Efficient Deployment of Google Cloud Functions
Introduction: In today's cloud-centric world, automating infrastructure provisioning and application deployment is critical for efficient development and operations. In this blog post, we will explore how to automate the creation of a Google Cloud Storage (GCS) bucket, upload an object to it, and deploy a Google Cloud Function using Terraform. Additionally, we will leverage GitLab CI/CD to set up a continuous integration and deployment pipeline, automating the entire process for seamless and reliable cloud workflows.

Prerequisites: Before diving into the implementation, make sure you have the following prerequisites in place:
A Google Cloud Platform (GCP) account with the necessary permissions to create GCS buckets and Cloud Functions.
A GitLab account with a repository set up to manage your Terraform code.
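Deploying a first-generation Cloud Function also relies on a few GCP APIs being enabled in the project. If they are not already on, a minimal sketch with gcloud looks like this (the service names are assumptions for this walkthrough; adjust to your environment):
# Enable the services this walkthrough relies on.
gcloud services enable \
  cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  storage.googleapis.com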
Repo Structure: To maintain a well-organized project, we will follow the directory structure sketched below within our GitLab repository: GitLab-Repo

You can simply clone my public repository: GitLab-Repo
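Based on the CI configuration shown later (which points TF_DIR at ${CI_PROJECT_DIR}/terraform) and the relative source path in main.tf, the expected layout is roughly:
.
├── .gitlab-ci.yml
└── terraform/
    ├── main.tf
    ├── provider.tf
    └── index.zip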
Terraform Configuration: Let's explore the details of each component of our Terraform code:
main.tf: The main.tf file contains the core Terraform configuration, defining the resources and their properties to be provisioned in the Google Cloud environment. In this context, it sets up a Google Cloud Storage (GCS) bucket, uploads an object (index.zip) to it, and deploys a Google Cloud Function.
# Step 1: google_storage_bucket
resource "google_storage_bucket" "gitlab-bucket" {
  name     = "sandbox-cloud-functions-bucket-gitlab"
  location = "us-central1" # Replace with the desired location for your bucket
  # Additional bucket configuration options can be specified here
}

# Step 2: google_storage_bucket_object
resource "google_storage_bucket_object" "object" {
  name   = "objects"
  source = "index.zip" # Use index-v2.zip for generation 2
  bucket = google_storage_bucket.gitlab-bucket.name
}

# Step 3: google_cloudfunctions_function
resource "google_cloudfunctions_function" "function" {
  name                  = "cloud-function-gitlab"
  description           = "This is my first cloud function from terraform script"
  runtime               = "nodejs16"
  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.gitlab-bucket.name
  source_archive_object = google_storage_bucket_object.object.name
  trigger_http          = true
  entry_point           = "helloWorld"
}

# Step 4: google_cloudfunctions_function_iam_member
resource "google_cloudfunctions_function_iam_member" "allow_access_tf" {
  region         = google_cloudfunctions_function.function.region
  cloud_function = google_cloudfunctions_function.function.name
  role           = "roles/cloudfunctions.invoker"
  member         = "allUsers"
}
provider.tf: The provider.tf file specifies the configuration for the Terraform provider, defining the target cloud platform and its necessary details, such as the backend bucket, project ID, region, and zone. It allows Terraform to interact with GCP and manage resources in the designated project.
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "4.58.0"
    }
  }
  backend "gcs" {
    bucket = "your-backend-bucket" // Replace with your backend bucket name
    prefix = "terraform/state"
  }
}

provider "google" {
  project = "your-project-id" // Replace with your project ID
  region  = "us-central1"     // Replace with your desired region
  zone    = "us-central1-c"   // Replace with your desired zone
}
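One thing to keep in mind: the GCS backend bucket must already exist before the first terraform init runs, since Terraform does not create its own state bucket. A minimal sketch for creating it up front, using the placeholder names from provider.tf:
# Create the backend bucket once, before the first terraform init (names are placeholders).
gsutil mb -p your-project-id -l us-central1 gs://your-backend-bucket
# Optional: keep previous state versions around for easier recovery.
gsutil versioning set on gs://your-backend-bucket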
index.zip: This archive contains the code and configuration files needed to deploy a simple Google Cloud Function on the Node.js runtime. The index.js file defines a Cloud Function named "helloWorld" that responds to HTTP requests with a customizable message: it checks for a message parameter in the request's query string or body and sends the provided message, or 'Hello World!' if none is given. The package.json file provides metadata for the Cloud Function, including its name and version.
index.js:
/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
exports.helloWorld = (req, res) => {
  let message = req.query.message || req.body.message || 'Hello World!';
  res.status(200).send(message);
};
package.json:
{
  "name": "sample-http",
  "version": "0.0.1"
}
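Before these files can be uploaded, they need to be packaged into index.zip with both files at the root of the archive (not inside a subfolder). A minimal sketch, assuming you run it from the directory containing index.js and package.json:
# Package the function source for the google_storage_bucket_object resource.
zip index.zip index.js package.json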
GitLab CI/CD Configuration: The .gitlab-ci.yml file sets up the CI/CD pipeline that automates the infrastructure deployment process. It defines stages, jobs, and associated scripts to perform tasks such as validating, planning, applying, and destroying Terraform changes.
---
workflow:
  rules:
    - if: $CI_COMMIT_BRANCH != "main" && $CI_PIPELINE_SOURCE != "merge_request_event"
      when: never
    - when: always

variables:
  TF_DIR: ${CI_PROJECT_DIR}/terraform
  STATE_NAME: "gitlab-terraform-gcp-tf"

stages:
  - validate
  - plan
  - apply
  - destroy

image:
  name: hashicorp/terraform:light
  entrypoint: [""]

before_script:
  - terraform --version
  - cd ${TF_DIR}
  - terraform init -reconfigure

validate:
  stage: validate
  script:
    - terraform validate
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - ${TF_DIR}/.terraform
    policy: pull-push

plan:
  stage: plan
  script:
    - terraform plan
  dependencies:
    - validate
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - ${TF_DIR}/.terraform
    policy: pull

apply:
  stage: apply
  script:
    - terraform apply -auto-approve
  dependencies:
    - plan
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - ${TF_DIR}/.terraform
    policy: pull

destroy:
  stage: destroy
  script:
    - terraform destroy -auto-approve
  dependencies:
    - plan
    - apply
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - ${TF_DIR}/.terraform
    policy: pull
  when: manual
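If you want to reproduce what the pipeline does from a local shell (for example, while debugging), the rough equivalent looks like this; the key.json path and the terraform/ directory are assumptions based on the configuration above:
# Authenticate the Google provider locally (path to your service account key is assumed).
export GOOGLE_APPLICATION_CREDENTIALS=key.json
cd terraform
terraform init -reconfigure
terraform validate
terraform plan
terraform apply -auto-approve
# The destroy stage is manual in the pipeline; the local equivalent would be:
# terraform destroy -auto-approve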
Implementation Steps: Now that we have our code and pipeline set up, let's walk through the implementation steps to automate the creation of a Google Cloud Storage (GCS) bucket, upload an object, and deploy a Google Cloud Function using Terraform and GitLab CI/CD.
Set up GitLab Repository: Create a new repository on GitLab or use an existing one to host your Terraform code. If you haven't already, clone the repository from the following link: GitLab-Repo
Configure GCP Provider: In the provider.tf file, configure the GCP provider by specifying your GCP backend bucket, project ID, region, and zone.
Set Secrets in GitLab: In your GitLab repository, navigate to Settings > CI/CD > Variables. Add a new variable named "GOOGLE_CREDENTIALS" and paste the contents of your Google Cloud service account key file into the value field. This securely provides the necessary credentials for Terraform to authenticate with GCP.

Note: Make sure to remove any stray whitespace from the key content before pasting it.
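If you still need to generate a key file, one way is with gcloud; the service account name below is hypothetical, and the account needs sufficient permissions on Cloud Storage and Cloud Functions:
# Create a JSON key for an existing service account (the account name is hypothetical).
gcloud iam service-accounts keys create key.json \
  --iam-account=terraform-sa@your-project-id.iam.gserviceaccount.com
# Paste the contents of key.json into the GOOGLE_CREDENTIALS variable.
cat key.json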
Upload index.zip: Add the index.zip file, containing the files needed for the Google Cloud Function deployment, to your GitLab repository.
Run the Pipeline: Commit and push your Terraform code, along with the index.zip file, to the GitLab repository. This action will trigger the GitLab CI/CD pipeline. Monitor the pipeline execution in the CI/CD section of your repository to ensure it completes successfully.
Verify Resource Creation in GCP: After the pipeline finishes, verify the creation of the GCS bucket, the object upload, and the Google Cloud Function deployment in the Google Cloud Platform (GCP) Console. Ensure that the resources have been provisioned as expected.
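Beyond the Console, you can also spot-check the resources from the command line. A quick sketch, assuming the default URL format for first-generation HTTP functions and the resource names used above:
# Confirm the bucket and uploaded object exist.
gsutil ls gs://sandbox-cloud-functions-bucket-gitlab
# Inspect the deployed function, including its HTTPS trigger URL.
gcloud functions describe cloud-function-gitlab --region=us-central1
# Invoke it over HTTP (unauthenticated access is allowed via the allUsers binding).
curl "https://us-central1-your-project-id.cloudfunctions.net/cloud-function-gitlab?message=Hello%20from%20Terraform"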
Conclusion: In this blog post, we successfully automated the creation of a Google Cloud Storage (GCS) bucket, uploaded an object, and deployed a Google Cloud Function using Terraform and GitLab CI/CD. By following the steps outlined above, you can now efficiently manage and automate your GCP resources. Remember to regularly update your Terraform code and pipeline to reflect any changes in your cloud infrastructure requirements. By combining Terraform and GitLab CI/CD, you can streamline cloud workflows, improve consistency, and minimize manual intervention. Happy automating!
References: GitLab-Repo