Dive into the world of serverless - GCP Edition
Suraj Kamal
Posted on July 29, 2024
First of all, I would like to extend my gratitude to Rishab for creating the "Cloud Resume API Challenge." This Challenge presents an excellent opportunity to build and deploy a serverless API using various cloud providers of your choice, integrated seamlessly with GitHub Actions for continuous integration and deployment.
This challenge is especially important given the current trends in cloud computing. As more businesses transition to the cloud, knowing how to build and deploy serverless solutions becomes increasingly valuable. Serverless architectures offer scalability, cost-efficiency, and lower maintenance, making them an ideal choice for modern applications.
My Journey with the Cloud Resume API Challenge
Embarking on the Cloud Resume API Challenge, I chose Google Cloud Platform (GCP) as my cloud service provider. The objective was to build a serverless function that fetches resume data from a NoSQL database and returns it in JSON format, all while integrating GitHub Actions for automatic deployment. This journey not only honed my cloud computing skills but also provided valuable insights into the world of serverless architectures.
High-Level Design
This design leverages the power of Google Cloud Platform services to manage API requests, handle authentication, execute backend logic, and retrieve data efficiently. The integration with GitHub and GitHub Actions ensures a seamless development and deployment workflow, enhancing productivity while maintaining high standards of security and performance.
Key Components
User Interaction:
- API Request & Response: Users send API requests and receive responses through the Apigee Proxy.
API Management with Apigee Proxy:
- Apigee Proxy: Acts as a facade for the serverless function, providing a secure and scalable API endpoint for external users. This is crucial because exposing the cloud function directly to the public can overwhelm it, causing downtime and driving up costs.
- Quota Check: Ensures that the API usage complies with defined quotas.
- Token Generation & Assignment: Generates and assigns an ID token to the request URL invoking the cloud function, as the function requires authentication to execute the code.
Cloud Functions:
- Cloud Function: The serverless function that fetches resume data from Firestore and returns it in JSON format. It is triggered by an HTTP request and requires an ID token for authentication.
- Deployment: Cloud Functions are deployed using the gcloud functions deploy command, initiated by GitHub Actions.
Cloud IAM (Identity and Access Management):
- Token Authentication: Ensures that only authorized users can access the cloud function by validating the ID token in the request URL.
Firestore:
- Data Storage: Stores the resume data in a structured format, enabling efficient retrieval by the cloud function.
Development & Deployment Process:
- GitHub Repository: Contains the source code for the cloud function and the GitHub Actions workflow.
- GitHub Actions: Automatically deploys the cloud function whenever changes are pushed to the repository, ensuring a continuous integration and deployment pipeline.
Getting Started
In this guide, I will walk you through the detailed steps required to set up and build your cloud function using Google Cloud Platform. Whether you are a beginner or have some experience with cloud computing, this step-by-step tutorial is designed to help you understand each component involved in the process. By following along, you will gain hands-on experience with serverless architectures, API management, and continuous deployment using GitHub Actions.
Prerequisites:
- Google Cloud Platform account with billing enabled.
- `gcloud` CLI installed and configured with your GCP account.
- Apigee account for managing APIs (Optional): you can use the Cloud Functions URL directly if you don't want to use Apigee.
- GitHub account for version control and CI/CD.
Setup & Deployment:
- Create a new project in GCP.
  - Open the GCP Console and navigate to the project selector.
  - Click on **New Project** and enter a name for your project.
  - Click on **Create** to create the project.
  - Note down the Project ID, as you will need it later.
- Enable billing for the project.
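If you prefer the CLI, the same steps can be done with `gcloud` (the project ID below is a placeholder, not a value from this guide):

```shell
# Create the project and make it the active configuration.
gcloud projects create my-resume-api --name="Cloud Resume API"
gcloud config set project my-resume-api
```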
- Enable APIs.
  - Navigate to the **APIs & Services** section in the GCP Console.
  - Click on **Enable APIs and Services**.
  - Enable the following APIs:
    - Cloud Functions API
    - Cloud Run Admin API
    - Firestore API
    - Cloud Resource Manager API
    - Apigee API (Optional)
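The same APIs can be enabled from the CLI; the service names below are the standard ones for these products:

```shell
gcloud services enable \
  cloudfunctions.googleapis.com \
  run.googleapis.com \
  firestore.googleapis.com \
  cloudresourcemanager.googleapis.com

# Optional, only if you plan to use Apigee:
gcloud services enable apigee.googleapis.com
```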
- Create Service Accounts.
  - Navigate to the **IAM & Admin** section in the GCP Console.
  - Click on **Service Accounts** and create the following service accounts:
    - `firestore-reader`: for reading data from Firestore, to be used by the cloud function. Roles: Cloud Datastore User.
    - `github-actions`: for deploying the cloud function using GitHub Actions. Roles: Cloud Functions Admin, Cloud Run Admin, Service Account User.
    - `apigee-proxy`: for invoking the cloud function using Apigee (Optional). Roles: Cloud Functions Invoker, Cloud Run Invoker.
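As a CLI sketch, creating one of these accounts and granting its role looks roughly like this (the project ID is a placeholder):

```shell
# Create the firestore-reader service account.
gcloud iam service-accounts create firestore-reader

# Grant it the Cloud Datastore User role on the project.
gcloud projects add-iam-policy-binding my-resume-api \
  --member="serviceAccount:firestore-reader@my-resume-api.iam.gserviceaccount.com" \
  --role="roles/datastore.user"
```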
- Create a Firestore Database.
  - Navigate to the **Firestore** section in the GCP Console.
  - Click on **Create Database**.
  - Choose the **Native Mode** and click on **Next**.
  - Select a location for your database and click on **Create**.
- Load Data into Firestore.
  - Create a new collection in Firestore and add the resume data in JSON format.
  - Refer to the schema in `resume.json` for the resume data structure.
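The authoritative schema is the repository's `resume.json`; purely as an illustration, a resume document could be shaped like this (the field names here are hypothetical, not the repo's actual schema):

```json
{
  "name": "Jane Doe",
  "title": "Cloud Engineer",
  "contact": { "email": "jane@example.com" },
  "skills": ["GCP", "Java", "CI/CD"],
  "experience": [
    { "company": "Example Corp", "role": "Engineer", "years": "2021-2024" }
  ]
}
```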
- Create a Cloud Function.
  - Write the cloud function code to fetch data from Firestore and return it in JSON format. Refer to this repository for sample code: `src/main/java/demo/GetResumeData.java`.
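For orientation, here is a minimal sketch of such a function. The repository's `GetResumeData.java` is the reference implementation; the collection and document names below are illustrative assumptions, not taken from the repo:

```java
package demo;

import com.google.cloud.firestore.DocumentSnapshot;
import com.google.cloud.firestore.Firestore;
import com.google.cloud.firestore.FirestoreOptions;
import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import com.google.gson.Gson;

public class GetResumeData implements HttpFunction {
  // Firestore client; picks up credentials from the function's service account.
  private static final Firestore db = FirestoreOptions.getDefaultInstance().getService();
  private static final Gson gson = new Gson();

  @Override
  public void service(HttpRequest request, HttpResponse response) throws Exception {
    // Fetch the resume document ("resumes"/"my-resume" are illustrative names).
    DocumentSnapshot doc = db.collection("resumes").document("my-resume").get().get();
    response.setContentType("application/json");
    response.getWriter().write(gson.toJson(doc.getData()));
  }
}
```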
- Set up GitHub Actions.
  - Create a new repository in GitHub and add the cloud function code.
  - Create a new GitHub Actions workflow to deploy the cloud function whenever changes are pushed to the repository. Refer to this repository for a sample workflow: `.github/workflows/deploy.yml`.
  - You will need to add the following secrets to your repository:
    - `GCP_PROJECT_ID`: The ID of your GCP project.
    - `GCP_REGION`: The region where you want to deploy the cloud function.
    - `GCP_SA_KEY`: The service account key for the `github-actions` service account.
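The `deploy.yml` in the repository is authoritative; as a sketch, a workflow wiring up these secrets could look like the following (the function name, runtime, and entry point are assumptions):

```yaml
name: Deploy Cloud Function
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Authenticate to GCP with the github-actions service account key.
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - uses: google-github-actions/setup-gcloud@v2
      - name: Deploy
        run: |
          gcloud functions deploy get-resume-data \
            --project="${{ secrets.GCP_PROJECT_ID }}" \
            --region="${{ secrets.GCP_REGION }}" \
            --runtime=java17 \
            --trigger-http \
            --entry-point=demo.GetResumeData \
            --no-allow-unauthenticated
```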
- Build and deploy the cloud function.
  - Push the changes to your repository to trigger the GitHub Actions workflow.
  - The workflow will build and deploy the cloud function to GCP using the `gcloud functions deploy` command.
  - Please note that the `--no-allow-unauthenticated` flag is used to enforce authentication for the cloud function, so an ID token must be passed with the request. This token will be generated by the Apigee proxy in the next step. You can remove this flag if you want to allow unauthenticated access.
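The deploy command is along these lines (the function name, runtime, and entry point are illustrative; the repository workflow has the exact invocation):

```shell
gcloud functions deploy get-resume-data \
  --region=<REGION> \
  --runtime=java17 \
  --trigger-http \
  --entry-point=demo.GetResumeData \
  --no-allow-unauthenticated
```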
- Set up Apigee Proxy (Optional) - If you want to use Apigee for API management.
- Create a new API proxy in Apigee and configure it to invoke the cloud function.
- Configure the cloud function URL as the target endpoint for the proxy.
- Add a quota policy to limit the number of requests per minute. For example, you can set the quota to 100 requests per minute.
- Add another policy to generate an ID token and pass it to the cloud function in the request URL. You can read more about generating ID tokens in Apigee here.
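An Apigee quota policy for this looks roughly as follows (the policy name is arbitrary):

```xml
<Quota name="Quota-LimitRequests">
  <!-- Allow at most 100 requests per minute. -->
  <Allow count="100"/>
  <Interval>1</Interval>
  <TimeUnit>minute</TimeUnit>
</Quota>
```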
- How to run it:
- Once the cloud function is deployed, you can access it using the URL generated by the deployment command.
- If you are using Apigee, you can access the cloud function through the Apigee proxy URL.
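If you skip Apigee and call the function directly, you can supply your own identity token from `gcloud` (the URL placeholders follow the classic Cloud Functions pattern; verify the exact URL from the deploy output):

```shell
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  "https://<REGION>-<PROJECT_ID>.cloudfunctions.net/<FUNCTION_NAME>"
```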
Testing:
Demo URL: resume-api
Sample Response:
Sample Quota Exceeded Response:
The quota policy is set to 10 requests per minute in the Apigee proxy. If the quota is exceeded, the user will receive the following response.
How to Test the Cloud Function Locally? - Bonus
By following these steps, you can set up a local environment to test your cloud function, ensuring it works as expected before deploying it to Google Cloud Platform. This process helps in identifying and fixing issues early, making the deployment smoother and more reliable.
- Install the Google Cloud SDK:
  - Download and install the Google Cloud SDK from the official Google Cloud website.
  - Run `gcloud init` to authenticate with your GCP account and configure the SDK.
  - Set the project ID and region using the following commands:

    ```shell
    gcloud config set project <PROJECT_ID>
    gcloud config set run/region <REGION>
    ```
- Run the cloud function locally:
  - Add the necessary dependencies to your `pom.xml` file to enable local testing:

    ```xml
    <dependency>
      <groupId>com.google.cloud.functions.invoker</groupId>
      <artifactId>java-function-invoker</artifactId>
      <version>1.0.0</version>
    </dependency>
    ```
- Create a `Main` function in your project and add the following code to invoke the cloud function locally:

    ```java
    Invoker.main(new String[] { "--target", "demo.GetResumeData", "--port", "8081" });
    ```
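With that in place, one way to run and exercise the function locally (assuming the invoker call above lives in a `demo.Main` class and the exec Maven plugin is available) is:

```shell
# Start the invoker on port 8081...
mvn compile exec:java -Dexec.mainClass=demo.Main

# ...then, in another terminal, hit the local endpoint:
curl http://localhost:8081
```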
Future Enhancements
- Caching: Add caching mechanisms to improve the performance of the cloud function by reducing the number of requests to Firestore.
- Monitoring & Logging: Set up monitoring and logging for the cloud function to track usage, performance, and errors.
- Extend the cloud function functionality: Add more features to the cloud function, such as filtering data based on user input, adding search functionality, etc.
- AI/ML Integration: Integrate AI/ML models to analyze the resume data and provide insights to the users.
By following this guide, you can build a robust and scalable serverless resume API using Google Cloud Platform, ensuring a seamless and efficient deployment process with GitHub Actions.
I have not included the detailed steps for the Apigee setup and proxy development in this blog, as these require a different set of skills and additional development effort. However, if you are interested in learning about the Apigee proxy development process, please let me know in the comments. I will be happy to create a new blog post dedicated to this topic.