Building and Testing the Gemini API with CI/CD Pipeline

Nirajan Mahara

Posted on November 26, 2024


In today’s fast-paced world of software development, efficiency, reliability, and scalability are key to delivering high-quality applications. One way to achieve this is by automating the software development lifecycle using Continuous Integration and Continuous Deployment (CI/CD). This technical blog explores the process of building, testing, and deploying a Gemini API using Flask and Docker, while also integrating a CI/CD pipeline using GitHub Actions.

Introduction

The Gemini API serves as a backend interface for a chatbot application that generates responses to specific prompts. This API is designed using Flask and interacts with the Google Gemini AI model, configured with specific instructions to handle particular domains of queries. By implementing a CI/CD pipeline, we can automate the build, testing, and deployment processes, ensuring a smoother and more reliable workflow for future updates to the application.

In this blog, I will walk through the setup and functionality of the Gemini API, how we tested it, and how we integrated it with GitHub Actions to automate the deployment process. Additionally, we'll explore how CI/CD enhances the efficiency of software delivery and helps catch potential issues early.

Setting Up the Gemini API

The core of the project is the Gemini API, a Flask-based web application that processes user input and generates responses using Google’s Gemini API. The API interacts with the Gemini AI model, which is specifically configured to provide responses related to DevOps, Flask, Docker, and GitHub CI/CD pipelines.

Here’s an overview of the functionality:

  • User Input (Prompt Handling): When the user sends a request to the /chat endpoint with a prompt (e.g., "How do I set up Docker for Flask?"), the API retrieves the prompt and sends it to the Gemini AI model (a sketch of this route appears after the gemini_api.py snippet below).
  • Model Configuration: The model is configured with a system instruction that restricts the responses to a specific domain of knowledge (DevOps-related topics, in this case). This ensures that the chatbot only provides relevant answers.
  • Generating Content: The generate_content() function sends the prompt to the Gemini AI model and returns the response.

Here’s the relevant code snippet for generating content in gemini_api.py:

import google.generativeai as genai
import os
from dotenv import load_dotenv

# Load environment variables from the .env file (expects GEMINI_API_KEY)
load_dotenv()

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

# System instruction that restricts responses to DevOps-related topics
instructions = """
You are a DevOps Engineer, you must provide details about how to build, test, and deploy a flask application
in GitHub. You can answer questions about flask in Python, GitHub CI/CD pipeline and docker container.
Any question outside of this topic must be answered with a response as 'I can only answer questions about flask,
GitHub CI/CD pipeline, and Docker container.'
Format your answer using markdown formatting and try to provide reasoning and details about your answer.
"""

model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",
    system_instruction=instructions
)

def generate_content(user_prompt):
    # Send the user's prompt to the model and return the text of its reply
    response = model.generate_content(user_prompt)
    return response.text

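The Flask application itself lives in main.py (the file the Dockerfile later starts with python main.py). That file isn't reproduced in this post, so here is a minimal sketch of how the /chat route could wrap generate_content(); the route name comes from the description above, while the JSON shape ({"prompt": ...}) is an assumption for illustration:

# main.py -- minimal sketch of the Flask app exposing the /chat endpoint.
# The JSON field name ("prompt") is an assumption for illustration, not a
# verbatim copy of the project's main.py.
from flask import Flask, request, jsonify
from gemini_api import generate_content

app = Flask(__name__)

@app.route("/chat", methods=["POST"])
def chat():
    data = request.get_json(silent=True) or {}
    prompt = data.get("prompt")
    if not prompt:
        return jsonify({"error": "Missing 'prompt' in request body"}), 400
    return jsonify({"response": generate_content(user_prompt=prompt)})

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable from outside the Docker container,
    # and use port 5001 to match the port exposed in the Dockerfile below.
    app.run(host="0.0.0.0", port=5001)

Binding to 0.0.0.0 matters once the app runs inside Docker; binding to 127.0.0.1 would make the published port unreachable from the host.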

Testing the Gemini API

Testing is a critical aspect of ensuring that the Gemini API works correctly and reliably. For this project, we employed both unit tests and integration tests.

  • Unit Testing: We wrote unit tests to verify that the generate_content() function handles different types of prompts correctly. For example, we tested the API's response to valid prompts (e.g., "How do I deploy a Flask app with GitHub Actions?") and invalid prompts (e.g., "Tell me about quantum mechanics.").
  • Integration Testing: In addition to unit tests, we wrote integration tests to ensure that the Flask routes behave as expected. For example, we tested the /chat route to ensure it responds correctly to POST requests (a sketch appears after the unit-test example below).

Here’s an example of the test for a valid prompt in tests/test_gemini_api.py:

import unittest
from gemini_api import generate_content
from dotenv import load_dotenv


class TestGeminiAPI(unittest.TestCase):
    # Load the environment variables before each test.
    # gemini_api.py already calls load_dotenv() on import, so this is effectively redundant here.
    def setUp(self):
        load_dotenv("../.env")

    def test_valid_prompt_1(self):
        # Test for a valid prompt within the topic
        prompt = "How do I deploy a Flask app with GitHub Actions?"
        response = generate_content(user_prompt=prompt)
        # print(response)
        # check if "Flask" is part of the response
        self.assertIn("Flask", response)
        # check if "GitHub" is part of the response
        self.assertIn("GitHub", response)

    def test_invalid_prompt_1(self):
        # Test for a prompt outside the allowed topics
        prompt = "Tell me about quantum mechanics."
        response = generate_content(user_prompt=prompt)
        print(response)
        # check if generic response is part of the response
        self.assertIn("I can only answer questions about flask, GitHub CI/CD pipeline, and Docker container", response)
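
For the integration tests, Flask's built-in test client can exercise the /chat route without starting a real server. The snippet below is a sketch only: it assumes main.py exposes a Flask app object named app and that the route accepts JSON with a prompt field, as in the earlier sketch:

# tests/test_routes.py -- illustrative integration test for the /chat route.
# Assumes main.py exposes `app` and that /chat accepts {"prompt": ...} JSON.
import unittest
from main import app


class TestChatRoute(unittest.TestCase):
    def setUp(self):
        self.client = app.test_client()

    def test_chat_post_valid_prompt(self):
        payload = {"prompt": "How do I deploy a Flask app with GitHub Actions?"}
        result = self.client.post("/chat", json=payload)
        self.assertEqual(result.status_code, 200)
        self.assertIn("response", result.get_json())

    def test_chat_post_missing_prompt(self):
        result = self.client.post("/chat", json={})
        self.assertEqual(result.status_code, 400)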

Screenshot: API request using Postman with a valid prompt

Screenshot: API request using Postman with an invalid prompt

We also implemented tests for invalid prompts to ensure that the API returns the correct message when the user asks about topics outside the allowed scope.

Setting Up CI/CD with GitHub Actions

CI/CD helps automate repetitive tasks such as building Docker images, running tests, and deploying the application. In this section, we will walk through the configuration of our CI/CD pipeline using GitHub Actions.

Building the Docker Image

We first create Docker images to containerize the application. A Dockerfile defines the steps to build the application image. The Dockerfile installs dependencies, copies the code into the container, and starts the Flask application:

# Use the official Python 3.11 slim image as the base image
FROM python:3.11-slim

# Set the working directory in the container
WORKDIR /

# Copy the contents of the current directory into the container
COPY . .

# Install the dependencies specified in the requirements.txt
RUN pip install -r requirements.txt

# Expose port 5001 (since we're running our app on this port)
EXPOSE 5001

# Set the default command to run when the container starts
CMD ["python", "main.py"]


This Dockerfile will:

  1. Build the image using Python 3.11 slim.
  2. Set the working directory to / and copy all the project files into the container.
  3. Install the dependencies listed in requirements.txt.
  4. Expose port 5001 to make the Flask app accessible from outside the container.
  5. Set the default command to run the Flask app using python main.py.
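
The requirements.txt file referenced here isn't shown in this post, but based on the imports used above it would need at least the following packages (versions unpinned here; pin them as you prefer):

flask
google-generativeai
python-dotenv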

Screenshot: docker build output for the main image

The DockerfileTest is used for testing purposes. It installs dependencies and runs unit tests:

# Use the official Python 3.11 slim image as the base image
FROM python:3.11-slim

# Set the working directory inside the container
WORKDIR /

# Copy the entire current directory into the container
COPY . .

# Install the dependencies specified in requirements.txt
RUN pip install -r requirements.txt

# Set the default command to run when the container starts: run unit tests
CMD ["python", "-m", "unittest", "discover", "-s", "tests"]

This DockerfileTest will:

  1. Build the image using Python 3.11 slim.
  2. Set the working directory to /.
  3. Copy all files from our project into the container.
  4. Install the dependencies listed in requirements.txt.
  5. By default, run unit tests from the tests directory using the unittest discover command when the container starts.

Screenshot: docker build output for the test image (DockerfileTest)

Configuring GitHub Actions for Continuous Integration and Deployment

GitHub Actions is a powerful tool for automating workflows in our repository. By configuring actions, we can automate tasks like testing, building, and deploying our code each time we push a change to our repository. Here, we will discuss how to configure GitHub Actions in the context of building, testing, and deploying Docker images using a simple CI/CD pipeline.

Overview of the Pipeline

In this case, we define a workflow in .github/workflows/pipeline.yml. This pipeline automates the process of building Docker images, running tests on them, and deploying them to a development environment. The pipeline consists of three main jobs:

  1. Build Job: This job is responsible for building the Docker images. It builds both the main application image and the test image, which will be used in subsequent jobs.

  2. Test Job: This job is responsible for running unit tests on the test Docker image. The test image runs tests against the application to ensure the code is functioning as expected.

  3. Deploy Job: After the tests are successfully run, the deploy job will deploy the main application Docker image to a development environment.

Structure of the GitHub Actions Workflow

Let's break down the pipeline configuration:

on:
  push:
    branches:
      - main

env:
  IMAGE_TAG_MAIN: geminiapi
  IMAGE_TAG_TEST: geminiapitest
  IMAGE_VERSION: v1.0.0

jobs:
  build_job:
    name: Build Docker Image
    runs-on: gemini-win-runner
    steps:
      - name: Clone and checkout
        uses: actions/checkout@v2

      - name: Build Main Image
        run: docker build -t ${{ env.IMAGE_TAG_MAIN }}:${{ env.IMAGE_VERSION }} -f Dockerfile .

      - name: Build Test Image
        run: |
          mv .dockerignore .dockerignore.temp
          docker build -t ${{ env.IMAGE_TAG_TEST }}:${{ env.IMAGE_VERSION }} -f DockerfileTest .
          mv .dockerignore.temp .dockerignore

  test_job:
    name: Test Docker Image
    runs-on: gemini-win-runner
    environment: development
    needs: build_job
    steps:
      - name: Unit testing
        run: |
          docker run --rm \
            -e GEMINI_API_KEY=${{ secrets.GEMINI_API_KEY }} \
            ${{ env.IMAGE_TAG_TEST }}:${{ env.IMAGE_VERSION }}

  deploy_job:
    name: Deploy Docker Image
    runs-on: gemini-win-runner
    environment: development
    needs: test_job
    steps:
      - name: Remove Running Containers
        run: docker rm -f ${{ env.IMAGE_TAG_MAIN }} || true

      - name: Deploy Main Image
        run: docker run -d -p 5001:5001 -e GEMINI_API_KEY=${{ secrets.GEMINI_API_KEY }} --name ${{ env.IMAGE_TAG_MAIN }} ${{ env.IMAGE_TAG_MAIN }}:${{ env.IMAGE_VERSION }}

Breakdown of the Pipeline Configuration

1. Triggering the Pipeline

The pipeline is triggered on a push event to the main branch. This means that every time a commit is pushed to the main branch, this workflow will automatically start.

on:
  push:
    branches:
      - main

2. Environment Variables

We define some environment variables that are used throughout the workflow:

  • IMAGE_TAG_MAIN: Tag for the main Docker image.
  • IMAGE_TAG_TEST: Tag for the test Docker image.
  • IMAGE_VERSION: Version of the Docker images (in this case, v1.0.0).

env:
  IMAGE_TAG_MAIN: geminiapi
  IMAGE_TAG_TEST: geminiapitest
  IMAGE_VERSION: v1.0.0

3. Build Job

The Build Job is responsible for building both the main and test Docker images. This job runs on a self-hosted runner labeled gemini-win-runner. It consists of the following steps:

  1. Checkout Code: The first step uses the actions/checkout@v2 action to clone the repository to the runner.
   - name: Clone and checkout
     uses: actions/checkout@v2
  2. Build Main Image: The second step builds the main Docker image using the Dockerfile. The image is tagged using the IMAGE_TAG_MAIN and IMAGE_VERSION environment variables.
   - name: Build Main Image
     run: docker build -t ${{ env.IMAGE_TAG_MAIN }}:${{ env.IMAGE_VERSION }} -f Dockerfile .
  3. Build Test Image: In the third step, the .dockerignore file is temporarily renamed so that files it would normally exclude from the build context are available when building the test image. The test image is built using the DockerfileTest, and then the .dockerignore file is restored to its original state.
   - name: Build Test Image
     run: |
       mv .dockerignore .dockerignore.temp
       docker build -t ${{ env.IMAGE_TAG_TEST }}:${{ env.IMAGE_VERSION }} -f DockerfileTest .
       mv .dockerignore.temp .dockerignore

4. Test Job

The Test Job runs the unit tests inside the test Docker image. The docker run command executes the test image, passing the GEMINI_API_KEY from GitHub Secrets as an environment variable. This job also runs on the self-hosted gemini-win-runner and targets the development environment; needs: build_job ensures the tests only run once the images have been built.

test_job:
  name: Test Docker Image
  runs-on: gemini-win-runner
  environment: development
  needs: build_job
  steps:
    - name: Unit testing
      run: |
        docker run --rm \
          -e GEMINI_API_KEY=${{ secrets.GEMINI_API_KEY }} \
          ${{ env.IMAGE_TAG_TEST }}:${{ env.IMAGE_VERSION }}

5. Deploy Job

The Deploy Job deploys the main Docker image to a development environment. It performs the following steps:

  1. Remove Running Containers: This step ensures that any previously running container with the same name (geminiapi) is removed before starting the new container.
   - name: Remove Running Containers
     run: docker rm -f ${{ env.IMAGE_TAG_MAIN }} || true
  2. Deploy Main Image: The main Docker image is then deployed, running the container in detached mode (-d) and mapping port 5001 on the host to 5001 in the container. The GEMINI_API_KEY is passed as an environment variable for the container.
   - name: Deploy Main Image
     run: docker run -d -p 5001:5001 -e GEMINI_API_KEY=${{ secrets.GEMINI_API_KEY }} --name ${{ env.IMAGE_TAG_MAIN }} ${{ env.IMAGE_TAG_MAIN }}:${{ env.IMAGE_VERSION }}

This configuration defines a simple CI/CD pipeline in GitHub Actions that automates the process of building, testing, and deploying Docker images. The key jobs in the pipeline are the build, test, and deploy jobs. Each job ensures that the Docker images are properly built, tested, and deployed to the correct environment.

With this configuration in place, we can ensure that every change pushed to the main branch will trigger an automated process to build, test, and deploy our application, providing an efficient and streamlined workflow for continuous integration and deployment.
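
Once the deploy job has started the container, a quick smoke test against the running API confirms that everything is wired together. Here is an illustrative snippet using the requests library, again assuming the /chat route accepts JSON with a prompt field as in the earlier main.py sketch:

# smoke_test.py -- quick manual check against the deployed container.
# Assumes the container is running locally on port 5001 and that /chat
# accepts {"prompt": ...} JSON.
import requests

resp = requests.post(
    "http://localhost:5001/chat",
    json={"prompt": "How do I set up Docker for Flask?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])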

Benefits of Automation

Implementing CI/CD automation streamlines the software development lifecycle by eliminating manual, repetitive tasks, reducing the risk of human error, and ensuring consistency across the build, test, and deployment stages. Every code change is automatically validated through predefined workflows, enhancing reliability and catching issues early in the pipeline. This results in faster feedback loops, enabling developers to identify and address problems quickly. Automation also accelerates the release process, allowing teams to deploy updates or fixes more frequently and efficiently. Ultimately, it fosters a more agile, scalable, and error-resilient development environment.

Challenges and Solutions

Managing sensitive data, like the GEMINI_API_KEY, posed a challenge as we needed to ensure its security while still making it accessible to the CI/CD pipeline. To address this, we utilized GitHub’s Secrets feature, which securely stores sensitive information and allows us to reference it in workflows without exposing it in the codebase. This ensured that the API key remained protected while still being available for deployment and testing.

Another challenge was maintaining consistency across test environments. To solve this, we leveraged Docker containers, ensuring that both the application and tests ran in an isolated, consistent environment. This approach eliminates discrepancies between local development setups and the CI/CD pipeline, guaranteeing that the tests produce reliable, reproducible results regardless of where they are executed.

Conclusion

Building the Gemini API and setting up a CI/CD pipeline has been an invaluable learning experience, enabling me to gain practical knowledge in developing and deploying a Flask application. I was able to effectively integrate testing and automation, streamlining the build and deployment process using Docker and GitHub Actions. This not only improved efficiency but also reduced the risk of errors during the deployment phase.

Looking ahead, there are numerous opportunities to enhance the project. I could introduce more comprehensive test cases to ensure better coverage, enhance the chatbot’s capabilities for more dynamic and accurate responses, and integrate advanced features such as user authentication and sophisticated natural language processing techniques.

In conclusion, CI/CD has proven to be an essential component of modern software development. It empowers teams to deliver high-quality features more quickly, with fewer bugs, and with increased reliability. This project has laid a strong foundation for further exploration into DevOps practices, containerization, and automated deployment, and has equipped me with the skills to continue refining and scaling software delivery processes in the future.

Creately's CI/CD pipeline example
Check out Creately's customizable CI/CD pipeline templates.

cover image courtesy: https://shalb.com/blog/what-is-devops-and-where-is-it-applied/
