2 Clouds 1 Pipeline

Liam Conroy Hampton

Posted on February 5, 2021

You've probably heard of CI/CD or DevOps. Maybe you have read my Toolchain article or listened to the Tech Jam podcast I co-host.

Wherever you have heard it, it is a hot topic and there are many elements to it. In this blog I will show you how I deploy two microservices into two different clouds. However, there is a plot twist: I am only using one pipeline. This is called multi-cloud deployment, a method with many benefits, from cost savings to security and reliability. It is becoming a more popular approach to cloud deployment and this article just scratches the surface.

The 3 takeaways from this article:

  1. Learn something new about CI/CD
  2. Learn how to write a Tekton pipeline
  3. Understand how easy it is to deploy into more than one cloud

For context, I have written a client app in Golang and a server app in Node.js. The Go app will submit/POST a piece of data through a form input box and the Node app will receive this and display it. Pretty simple but it does the job.
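
To give a feel for how the two apps talk to each other once deployed, here is a rough sketch of the kind of request the Go client ends up making to the Node server. The endpoint path and form field name are hypothetical placeholders, not taken from my repositories, so swap them for whatever your own apps use.

# Hypothetical example: simulate the Go client's form submission to the Node app.
# The /submit path and "message" field are placeholders - check your own code.
curl -X POST \
  -d "message=hello from the Go client" \
  https://<your-node-app-route>/submit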

If you also want to do this, here are some pre-reqs:

  1. Node.js server app
  2. Golang client app

Building a Tekton pipeline

Tekton is purpose-built for Knative applications. It handles containers and deployment into Knative environments seamlessly. Whereas you would previously have a Jenkinsfile or .travis.yml in the root of your project, you don't necessarily need this with Tekton (at least for this demo anyway).

A Tekton pipeline provides new resources through Custom Resource Definitions (CRDs). These are:

  • Pipeline
  • Task
  • TaskRuns
  • PipelineRuns
  • PipelineResources

A Tekton pipeline can consist of many tasks, and each task can contain many steps. This will become clearer as we go along.

The most fundamental thing to remember is that the pipeline is decoupled and its building blocks are isolated containers.

Step 1 - Set up IBM Cloud and DigitalOcean accounts and acquire access keys

Just follow these links to sign up (they are free)

IBM token creation

  1. Sign into the IBM Cloud Management Console
  2. Manage -> Access (IAM) -> API keys -> Create an IBM Cloud API key

DigitalOcean token creation

  1. Sign into the DigitalOcean Management Console
  2. Select API from the list on the left
  3. Select the Token/Keys tab
  4. Click Generate New Token

Keep both of these API keys safe and ensure you know which is which. You will not be able to see them again.
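
Before wiring these keys into a pipeline, it is worth checking they actually work from your own machine. Here is a minimal sketch, assuming you have the ibmcloud and doctl CLIs installed locally (the key values below are placeholders):

# Keep the keys in environment variables rather than pasting them into commands
export IBMCLOUD_API_KEY="<your-ibm-cloud-api-key>"
export DO_TOKEN="<your-digitalocean-token>"

# Check the IBM Cloud key by logging in with it
ibmcloud login -a cloud.ibm.com -r eu-gb --apikey "$IBMCLOUD_API_KEY"

# Check the DigitalOcean token by initialising doctl and reading the account
doctl auth init -t "$DO_TOKEN"
doctl account get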

Step 2 - Write the pipeline yaml

File 1 - deployment-pipeline.yaml

In this file you can see the main flow of the pipeline. It will first execute pipeline-ibm-cloud-install-task and then execute pipeline-do-cloud-install-task.

apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: deployment-pipeline
spec:
  params:
    - name: ibmcloudapikey
      description: The IBM Cloud API Key
    - name: doauth
      description: The DigitalOcean API Key
    - name: file
      description: The node app repository contents to download
    - name: uuid
      description: A unique identifier
  tasks:
    - name: pipeline-ibm-cloud-install-task
      taskRef:
        name: ibm-cloud-install-task
      params:
        - name: ibmcloudapikey
          value: $(params.ibmcloudapikey)
        - name: file
          value: $(params.file)
        - name: uuid
          value: $(params.uuid)
    - name: pipeline-do-cloud-install-task
      runAfter:
        - pipeline-ibm-cloud-install-task
      taskRef:
        name: do-cloud-install-task
      params:
        - name: doauth
          value: $(params.doauth)
        - name: uuid
          value: $(params.uuid)

File 2 - ibm-cloud-install-task.yaml

This is a Task with a single step. The image declared is ibmcom/ibm-cloud-developer-tools-amd64:1.2.3, which has the ibmcloud CLI pre-installed and set up. The step first downloads a zip of my project from GitHub, then uses the ibmcloud CLI to authenticate with my IBM Cloud account and push my project up to CloudFoundry.

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: ibm-cloud-install-task
spec:
  params:
    - name: ibmcloudapikey
      type: string
    - name: file
      type: string
    - name: uuid
      type: string
  steps:
    - name: login-and-deploy-function
      image: ibmcom/ibm-cloud-developer-tools-amd64:1.2.3
      script: |
        #!/usr/bin/env sh
        apk add zip
        mkdir -p ./application
        wget $(params.file) -O nodeapp.zip
        unzip ./nodeapp.zip -d ./application
        rm nodeapp.zip
        echo $(params.uuid)
        cd application/microservice-node-demo-main

        ibmcloud login -a cloud.ibm.com -r eu-gb -g Default -apikey $(params.ibmcloudapikey)
        ibmcloud cf install
        ibmcloud target --cf
        ibmcloud cf push liams-node-app -m 128M
        echo PUSHED TO IBM CLOUD CF

Be sure to change the name of your app on the ibmcloud cf push line.

File 3 - do-cloud-install-task.yaml

This Task uses a vanilla Alpine image, installs the DigitalOcean CLI (doctl), creates an app spec (the deployment .yaml file DigitalOcean expects) and then finally deploys the app. The app spec consists of the constraints I wish to impose on my application upon deployment. As you can see, I have hard-coded an environment variable (APPROUTE) for my application to use to connect to my other app deployed in IBM Cloud. This could be substituted with a parameter, but this is just a basic demo.

In the code below, make sure you copy your own values into the bits that need changing (repo_clone_url and the APPROUTE value) and tailor it to your application's needs. See the DigitalOcean App Spec reference for the full set of options.

If you are using my repositories just sub in the values.

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: do-cloud-install-task
spec:
  params:
      - name: doauth
        type: string
      - name: uuid
        type: string
  steps:
    - name: login-and-deploy-function
      image: alpine:3.13.0
      script: |
        #!/usr/bin/env sh
        echo pipeline-uuid-$(params.uuid)
        apk add wget
        wget https://github.com/digitalocean/doctl/releases/download/v1.54.0/doctl-1.54.0-linux-amd64.tar.gz
        tar xf doctl-1.54.0-linux-amd64.tar.gz
        mv doctl /usr/local/bin
        mkdir -p ./appspec
        cd appspec/
        touch client-go-app.yaml
        ls -la
        cat >> client-go-app.yaml <<EOF
        name: client-go-app
        services:
          - name: go-app
            git:
              repo_clone_url: <your-github-project-url-to-deploy-into-digital-ocean>
              branch: do-deploy
            build_command: go build -o goapp
            run_command: ./goapp
            envs:
              - key: APPROUTE
                value: https://<your-ibmcloud-app-name>.eu-gb.mybluemix.net
            instance_count: 1
        EOF
        pwd
        doctl auth init -t $(params.doauth)
        doctl apps create --spec client-go-app.yaml
        echo App deployed to DigitalOcean

As it stands this will not work, and you'll find out why shortly.

Once these files have been created, create a new repository in GitHub and push these files into it.
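
If you are doing this from the command line, the flow looks something like the following. This is just a sketch assuming a brand new, empty repository with main as its default branch; the repository URL is a placeholder.

# From the directory containing the three pipeline .yaml files
git init
git add deployment-pipeline.yaml ibm-cloud-install-task.yaml do-cloud-install-task.yaml
git commit -m "Add Tekton pipeline and tasks"
git branch -M main
git remote add origin https://github.com/<your-username>/<your-pipeline-repo>.git
git push -u origin main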

Step 3 - Build your Toolchain in IBM Cloud

  • Login to your IBM Cloud account
  • Navigate to the DevOps section in IBM Cloud using the navigation menu on the left (it's about halfway down the list)
  • Select your region (I chose London) and then click on Create toolchain
  • Scroll down to the bottom and select "Build your own toolchain"
  • Give it a name and leave the region and resource group as the defaults

You have now created a foundation to build on. At this point your toolchain will be empty, so let's fill it.

  • In the top right, click on the button Add tool and search for GitHub. This will be used to access the microservices in GitHub or an equivalent server.

Fill out the details:

  1. Server - Where the project lives (default is GitHub).
  2. Repository type - Change this to Existing since your project already exists.
  3. Repository URL - This is the HTTPS project URL (not the .git URL).
  4. Integration Owner - This is your GitHub username. You may be prompted to allow Toolchain access to your GitHub profile. Once connected, it will auto-populate.

Ignore the checkboxes Enable GitHub Issues and Track deployment of code changes as we do not need them for this demo.

Repeat these steps and add a new GitHub tool in the Toolchain for each GitHub repository: one for each microservice and one for the Tekton pipeline repository.

Once you have done this, add one final tool. That's right, the FINAL tool! Click Add tool and search for Delivery Pipeline. This is the workhorse of this Toolchain.

Select it and proceed to fill out the following:

  1. Pipeline name - Give it a unique name
  2. Pipeline type - Here you have two options: Classic, which provides a graphical interface and allows you to deploy your applications into CloudFoundry, and Tekton, a cloud-native CI/CD pipeline tool. Select Tekton from the list.

Now click on Create Integration.

If you get an error box saying "Continuous Delivery service required", don't panic. In the main IBM Cloud search bar, search for continuous delivery and select "Continuous Delivery" - you only need to do this once!
It is very simple to set up this service. Make sure the region is set to the same as your toolchain (mine is London), give it a sensible name such as "toolchain delivery service" and click Create. Now head back to your Toolchain and the error should be gone.

It should now look like this:

[Image: toolchain tools]

Step 4 - Configure the Tekton pipeline

Click on the Delivery Pipeline tool and this will take you to the Tekton Pipeline Configuration dashboard.

On the left side of the screen you will see a list. These are as follows:

Definitions - This specifies the pipeline code to be run upon a trigger. We created 3 Tekton .yaml files in Step 2. Click on Add and you will be given a popout. For the Repository, select the Tekton pipeline repository that contains the .yaml files we created earlier. Choose the main branch in the next box and leave Path blank if you have left the files in the root directory of the project repository. Click Add. Once the code has loaded in, click on Validate, wait for this to finish and then click Save.

Worker - This is essentially where your pipeline will run. You can have dedicated private workers or ones from a shared pool of resources. We will just use a shared resource, so select IBM Managed workers (Tekton Pipelines vx.xx.x) in LONDON and click Save.

Triggers - These specify what happens when an event occurs. More often than not, they will be event triggers from a webhook on GitHub, such as a pull request, a new commit or a merge. However, in this blog we will be using a manual trigger. We'll set this up shortly.

Environment properties - Key:Value pairs that can be used from within your pipeline. These are referenced from within the .yaml files (for example via $(event.uuid) in the trigger binding). In here we need to add 4 environment properties for this pipeline to run correctly, and they are:

Key             Type    Value
doauth          Secure  DigitalOcean API key
file            Text    .zip URL of one project repository (mine is my Node project)
ibmcloudapikey  Secure  IBM Cloud API key
uuid            Text    Unique identifier, e.g. 1234

.zip URL = repository URL + /archive/main.zip, for example: https://github.com/liamchampton/microservice-node-demo/archive/main.zip
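
You can sanity-check the value you give the file property before the pipeline ever runs. A quick sketch, assuming the repository is public:

# Download the archive exactly as the IBM Cloud task will
wget https://github.com/<your-username>/<your-node-repo>/archive/main.zip -O nodeapp.zip

# List the contents without extracting - the top-level folder name
# (e.g. <your-node-repo>-main) is the directory the task cd's into
unzip -l nodeapp.zip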

Other settings - Use these settings for more granular control over builds. For this demo we will not be touching these.

Now everything has been set up. The pipeline code has been imported, the workers have been set and the environment properties have been set. Seems like it is good to go... right? Try and click on Run Pipeline in the top right of your screen.

PSYCH! Gotcha! This will not work because we have not set up a trigger. The pipeline doesn't know what to do when that button is clicked, so we need to tell it.

Like I mentioned, it would usually be a trigger from an event that happens on a repository, such as a pull request or merge, but we will create a manual trigger.

We need to revisit our pipeline code repository and add some more files. I have named these hello-xxx.yaml to give a sense I am invoking a new conversation with the pipeline, but you can call them whatever you like, just make sure to change the corresponding code.

File 4 - hello-listener.yaml

apiVersion: tekton.dev/v1beta1
kind: EventListener
metadata:
  name: hello-listener
spec:
  triggers:
    - binding:
        name: hello-trigger-binding
      template:
        name: hello-trigger-template

File 5 - hello-trigger-binding.yaml

apiVersion: tekton.dev/v1beta1
kind: TriggerBinding
metadata:
  name: hello-trigger-binding
spec:
  params:
    - name: ibmcloudapikey
      value: $(event.ibmcloudapikey)
    - name: doauth
      value: $(event.doauth)
    - name: file
      value: $(event.file)
    - name: uuid
      value: $(event.uuid)

File 6 - hello-trigger-template.yaml

apiVersion: tekton.dev/v1beta1
kind: TriggerTemplate
metadata:
  name: hello-trigger-template
spec:
  params:
    - name: ibmcloudapikey
      description: The IBM Cloud API Key
    - name: doauth
      description: The DigitalOcean API Key
    - name: file
      description: The node app repository to be downloaded
    - name: uuid
      description: A unique identifier
  resourcetemplates:
    - apiVersion: tekton.dev/v1beta1
      kind: PipelineRun
      metadata:
        name: pipelinerun-$(params.uuid)
      spec:
        pipelineRef:
          name: deployment-pipeline
        params:
          - name: ibmcloudapikey
            value: $(params.ibmcloudapikey)
          - name: doauth
            value: $(params.doauth)
          - name: file
            value: $(params.file)
          - name: uuid
            value: $(params.uuid)

Push these new files into your main branch on GitHub and then, inside your pipeline dashboard in IBM Cloud, go to Definitions -> click the 3 dots on the repository already imported -> Edit -> click on the Update button -> Validate -> Save.

Head to the Triggers page and then click Add trigger. Select Manual. You should only have one EventListener to choose from and it should be called hello-listener. In the Worker box, select the option Inherit from Pipeline Configuration as this will use the shared pool of workers, just like we set up earlier. Click Save.

Wahoo! You have finished setting up your Tekton pipeline!

Now all that's left to do is run it... click Run Pipeline, enter a uuid value for the pipeline run and click Run. Watch the magic happen 🎉
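
Once the PipelineRun goes green, you can confirm both deployments from your terminal. A quick check, assuming you are still logged in with the CLIs from Step 1:

# The Node app should show up in IBM Cloud Foundry
ibmcloud target --cf
ibmcloud cf apps

# The Go app should show up in DigitalOcean App Platform
doctl apps list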

My challenge for you

Create a new Task to clean up each deployment at the end of the pipeline run and let me know how you got on 😄
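
As a hint, the commands a cleanup Task step might wrap look something like the sketch below. This assumes the app names used earlier in this article; the doctl flags are to the best of my knowledge, so double-check them against the current docs. Note that doctl deletes apps by ID rather than by name, so the ID is looked up first.

# Remove the Node app from IBM Cloud Foundry (log in and target CF first, as in the deploy task)
ibmcloud cf delete liams-node-app -f

# Look up the DigitalOcean app ID by name, then delete the app
APP_ID=$(doctl apps list --format ID,Spec.Name --no-header | grep client-go-app | awk '{print $1}')
doctl apps delete "$APP_ID" --force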

If you have any questions or want to see more content like this, drop me a line!

GitHub
Twitter
Instagram
LinkedIn
