Julian
Posted on October 27, 2019
Please note ❤️
I created the solution I did and wrote what I wrote because there was a problem I was trying to solve within a specific set of requirements, or "boundaries", let's say.
Using this solution requires SonarQube's "analysis mode", which has been deprecated since SonarQube 6.6. Once this solution is implemented you can't easily upgrade without breaking the pull request analysis.
Please use this solution at your own discretion 🔥
I'll start by saying that the solution was way easier than I originally anticipated... I thought I was going to have to run a Docker container with a microservice of some kind, written in Java, Node.js or Go, that would run the Sonar Scanner whenever a pull request was created or updated, based on a webhook from Bitbucket.
But... As always, the eventual solution was so much simpler. Then again, what's the fun in writing a blog post if you don't write about the ways you failed miserably before succeeding?
Where it all started, a long... long time ago
From the very moment I met her, SonarQube, I knew I was in love. Knowing that it had a tiny, little, cute container that you could easily run with a single command, with an embedded H2 database for convenience and even an embedded instance of Elasticsearch, I couldn't be more intrigued 😍
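For context, running that cute little container really is a one-liner; a minimal sketch using the official Docker image with its default embedded H2 database (fine for experimenting, not something you'd want in production):

# Run SonarQube locally on port 9000, using the embedded H2 database
docker run -d --name sonarqube -p 9000:9000 sonarqube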
I looked into a few other static analysis tools, such as Code Climate, SonarCloud and Exakat, but they were either priced based on the size of your organization (Code Climate) or the size of your projects (SonarCloud charges per lines of code), which might have caused scaling issues in the future, weren't as easy to integrate with existing tools such as Bitbucket Cloud, or, in the case of Exakat, didn't really have a friendly UI for the developer using it.
Integrating SonarQube with Bitbucket Cloud ☁️
Mibex Software has a wonderful SonarQube plugin, Sonar Bitbucket Cloud Plugin, which analyzes and comments on your pull requests.
I decided to give it a go: I cloned the repository, built it with Maven so that I had a beautiful JAR to rsync to my Droplet on DigitalOcean, and installed it on my SonarQube instance.
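For those curious, the whole dance boiled down to something along these lines; a rough sketch in which the repository placeholder, JAR name and paths are illustrative rather than my exact setup:

# Build the plugin JAR and ship it to the SonarQube server (repository URL, JAR name and paths are illustrative)
git clone <sonar-bitbucket-cloud-plugin-repository> && cd sonar-bitbucket-plugin
mvn clean package
rsync -avz target/*.jar <user>@<droplet-host>:/opt/sonarqube/extensions/plugins/
# Restart SonarQube afterwards so it picks up the newly installed plugin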
I installed the Sonar for Bitbucket Cloud plugin through the Bitbucket Marketplace and expected it to have a built-in webhook that would inform the SonarQube plugin whenever a pull request was created or updated, run the analysis, and comment on the pull request with its findings.
But it wasn't that easy. It turns out that if I had read the documentation fully, from top to bottom, I would have known that I had to run the analysis using Maven, or, as I later found out, the Sonar Scanner, to trigger an analysis on SonarQube.
It was fun for the time being, but... We'll stay friends! ❤️
Sonar for Bitbucket Cloud really only seemed to add the statistics to the overview page of your Bitbucket repository; it didn't seem to contribute to commenting on your pull requests with the issues that were found.
So... Before running it in automation, let's try it manually? 🔧🔨
It turns out that the Sonar Bitbucket Cloud plugin I installed on my SonarQube instance also works when I simply run the Sonar Scanner with a few more arguments than I'd normally pass to it.
I hear you thinking, "uhm... but... wasn't that obvious?!", and I know that now. But in the moment itself I wanted to pull out the hairs that I still have at this age, out of sheer frustration.
Originally I had read this article (Decorate Bitbucket Cloud Pull Request with SonarQube Server Comments) to get familiar with the setup, which is why I was confused by the fact that the author was using a Maven goal to run the scanner.
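For completeness, the Maven route from that article essentially boils down to the sonar:sonar goal with the same kind of properties; a hedged sketch, assuming the SonarQube Scanner for Maven is available to your build:

mvn sonar:sonar \
  -Dsonar.host.url=<sonar-host-url> \
  -Dsonar.analysis.mode=issues \
  -Dsonar.login=<sonar-token>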
When I ran the Sonar Scanner manually, the pull request was analyzed and the Sonar Bitbucket Cloud plugin successfully commented on it with the issues that were found, in this case none.
sonar-scanner -Dsonar.projectBaseDir=$(pwd) \
  -Dproject.settings=sonar.properties \
  -Dsonar.analysis.mode=issues \
  -Dsonar.bitbucket.repoSlug=<repoSlug> \
  -Dsonar.bitbucket.accountName=<accountName> \
  -Dsonar.bitbucket.branchName=<branch> \
  -Dsonar.bitbucket.oauthClientKey=<oauth-client-key> \
  -Dsonar.bitbucket.oauthClientSecret=<oauth-client-secret> \
  -Dsonar.login=<sonar-token> \
  -Dsonar.bitbucket.pullRequestId=<pull-request-id>
Should we integrate this with our CI/CD? Of course! 🌹
Once I knew that the Sonar Scanner worked, I wanted to integrate it with our CI/CD to automate the analysis of both our integration branch and our pull requests.
The article I mentioned earlier brought up our beloved Jenkins, as well as some kind of microservice written in Java that was meant to trigger an analysis on SonarQube whenever a pull request was created or updated, based on a Bitbucket webhook.
So... I tried writing a microservice myself using Node.js and Express.js to trigger an analysis based on a pull request creation or update event, but I quickly found out that Bitbucket has support for so-called "Pull Request Pipelines".
I don't like over-engineering, but... It's my first instinct sometimes, and I fear that it's a curse... 💀👀
Using the blessing which are "Pull Request Pipelines" 🙈💘
When I figured out how the pull request pipelines worked, I was (still am) in love. What I ended up with was the pipeline configuration below 😍
pipelines:
  pull-requests:
    '**':
      - step:
          name: SonarQube Analysis
          image: newtmitch/sonar-scanner:4.0.0-alpine
          script:
            - sonar-scanner -Dsonar.projectBaseDir=$(pwd)
              -Dproject.settings=sonar.properties
              -Dsonar.analysis.mode=issues
              -Dsonar.bitbucket.repoSlug=$BITBUCKET_REPO_SLUG
              -Dsonar.bitbucket.accountName=$BITBUCKET_REPO_OWNER
              -Dsonar.bitbucket.branchName=$BITBUCKET_BRANCH
              -Dsonar.bitbucket.oauthClientKey=$OAUTH_CLIENT_KEY
              -Dsonar.bitbucket.oauthClientSecret=$OAUTH_CLIENT_SECRET
              -Dsonar.login=$SONAR_LOGIN
              -Dsonar.bitbucket.pullRequestId=$BITBUCKET_PR_ID
  branches:
    master:
      - step:
          name: SonarQube Analysis
          image: newtmitch/sonar-scanner:4.0.0-alpine
          script:
            - sonar-scanner -Dsonar.projectBaseDir=$(pwd)
              -Dproject.settings=sonar.properties
              -Dsonar.login=$SONAR_LOGIN
Bitbucket has a bunch of pre-defined environment variables that you can use in these kinds of situations. The environment variables that you need to define yourself are:
- SONAR_LOGIN, which is a SonarQube user token.
- OAUTH_CLIENT_KEY and OAUTH_CLIENT_SECRET, which require an OAuth consumer to be configured with read access to the account and write access to pull requests. For some reason a "Callback URL" is required, otherwise you can't create the OAuth consumer.
Telling SonarQube what project the repository is for! 📃📌
As you might have noticed, the command that I use to run the Sonar Scanner includes the argument -Dproject.settings=sonar.properties. It tells the scanner where to find the properties for this specific project, in this case the sonar.properties file at the root of the repository.
sonar.host.url=https://sonarqube.dev
sonar.projectKey=sonarqube-experimentation
sonar.projectName=SonarQube Experimentation
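Depending on the project you'll probably want a few more properties in there, for example to tell the scanner which directories to analyze and which to skip; the values below are purely illustrative and not part of my actual configuration:

sonar.sources=src
sonar.exclusions=**/node_modules/**,**/vendor/**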
What are my next steps going to be in this journey? 🐒
What I noticed while I was researching, trying stuff out and failing miserably is that I had many individual resources, but no way to consistently reproduce them.
What would happen if something went wrong, if I corrupted one of my DigitalOcean Droplets running Docker with SonarQube, for example? I couldn't easily spin up another one and be certain that it would be exactly the same, because I hadn't really documented the steps I took along the way.
Because of this I decided to become more familiar with the idea of "Infrastructure as Code", or "IaC" for short!
Terraform kinda fell into my lap at this point 👀💯
Terraform is one of the tools in the HashiStack by HashiCorp. It enables you to describe your infrastructure in declarative configuration, hence "Infrastructure as Code".
I plan to write a separate blog post about it, in which I'll reflect on the stuff I learned, the mistakes I made and the times I fell face-first into seemingly unsolvable problems!
As always, I hope you learned something from my failures (💩), and if not, I'm praying that you at least had fun reading about them 🔥