Kong plugin development with breakpoint debugging
Mehul Sharma
Posted on April 14, 2024
Kong is inarguably one of the most popular open-source API gateways, and chances are, if you're using Kong as part of your work or for your projects, you might end up writing your own plugins.
It's pretty easy to get started with Kong plugin development by following Kong's documentation and their Plugin Development Kit (PDK). However, once you start writing your plugins in Lua (you can also write plugins in Go or JavaScript, but that is likely to have performance implications you might not want to deal with), it gets harder to debug plugin execution without sprinkling print statements everywhere, which is tiresome and time-consuming. You also don't get a chance to look at much of what is happening under the hood.
I faced similar problems while writing plugins for Kong at work. It took some time and diving into a lot of fragmented pieces on the internet, but finally, I was able to set up a system that enabled me to do breakpoint debugging of my Lua code in IntelliJ, and as a bonus, also do breakpoint debugging into Kong source code to gain more insight into what was happening under the hood.
Step 1: Set up Kong locally inside Docker
Kong publishes images of all OSS releases to their Docker Hub registry. If you've worked with Docker before (which you likely would've if you've found this post), you just need to run a few commands to run Kong locally on your machine.
However, to make things easier to manage, I wrote a quick docker-compose.yml file that you can run with docker compose up to start Kong locally, along with a Postgres container that serves as Kong's database and stores all route, service, and plugin configuration.
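Here is a minimal sketch of what that docker-compose.yml might look like. The service names, Postgres credentials, and the kong migrations bootstrap step are assumptions; also, depending on your Docker setup, the Service URL we use later (http://localhost:80) may need to be host.docker.internal instead of localhost.

services:
  kong-database:
    image: postgres:13
    environment:
      POSTGRES_USER: kong
      POSTGRES_DB: kong
      POSTGRES_PASSWORD: kongpass

  kong-migrations:
    build: .
    command: kong migrations bootstrap
    depends_on:
      - kong-database
    environment:
      KONG_DATABASE: postgres
      KONG_PG_HOST: kong-database
      KONG_PG_USER: kong
      KONG_PG_PASSWORD: kongpass

  kong:
    build: .
    depends_on:
      - kong-migrations
    environment:
      KONG_DATABASE: postgres
      KONG_PG_HOST: kong-database
      KONG_PG_USER: kong
      KONG_PG_PASSWORD: kongpass
      KONG_PROXY_LISTEN: "0.0.0.0:8000"
      KONG_ADMIN_LISTEN: "0.0.0.0:8001"
    ports:
      - "8000:8000"   # proxy
      - "8001:8001"   # Admin API
      - "9966:9966"   # used later by the EmmyLua debugger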
If you notice, we're referring to a Dockerfile in our docker-compose.yml. That's because we don't want to run the stock Kong image: we want to add our own plugin to Kong and debug it. So we'll create a new Dockerfile, which for now will just use the Kong-provided image as its base. Here's how our Dockerfile will look:
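For now it can be as small as this; the Kong tag below is an assumption, so pin it to whichever version you want to run:

# Start from the official Kong image; we'll extend this later to add our plugin
FROM kong:3.6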
Now we'll run docker compose up --build to build and run our own Kong image. If everything works out well, you should be able to verify that Kong is running by executing
curl localhost:8001
It should emit a lot of information about Kong if everything is up and running.
Once we have verified that Kong is running successfully, we'll create a new Service and a Route using Kong's Admin API.
We'll also run an httpbin container to make sure that our Kong Service points to a valid target and returns a 200 for our requests.
docker run -p 80:80 kennethreitz/httpbin
To create a Service:
curl --location 'localhost:8001/services' \
--header 'Content-Type: application/json' \
--data '{
"name": "httpbin",
"url" : "http://localhost:80"
}'
To create a Route on the Service:
curl --location 'localhost:8001/services/httpbin/routes' \
--header 'Content-Type: application/json' \
--data '{
"name": "get-route",
"paths": ["/get"],
"strip_path" : false
}'
Once the Route and Service are created, we'll make a proxy request through Kong to make sure it proxies our request to the right Route and Service.
curl --location 'localhost:8000/get'
Step 2: Writing a plugin for Kong
Now that we have Kong running locally, we'll write our first plugin for it.
Kong already has a blog post on writing your first plugin, but it is quite outdated, and it doesn't help you see what your plugin is doing; it only has you make assertions in tests, which I think isn't as thrilling as watching your code work for yourself.
I've used the same plugin that the blog post refers to, but we're going to add it to our running Kong container by modifying the empty Dockerfile that we wrote earlier.
To enable Kong to locate your plugin source code and load it while starting up, you have to copy (or mount) the source code of your plugin inside the Docker container and also add the plugin to the LUA_PATH variable, which is used to load all the modules required by Kong.
We'll create two files, handler.lua and schema.lua, in a particular directory structure and add them to our Docker image (a rough sketch of both files follows the directory listing below).
Kong expects plugins to follow a standard directory structure so that it can load them and their dependencies successfully.
To do that, make sure these two files are created under kong-plugin-add-header/kong/plugins/add-header/, which will make the tree command in the current directory look like this:
.
├── Dockerfile
├── docker-compose.yml
└── kong-plugin-add-header
└── kong
└── plugins
└── add-header
├── handler.lua
└── schema.lua
5 directories, 4 files
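For reference, here's a rough sketch of what the two files might contain. It follows the add-header idea from Kong's blog post, but the handler body and the header name/value are placeholders, not the exact code from that post:

-- kong-plugin-add-header/kong/plugins/add-header/handler.lua
local AddHeaderHandler = {
  PRIORITY = 1000,  -- execution order relative to other plugins
  VERSION = "0.1.0",
}

-- Runs in the access phase for every proxied request
function AddHeaderHandler:access(conf)
  -- Header name and value are placeholders for illustration
  kong.response.set_header("x-add-header", "added-by-kong-plugin")
end

return AddHeaderHandler

-- kong-plugin-add-header/kong/plugins/add-header/schema.lua
local typedefs = require "kong.db.schema.typedefs"

return {
  name = "add-header",
  fields = {
    { protocols = typedefs.protocols_http },
    { config = {
        type = "record",
        fields = {},  -- this simple plugin takes no configuration
      },
    },
  },
}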
Another thing we have to do is tell Kong to enable the plugin. We do this by adding the plugin to the KONG_PLUGINS environment variable of our locally running Kong.
Once we make all these changes, our Dockerfile should look like this:
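Here's a sketch of what it might contain at this point. The /opt path is an assumption; adjust LUA_PATH to wherever you copy the plugin inside the image:

FROM kong:3.6

# Copy the plugin source into the image
COPY kong-plugin-add-header /opt/kong-plugin-add-header

# Let Kong's module loader find the plugin, and enable it alongside the bundled plugins
ENV LUA_PATH="/opt/kong-plugin-add-header/?.lua;/opt/kong-plugin-add-header/?/init.lua;;"
ENV KONG_PLUGINS="bundled,add-header"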
We'll run the docker compose up --build command to make Docker build the Kong image again using our Dockerfile.
Once this is successful, we should be able to add the plugin to the Service we created earlier. To add the plugin to the Service:
curl --location 'localhost:8001/services/httpbin/plugins' \
--header 'Content-Type: application/json' \
--data '{
"name": "add-header",
"enabled": true
}'
To verify that it works, we'll make a request to our service and verify that the response has the header added by the plugin.
curl --location 'localhost:8000/get'
Awesome! We've created and verified our plugin for Kong.
Step 3: Setting up breakpoint debugging
Once we have our plugin code running successfully, we'll set up breakpoint debugging in our JetBrains IDE (GoLand or IntelliJ IDEA, for example).
First of all, make sure you install the EmmyLua plugin for your IDE using the plugin marketplace.
Once EmmyLua is installed, we'll create a debug configuration for Lua. To do this, add a new run/debug configuration in the IDE and select the EmmyLua Debugger (New) option.
Once selected, give your configuration a name like Kong Debug and select the IDE connect debugger option (the IDE will connect to the debugger process, which we will start inside the Kong Docker container).
Also, select the Block the program and wait for the IDE option to make sure that code execution halts until our IDE can connect to the debugger process.
Once done, the IDE will generate a code snippet for us that we will use in our plugin source code to connect to the IDE before continuing execution.
Essentially, what the code snippet does is add a C package to LUA_CPATH and then call the tcpListen and waitIDE methods of that package.
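For reference, the generated snippet looks roughly like this; the exact library path and port depend on your machine and EmmyLua version:

-- The IDE fills in the real path to its bundled debugger library here
package.cpath = package.cpath .. ";/path/to/emmy/?.so"
local dbg = require("emmy_core")
dbg.tcpListen("localhost", 9966)  -- the IDE connects to this port (9966 is the default)
dbg.waitIDE()                     -- block until the IDE attaches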
The package path that we see is the path where EmmyLua Debugger is installed on our machine, but since we are running Kong inside a Docker container, we need to provide the path where the EmmyLua debugger package is installed inside our container.
Before we do that, we will also have to modify our Kong Dockerfile to ensure that it also installs the EmmyLua debugger in our Kong image.
To do that, we have to add the following snippet to the Dockerfile. We are also installing CMake here because building the EmmyLua debugger does not work without it; this post by @omervk talks about it in more detail.
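Here's a sketch of what that addition might look like. It assumes a Debian/Ubuntu-based Kong image (use apk instead of apt-get on Alpine), and the /usr/local/emmy install path is an assumption we'll reuse in the plugin code:

USER root
RUN apt-get update \
    && apt-get install -y git cmake build-essential \
    && git clone https://github.com/EmmyLua/EmmyLuaDebugger.git /tmp/EmmyLuaDebugger \
    && cmake -S /tmp/EmmyLuaDebugger -B /tmp/EmmyLuaDebugger/build -DCMAKE_BUILD_TYPE=Release \
    && cmake --build /tmp/EmmyLuaDebugger/build \
    && mkdir -p /usr/local/emmy \
    && find /tmp/EmmyLuaDebugger/build -name 'emmy_core.so' -exec cp {} /usr/local/emmy/ \; \
    && rm -rf /tmp/EmmyLuaDebugger
USER kong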
Once we modify the Dockerfile to install the EmmyLua debugger, we can point the code snippet at the debugger installed inside the container and add it to the entry point of our plugin source code, which is the access method. With that, our Dockerfile now builds the debugger alongside Kong, and your access method should look like this:
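Here's a sketch, assuming the debugger was copied to /usr/local/emmy in the Dockerfile step above; 9966 is EmmyLua's default port, and it has to be published from the container (for example 9966:9966 in docker-compose.yml) so the IDE on the host can reach it:

function AddHeaderHandler:access(conf)
  -- Point Lua at the emmy_core library built into the image (path is an assumption)
  package.cpath = package.cpath .. ";/usr/local/emmy/?.so"
  local dbg = require("emmy_core")
  dbg.tcpListen("0.0.0.0", 9966)  -- listen on all interfaces so the IDE on the host can connect
  dbg.waitIDE()                   -- halt here until the IDE attaches

  -- the plugin's actual work
  kong.response.set_header("x-add-header", "added-by-kong-plugin")
end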
Once we do that, we'll build Kong again using docker compose up --build, add a breakpoint in the IDE on dbg.waitIDE(), and make a request.
curl --location 'localhost:8000/get'
We see that code execution waits for the IDE to attach, but it does not stop at our breakpoint.
Once we hit Next in the debugger, execution completes without ever stopping at the breakpoint.
This happens because the IDE is unable to find the source file that Kong is executing: Kong refers to the file's path inside the Docker container. To work around this, we will modify the fixPath function that the EmmyLua debugger has (again, thanks to @omervk for finding this).
We rewrite the path prefix from the one inside the Docker image to the local path where our source code lives. Once you do this, your modified code snippet should look something like the sketch below.
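Illustratively, the remapping itself might look like the following; how fixPath is wired into the debugger is covered in @omervk's post, and both paths here are assumptions (container path on the left, local checkout on the right):

-- Remap the path Kong sees inside the container to the matching file on the host,
-- so the IDE can resolve breakpoints against the local source tree
local function fixPath(path)
  return path:gsub("/opt/kong%-plugin%-add%-header",
                   "/Users/you/projects/kong-plugin-add-header")
end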
We'll build and run the Kong image again using docker compose up --build and make a request.
Hopefully, if you've followed the steps correctly and nothing unexpected has happened, your code should stop at the breakpoint, and you should be able to inspect all the variables available at runtime.
If you face any issues with the setup, please add a comment below, and I'll try to help you solve it.
References that helped me figure this out:
- https://dev.to/omervk/debugging-lua-inside-openresty-inside-docker-with-intellij-idea-2h95
- https://lua-programming.blogspot.com/2015/12/how-to-debug-kong-plugins-on-windows.html
- https://konghq.com/blog/engineering/custom-lua-plugin-kong-gateway
- https://docs.konghq.com/gateway/latest/plugin-development/distribution/
- https://notebook.kulchenko.com/zerobrane/debugging-openresty-nginx-lua-scripts-with-zerobrane-studio
- https://github.com/mercedes-benz/debug-monkey