Martin Heinz
Posted on October 23, 2019
Note: This was originally posted at martinheinz.dev
In the previous post we talked about how to create the "Ultimate" setup for your Golang project; now it's time to apply it to something real - RESTful APIs. This post will cover the database, unit testing, API testing, an example application, and essentially everything you need for a real-world project. So, let's jump right into it.
This article is a follow-up to the previous article in this series, Ultimate Setup for Your Next Golang Project, so if you haven't read that one yet, you might want to check it out first.
TL;DR: Here is my repository (rest-api branch) - https://github.com/MartinHeinz/go-project-blueprint/tree/rest-api
Frameworks and Libraries
First of all, what are we gonna use to build it?
- Gin - we will use the Gin HTTP web framework. It is a high-performance framework built on top of net/http that delivers the most essential features, libraries, and functionalities. It also has quite a clean and comprehensive API (a minimal example follows this list).
- GORM - a Golang ORM library built on top of database/sql. It includes all the fancy features like preloading, callbacks, transactions, etc. It has a bit of a learning curve and not-so-great documentation, so if you are the kind of person who prefers to write raw SQL, you can just go with sqlx.
- Viper - a Go configuration library which can handle various formats, command-line flags, environment variables, etc. Setup and usage for this library were already explained in the previous post, so I won't go into detail here.
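If you haven't used Gin before, here is a minimal standalone "hello world" just to show the style of its API - a generic example, not code from the blueprint repository:

package main

import "github.com/gin-gonic/gin"

func main() {
    r := gin.Default() // Router with logger and recovery middleware already attached
    r.GET("/ping", func(c *gin.Context) {
        c.JSON(200, gin.H{"message": "pong"}) // Respond with a small JSON payload
    })
    r.Run(":8080") // Listen and serve on 0.0.0.0:8080
}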
Project and Package Structure
Next, let's go over the individual packages in the project. Apart from main, there are the following packages, each with a single responsibility. Let's go from the database stuff, through queries, all the way to the API endpoints:
Models
The models package (models) has a single file, which defines types that reflect the structure of the database tables. In the example in the repository there are two struct types - Model and User:
type Model struct {
    ID        uint       `gorm:"primary_key;column:id" json:"id"`
    CreatedAt time.Time  `gorm:"column:created_at" json:"created_at"`
    UpdatedAt time.Time  `gorm:"column:updated_at" json:"updated_at"`
    DeletedAt *time.Time `gorm:"column:deleted_at" json:"deleted_at"`
}

type User struct {
    Model
    FirstName string `gorm:"column:first_name" json:"first_name"`
    LastName  string `gorm:"column:last_name" json:"last_name"`
    Address   string `gorm:"column:address" json:"address"`
    Email     string `gorm:"column:email" json:"email"`
}
Model is the same type as gorm.Model, with the addition of JSON tags to make it easier to generate JSON responses that include its fields. Second, User describes a simple application user, with GORM tags that specify which column each field should be mapped to. There are plenty more tags for things like indexes, types or associations and you can explore more of them here.
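As an illustration - not part of the model above - a type override and a unique index could be declared right in the tag, roughly like this:

// Hypothetical variation of the Email field with extra GORM tags:
Email string `gorm:"column:email;type:varchar(100);unique_index" json:"email"`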
Data Access Objects
Next up is the daos package, which is short for Data Access Objects (DAOs). A DAO is an object responsible for accessing data (surprise, surprise...), which essentially means that it makes SQL queries, either through GORM or as raw SQL. In the example below we have a simple function which retrieves user data based on an ID and returns it in the form of the User model mentioned before, together with an error, if there is any:
func (dao *UserDAO) Get(id uint) (*models.User, error) {
    var user models.User
    err := config.Config.DB.Where("id = ?", id). // Build the query
        First(&user).                            // Take the first matching record
        Error                                    // Retrieve the error, if any (nil otherwise)
    return &user, err
}
You can separate DAOs based on the table they access, on related business logic, or on whatever other metric you want - just don't lump them all together, otherwise it will become a mess.
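Other DAO methods follow the same pattern; for example, a Create method could look roughly like this (a sketch, not taken from the repository):

// Create inserts a new user record using GORM and returns the resulting error, if any (sketch).
func (dao *UserDAO) Create(user *models.User) error {
    return config.Config.DB.Create(user).Error
}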
Services
When we have the data nicely loaded into our models, we can perform additional logic to process the data before we serve it - that's where services come into play. This extra logic can be, for example, filtering, aggregating, modifying the structure or validating the data. On top of that, it allows us to separate database queries from business logic, which makes the code much cleaner, easier to maintain and, most importantly (for me), easier to test (more on that later). So, let's look at the code:
type userDAO interface {
    Get(id uint) (*models.User, error)
}

type UserService struct {
    dao userDAO
}

// NewUserService creates a new UserService with the given user DAO.
func NewUserService(dao userDAO) *UserService {
    return &UserService{dao}
}

// Get just retrieves a user using the user DAO; additional logic for processing
// the data retrieved by the DAO could live here.
func (s *UserService) Get(id uint) (*models.User, error) {
    return s.dao.Get(id) // No additional logic, just return the query result
}
In the code above we first define an interface that groups all the previously created DAO functions - in this case just Get(id uint) from the previous section. Next, we define the user service, which contains our DAO, together with a function that creates the service from a DAO supplied as a parameter. Finally, we define a function that can perform some additional logic and use the DAO from UserService; here, however, to keep it simple, we just use the DAO to query the database for the user and return it. An example of logic that could be performed here would be validation of the model or checking for errors - a sketch of that follows below.
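Just to make the "additional logic" part more concrete, here is a hedged sketch of an alternative Get that does a trivial sanity check on the loaded model before returning it - purely illustrative, not code from the repository (it uses the standard errors package):

// Get retrieves a user via the DAO and performs a simple sanity check on the result (sketch).
func (s *UserService) Get(id uint) (*models.User, error) {
    user, err := s.dao.Get(id)
    if err != nil {
        return nil, err
    }
    if user.Email == "" { // Illustrative validation step
        return nil, errors.New("user record is missing an email address")
    }
    return user, nil
}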
APIs
Finally, with the services ready to give us processed and valid data, we can serve it to our users. So, let's see the code:
func GetUser(c *gin.Context) {
    s := services.NewUserService(daos.NewUserDAO())   // Create service
    id, _ := strconv.ParseUint(c.Param("id"), 10, 32) // Parse ID from URL
    if user, err := s.Get(uint(id)); err != nil {     // Try to get user from database
        c.AbortWithStatus(http.StatusNotFound)        // Abort if not found
        log.Println(err)
    } else {
        c.JSON(http.StatusOK, user) // Send back data
    }
}
In this snippet we can see a function that can be used to serve an API endpoint. Here, we first create the service with the supplied user DAO, which was described earlier. Next, we parse the ID, which we expect to be in the URL (something like /users/{id}), then we use the service to get the user data from the database and finally, if the data is found, we return it in JSON format with a 200 status code.
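One small caveat: the snippet above ignores the error from strconv.ParseUint, so a non-numeric ID is parsed as 0 and ends up as a 404. If you would rather respond with 400 Bad Request to malformed IDs, a variation along these lines would do it (my sketch, not code from the repository):

func GetUser(c *gin.Context) {
    s := services.NewUserService(daos.NewUserDAO())
    id, err := strconv.ParseUint(c.Param("id"), 10, 32)
    if err != nil {
        c.AbortWithStatus(http.StatusBadRequest) // ID in the URL is not a valid number
        return
    }
    user, err := s.Get(uint(id))
    if err != nil {
        log.Println(err)
        c.AbortWithStatus(http.StatusNotFound) // User not found in the database
        return
    }
    c.JSON(http.StatusOK, user) // Send back data
}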
Wiring It All Together
The things shown in the previous sections are nice and all, but now we actually need to set it all up in main, so that Gin knows where to serve our APIs:
r := gin.New()
r.Use(gin.Logger())
r.Use(gin.Recovery())

v1 := r.Group("/api/v1")
{
    v1.GET("/users/:id", apis.GetUser)
}

r.Run(fmt.Sprintf(":%v", config.Config.ServerPort))
We obviously first need to create an instance of Gin, and after that we attach middleware to it, e.g. a logger or CORS. For the final part - the most important one - we create a group of endpoints which will all start with /api/v1/ and register our GetUser function from the previous section to be served at /api/v1/users followed by the user ID parameter - and that's all, now we can run our application!
At this point you might be thinking "Why create all the packages, separate files, layers of functions and whatnot?" Well, the answer is: if your application gets sufficiently big, it becomes a maintainability nightmare if you have everything lumped together. Also - in my opinion more importantly - this separation is needed for better testability, as it's much easier to test each layer - database access, data manipulation and APIs - separately, rather than all in one place. So, while we are on the topic of tests, I think it's time to write some...
Note: In the snippet above, I omitted a few lines of code and comments for the sake of clarity, such as the connection to the database or the loading of the config; these can be found in the repository here, including a few more explanatory comments.
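For reference, the omitted setup would look roughly like this - note that this is only a sketch, and the actual code (including the exact config fields it uses, such as the connection string below) lives in the repository:

// Rough sketch of the omitted setup; config.Config.DSN is an assumed field
// holding the database connection string. Requires the matching GORM dialect
// import, e.g. _ "github.com/jinzhu/gorm/dialects/postgres".
if err := config.LoadConfig("./config"); err != nil {
    panic(err)
}
config.Config.DB, config.Config.DBErr = gorm.Open("postgres", config.Config.DSN)
if config.Config.DBErr != nil {
    panic(config.Config.DBErr)
}
defer config.Config.DB.Close()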
Test Everything
Now, for my favourite part - tests. Let's start with the test_data package. This package contains utility functions related to the testing database and test data. One function in this package that I want to make note of is init:
func init() {
    err := config.LoadConfig("/config")
    if err != nil {
        panic(err)
    }
    config.Config.DB, config.Config.DBErr = gorm.Open("sqlite3", ":memory:")
    config.Config.DB.Exec("PRAGMA foreign_keys = ON") // SQLite defaults to `foreign_keys = OFF`
    if config.Config.DBErr != nil {
        panic(config.Config.DBErr)
    }
    config.Config.DB.AutoMigrate(&models.User{})
}
This function is special, as Go executes it when the package is imported. This is a good place to do our setup for the tests - more specifically, we first load the configuration, then create our testing database, which is an in-memory SQLite database for which we also enable foreign keys. Lastly, we create the database tables using GORM's AutoMigrate function.
You might be asking "Why use an in-memory SQLite database - isn't a real database better?" - Well yes, I myself use PostgreSQL for every project, but when it comes to testing you want something consistent, fast (in-memory) and independent of the host system/database server, which is exactly what this setup provides.
I won't go over the remaining functions of this package, as this post is already quite long and they are documented in the code here.
Apart from the initialization function, we also store some data in this package - namely the db.sql file, which contains SQL inserts that populate our SQLite database before running the tests, and JSON test cases, which are used as expected outputs for the API endpoints.
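To give you an idea of what those utilities do, here is a rough sketch of what a ResetDB-style helper could look like - the real implementation is in the test_data package in the repository, and the db.sql path below is just an assumption:

// ResetDB drops and re-creates the tables, then re-populates them from db.sql (sketch).
func ResetDB() *gorm.DB {
    db := config.Config.DB
    db.DropTableIfExists(&models.User{})
    db.AutoMigrate(&models.User{})
    inserts, err := ioutil.ReadFile("test_data/db.sql") // Path is an assumption
    if err != nil {
        panic(err)
    }
    db.Exec(string(inserts))
    return db
}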
Now that we've got the setup out of the way, let's go over the tests in each package:
func TestUserDAO_Get(t *testing.T) {
    config.Config.DB = test_data.ResetDB()
    dao := NewUserDAO()
    user, err := dao.Get(1)
    expected := map[string]string{"First Name": "John", "Last Name": "Doe", "Email": "john.doe@gmail.com"}
    assert.Nil(t, err)
    assert.Equal(t, expected["First Name"], user.FirstName)
    assert.Equal(t, expected["Last Name"], user.LastName)
    assert.Equal(t, expected["Email"], user.Email)
}
First off, daos. The test above is very simple - we just create the DAO, call the function under test (Get) and check against the expected values that were inserted into the SQLite database during setup. There's really not much else to talk about, so let's move on to services:
func TestUserService_Get(t *testing.T) {
    s := NewUserService(newMockUserDAO())
    user, err := s.Get(2)
    if assert.Nil(t, err) && assert.NotNil(t, user) {
        assert.Equal(t, "Ben", user.FirstName)
        assert.Equal(t, "Doe", user.LastName)
    }
    user, err = s.Get(100)
    assert.NotNil(t, err)
}

func (m *mockUserDAO) Get(id uint) (*models.User, error) {
    for _, record := range m.records {
        if record.ID == id {
            return &record, nil
        }
    }
    return nil, errors.New("not found")
}

func newMockUserDAO() userDAO {
    return &mockUserDAO{
        records: []models.User{
            {Model: models.Model{ID: 1}, FirstName: "John", LastName: "Smith", Email: "john.smith@gmail.com", Address: "Dummy Value"},
            {Model: models.Model{ID: 2}, FirstName: "Ben", LastName: "Doe", Email: "ben.doe@gmail.com", Address: "Dummy Value"},
        },
    }
}

type mockUserDAO struct {
    records []models.User
}
This is quite a bit more code, so let's go through it from the bottom up. The first thing we need is a mock DAO (mockUserDAO), so that we are independent of the implementation of the real DAO. For this mock to be useful, we need to fill it with some testing data - that's what happens in newMockUserDAO. Next, we also need to define a mock version of Get that imitates the real one - here, instead of querying the database, we just look through the fake records and return one if we find the supplied ID.
Now, for the actual test - we create a new UserService, but instead of passing in the real DAO, we use our mock with predictable behavior, and we are therefore able to isolate the function under test from the underlying DAO. After that, the test is very simple - we call Get and check for the presence of the expected values we inserted into the mock.
The last thing to test is the API. These tests are literally one-liners, but we need a little bit of preparation first:
func newRouter() *gin.Engine {
    gin.SetMode(gin.TestMode)
    router := gin.New()
    config.Config.DB = test_data.ResetDB()
    return router
}

func testAPI(router *gin.Engine, method string, urlToServe string, urlToHit string, function gin.HandlerFunc, body string) *httptest.ResponseRecorder {
    router.Handle(method, urlToServe, function)
    res := httptest.NewRecorder()
    req, _ := http.NewRequest(method, urlToHit, bytes.NewBufferString(body))
    router.ServeHTTP(res, req)
    return res
}

func runAPITests(t *testing.T, tests []apiTestCase) {
    for _, test := range tests {
        router := newRouter()
        res := testAPI(router, test.method, test.urlToServe, test.urlToHit, test.function, test.body)
        assert.Equal(t, test.status, res.Code, test.tag)
        if test.responseFilePath != "" {
            response, _ := ioutil.ReadFile(test.responseFilePath)
            assert.JSONEq(t, string(response), res.Body.String(), test.tag)
        }
    }
}
These three functions imitate an HTTP request for our testing purposes. The first of them creates a Gin router in test mode and resets the database. The second registers and subsequently hits a specific API endpoint, and the last one runs a list of test cases, checking whether the status codes match and, optionally, whether the JSON outputs match too. Now, let's see the example test cases:
func TestUser(t *testing.T) {
    path := test_data.GetTestCaseFolder()
    runAPITests(t, []apiTestCase{
        {"t1 - get a User", "GET", "/users/:id", "/users/1", "", GetUser, http.StatusOK, path + "/user_t1.json"},
        {"t2 - get a User not Present", "GET", "/users/:id", "/users/9999", "", GetUser, http.StatusNotFound, ""},
    })
}
There are a lot of parameters, but it's pretty simple; let's go through them one by one (a sketch of the underlying apiTestCase struct follows the list):
- "t1 - get a User" - name of the test case - numbered, so it's easier to find when debugging
- "GET" - HTTP method
- "/users/:id" - URL being served/tested
- "/users/1" - URL being hit - with populated parameters
- "" - body of the request - in this case empty
- GetUser - handler function attached to the endpoint
- http.StatusOK - expected status code - here 200
- path + "/user_t1.json" - path to the expected JSON output - these are stored in the test_data package mentioned earlier
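The apiTestCase struct itself isn't shown above; judging from how it's used in runAPITests and in the test case literals, it would look something like this (the exact definition is in the repository):

// Assumed shape of apiTestCase, reconstructed from its usage above.
type apiTestCase struct {
    tag              string          // Name of the test case
    method           string          // HTTP method
    urlToServe       string          // URL registered on the test router
    urlToHit         string          // URL actually requested
    body             string          // Request body
    function         gin.HandlerFunc // Handler under test
    status           int             // Expected status code
    responseFilePath string          // Path to expected JSON output ("" skips the check)
}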
Conclusion
That's it - that's all you need to create a RESTful API in Golang. Hopefully, at least some of this will help you when building your next project. You can find all the source code here, and if you have any suggestions or improvements, feel free to create an issue, a pull request, or just fork/star the repository. If you liked this article, look out for the next one, where I will show how to add Swagger docs to this project.