Run environment-aware database migrations with yuniql
Rodel E. Dagumampan
Posted on June 1, 2020
Background
Let me start by saying that I am writing this both to create awareness of yuniql, an open-source database DevOps tool I maintain, and to share some experiences that I hope will help our community make software better, faster, and safer.
The solutions here work with SqlServer, PostgreSql, and MySql. :) Now, let's get to the meat of the story...
One of the challenges when working with databases is the variation in scripts and schema definitions driven by compliance with existing governance and by practical constraints.
Cross-database queries
Some organizations require database names to reflect the environment where they are created. Let's say we have HR and PAYROLL databases, each with DEV, TEST, and PROD environments. As a result, in the DEVELOPMENT environment we have HRDB-DEV and PAYROLLDB-DEV. The script below certainly cannot be applied to the TEST and PROD databases, as the database names are suffixed with the environment code DEV.
CREATE VIEW [dbo].[vw_employee_timesheets]
AS
SELECT E.employee_id, E.first_name, E.last_name, E.position, T.checkin_time_utc, T.checkout_time_utc
FROM [HRDB-DEV].[dbo].[employees] E
INNER JOIN [PAYROLLDB-DEV].[dbo].[timesheets] T
ON E.employee_id = T.employee_id
GO
Environment-specific scripts
In another case, we may want to create partitioned tables best suited for big data in the TEST and PROD environments. Observe the relatively complex configuration of the same table when created in PROD.
CREATE PARTITION FUNCTION RangePartFunction (datetime)
AS RANGE RIGHT FOR VALUES ('20200101', '20200201');
CREATE PARTITION SCHEME RangePartScheme AS PARTITION RangePartFunction
TO ([PartBefore2020], [Part202001], [Part202002]);
CREATE TABLE [dbo].[timesheets](
[timesheet_id] [int] IDENTITY(1,1) NOT NULL,
[employee_id] [int] NOT NULL,
[checkin_time_utc] [datetime] NOT NULL,
[checkout_time_utc] [datetime] NULL,
CONSTRAINT [PK_timesheets] PRIMARY KEY NONCLUSTERED ([timesheet_id], [checkin_time_utc])
) ON RangePartScheme (checkin_time_utc);
CREATE CLUSTERED INDEX IDX_Part ON [dbo].[timesheets](checkin_time_utc)
ON RangePartScheme (checkin_time_utc);
But wait, this may not be necessary for local development and testing, where only a minimal set of sample records is loaded. A simpler model like this may just work.
CREATE TABLE [dbo].[timesheets](
[timesheet_id] [int] IDENTITY(1,1) NOT NULL,
[employee_id] [int] NOT NULL,
[checkin_time_utc] [datetime] NOT NULL,
[checkout_time_utc] [datetime] NULL,
CONSTRAINT [PK_timesheets] PRIMARY KEY CLUSTERED ([timesheet_id] ASC)
) ON [PRIMARY];
Solutions
yuniql addresses this in two ways:
- via token replacement
- via environment-reserved directories/folders.
To begin with, let's create a new repository. A yuniql repository is nothing but a git-ready directory that holds all your script files. For a guide on how to get started with yuniql, please visit https://yuniql.io/docs/.
# install yuniql cli
dotnet tool install -g yuniql.cli
# prepare your repository
cd c:\temp
md yuniql-environment-aware
cd c:\temp\yuniql-environment-aware
# initialize your repository
yuniql init
c:\temp\yuniql-environment-aware>yuniql init
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\_init
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\_pre
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\v0.00
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\_draft
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\_post
INF 2020-05-31T... Created script directory c:\temp\yuniql-environment-aware\_erase
INF 2020-05-31T... Created file c:\temp\yuniql-environment-aware\README.md
INF 2020-05-31T... Created file c:\temp\yuniql-environment-aware\Dockerfile
INF 2020-05-31T... Created file c:\temp\yuniql-environment-aware\.gitignore
INF 2020-05-31T... Initialized c:\temp\yuniql-environment-aware.
Token replacement with yuniql
When using yuniql, we can specify tokens to be replaced during the migration run. For the given case, we use ENVIRONMENT as the token key.
CREATE VIEW [dbo].[vw_employee_timesheets]
AS
SELECT 'hello yuniql!' message;
--SELECT E.employee_id, E.first_name, E.last_name, E.position, T.checkin_time_utc, T.checkout_time_utc
--FROM [HRDB-${ENVIRONMENT}].[dbo].[employees] E
--INNER JOIN [PAYROLLDB-${ENVIRONMENT}].[dbo].[timesheets] T
--ON E.employee_id = T.employee_id
GO
Then during migration run, we pass the token key/value pair. In the process, yuniql inspects all tokens in script files and replaces them.
yuniql run -a -k "ENVIRONMENT=DEV"
yuniql run -a -k "ENVIRONMENT=TEST"
yuniql run -a -k "ENVIRONMENT=PROD"
Alternatively, we can pass multiple tokens in a single call.
yuniql run -a -k "ENVIRONMENT=DEV,USERNAME=rdagumampan,SOURCE=AzDevOpsTask"
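Conceptually, the substitution is a plain find-and-replace over the script text: every ${KEY} occurrence becomes its value before the script is sent to the server. The snippet below is only an illustrative sketch of that idea using sed, not yuniql's actual implementation, and the script file name is made up for the example.

```shell
# Illustrative sketch only: replace ${ENVIRONMENT} tokens the way a
# migration tool conceptually would (setup_view.sql is a hypothetical file).
echo 'FROM [HRDB-${ENVIRONMENT}].[dbo].[employees]' > setup_view.sql

# Substitute the token with the target environment code.
sed 's/\${ENVIRONMENT}/DEV/g' setup_view.sql
```

Because the token is resolved at migration time, the very same script file can be committed once and applied unchanged to DEV, TEST, and PROD.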
Environment-reserved directories with yuniql
Sometimes it is simpler to group all scripts in a single environment-reserved directory. While this forces us to duplicate script files, it can also help us stay organized. In this sample, let's create _development, _test, and _production directories.
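As a sketch, the repository could then look like the tree below. Note that the exact placement of the environment directories (here assumed to sit under the version folder) is my illustration; consult the yuniql docs for the authoritative layout.

```
c:\temp\yuniql-environment-aware
└── v0.00
    ├── _development
    │   └── setup_tables.sql
    ├── _test
    │   └── setup_tables.sql
    └── _production
        └── setup_tables.sql
```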
When we call yuniql run, we can pass --environment to request an environment-aware migration. yuniql discovers all directories, sorts them, and prepares them for execution. When environment-reserved directories are present, it only picks the one matching the requested environment.
yuniql run -a --environment development
yuniql run -a --environment test
yuniql run -a --environment production
Parting words
Whoa! You have reached this far! Thanks!
Environment-aware migrations using token replacement and environment-reserved directories address the variations in scripts demanded by internal enterprise policies and guidelines. You may also find other use cases for token replacement, such as annotating scripts and baseline data.
P.S. Please support yuniql by giving it a GitHub star! For a young project like this, a star helps capture more user experiences and improve the tool at this early stage. https://github.com/rdagumampan/yuniql
Cheers!