Integration Test Reporting
Jesse Phillips
Posted on October 1, 2019
I am very disappointed with the test reporting options out there. I'm talking about the need to verify test coverage (associated with features and functionality) and to store test data and artifacts.
Many of these systems are built as 'requirements up front' with an associated test "block". One popular approach is to define a grammar that associates plain-English steps with code. This gives you features and requirements which are all written in plain English (but not really). Some allow your existing test harness to report into these systems.
The Problem
These systems create a huge dependency, and if you've been reading up on the GitLab propaganda, these all-in-one systems provide real benefits in consistency and even stability. That propaganda is not necessarily inaccurate, so why is this a problem?
This affects more than just the development team. To use these systems correctly (or at least to get the most out of them with the least duplication), the sales team could be affected, depending on how your product is sold.
Sometimes requirements management can simply be done better in a different tool (not that I've seen it). I believe requirements need to be in version control like everything else.
This makes test efforts focus on reporting first and testing second.
On Its Head
I spent some time building a rapid software testing tool. What I realized is that testing needs a dumping ground for random artifacts, not orderly, pre-planned test steps. Test automation can, and should, support ad hoc test efforts that can be evaluated after the fact to determine what was tested.
Let me say this again: a good test session does not start with a goal of validation, but by the time the session is done, a number of items may have been validated.
What I Want
I like the idea of these tools that claim to let your existing framework inject test results. But I need even more freedom, because they require me to set up a place to put my results first. I used the term 'dumping ground' for a reason.
Test runs should be available out of thin air: specify an ID and use it for later associations. Data of all shapes and sizes should be stored, with optional thin-air test steps.
Reporting on these runs should be easy. Which ones have no release? Did we identify the product version or commit? What is the specific commit of the test code or tool?
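A sketch of what "out of thin air" could mean in code: referencing a run ID for the first time creates the run, metadata like a release or product commit can be attached later (or never), and reporting is just a query over what was recorded. Every name here is a hypothetical API, not something an existing reporting tool provides.

```python
from collections import defaultdict

class RunStore:
    """Runs exist 'out of thin air': the first reference to an ID
    creates the run, with no up-front registration step."""

    def __init__(self):
        self.runs = defaultdict(lambda: {"artifacts": [], "meta": {}})

    def attach(self, run_id, artifact):
        """Store an artifact of any shape against a run."""
        self.runs[run_id]["artifacts"].append(artifact)

    def tag(self, run_id, **meta):
        """Associate metadata (release, product commit, test-tool commit)."""
        self.runs[run_id]["meta"].update(meta)

    def missing(self, key):
        """Reporting query: which runs never recorded this piece of metadata?"""
        return [rid for rid, run in self.runs.items()
                if key not in run["meta"]]
```

With this shape, "which runs have no release?" becomes `store.missing("release")`, and the same query answers the product-version and test-commit questions.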