How Are E2E Tests Performed for a Test Automation Product? - Dogu E2E Testing Implementation
Daniel Ahn
Posted on August 14, 2023
This post has been translated into English from the original post.
Before Dogu became an open-source product, we registered our office machines as GitHub Runners and set them up to fetch the code onto the Runner device and execute E2E tests whenever a commit was made. At that time, our test case count was relatively low. However, when we started open-sourcing Dogu, we detached the private GitHub Runner and temporarily paused the E2E tests. About a month and a half after open-sourcing, we reconfigured the E2E tests, and we'd like to share that journey.
What's Dogu?
Dogu is a seamless unified test automation platform for web, mobile, and game applications. You can integrate tools you have previously used, such as Appium, Selenium, and Playwright, with Dogu, allowing you to run tests in parallel and check test results more easily. Experience more efficient test automation with Dogu.
Architecture
Dogu is a test automation platform that provides a seamless unified experience for web, mobile, and game applications. It is composed of the following components.
Test CI Architecture
Dogu Features
Device Farm
Build a systematic device farm with Windows, macOS, Android, and iOS devices.
Real devices and emulators are supported.
Device Studio
Control devices remotely in Device Studio.
Inspecting UI
Inspect UI with Device Studio.
Test CI
Run tests in parallel and periodically with routines.
Before open-sourcing, E2E testing for Dogu was lacking, and we worked to improve it afterward. We transitioned from a private GitHub Runner to using Dogu itself for E2E testing, set up the test environment connection for local testing, leveraged Dogu's routine functionality for simultaneous execution on multiple platforms, and employed dogu-github-action to integrate the E2E tests into CI automatically. The CI integration enhanced efficiency, reduced developer workload, and improved product quality.
Testing for Dogu
Before open-sourcing, E2E tests covered only a small portion of the product's features. (The Dogu team initially focused more on feature development... 🥲) We decided to run QA within the team before each release, and during the roughly one and a half months when E2E was paused, release days felt like entering a battlefield. It was a war with testing: we had to manually test many features, across three different platforms.
"□□□ should be tested on a MaZc ARM64 device, and for △△△, please test it on a Windows x64 device."
"Who will handle testing on Mac x64?"
...
Such extensive and repetitive testing on release days, coupled with the need for quick fixes upon issue discovery, led to a constant busy cycle. Does this sound familiar? This is evidence of the need for automated E2E testing! 😎
During this period, someone outside the team asked, "How do you test Dogu?" At that time, Dogu itself lacked many features needed for testing. We concluded that we needed to use Dogu extensively ourselves to understand where it helps and where it needs improvement, which led to the idea of "using Dogu to perform Dogu E2E testing!" So, with minimal effort, we decided to reintegrate E2E into CI.
Challenges Faced...
Although E2E testing seemed easily attainable, the process didn't always align with our expectations... 😠 We encountered numerous minor issues, but a few significant challenges were particularly hindering.
Running Locally
Our previous approach involved building Docker images after each commit, uploading them to a cloud registry, fetching them onto server machines, and then executing them against the E2E devices. However, due to cost and time concerns, we decided to revert to the pre-open-source approach of fetching the code onto the E2E devices locally for testing, instead of relying on the cloud.
However, after open-sourcing Dogu, we removed most of our private GitHub Runners, since GitHub's documentation recommends using self-hosted runners only with private repositories. To keep testing Dogu through Dogu while following this guidance, we connected the E2E devices to our internal test Dogu environment using Dogu itself. We also tuned the Dogu Agent app, which connects devices to the test environment, so that a separate Dogu Agent could be run for E2E on the local E2E device. This was necessary because two Dogu Agents recognizing the same device at the same time can conflict during Dogu E2E tests that involve the device streaming functionality.
Controlling Multiple Clients Simultaneously
Dogu uses a web browser and a desktop app for device farm setup, so we needed to launch both clients on the E2E test device. While our automation testing features included remote testing and routines, they were insufficient for testing across multiple devices. Remote testing lets users run scripts against devices connected locally to Dogu before committing them to the repository, but testing multiple devices and running two clients simultaneously were not yet possible.
Hence, we opted for routines, which allow concurrent execution across multiple devices and scale well. Routines can be written in the GUI or in YAML, and the actions provided by routines allow scripts to be executed across various environments (platforms). We configured the Dogu E2E tests to run on macOS ARM64, macOS x64, and Windows devices within a single routine, and we also added Git integration to Dogu to facilitate local execution.
Below is an example YAML format used to test Dogu with the routine!
name: e2e
on:
  workflow_dispatch:
jobs:
  e2e-macos:
    runs-on:
      group:
        - e2e-macos
    steps:
      - name: env
        run: printenv
      - name: fix diverged main
        run: git reset --hard origin/main
      - name: Checkout
        uses: dogu-actions/checkout
        with:
          clean: true
      - name: create dotEnv
        run: |
          ~/.dogu_bin/env-generator gen-all e2e
      - name: Run newbie
        run: yarn newbie:cicd
      - name: Run newbie:python
        run: export PATH=/opt/homebrew/bin:/usr/local/Cellar/poetry/1.5.1/bin:$PATH && yarn newbie:python
      - name: Build Projects
        run: printenv && yarn workspace dogu run build
      - name: Run influx, pgsql, redis, nexus, turn-server
        run: |
          yarn workspace console-web-server run start:e2e-background
        env:
          PATH: /usr/local/bin:$PATH
      - name: Newbie nm-space
        run: yarn workspace dogu run newbie:nm-space
      - name: Build nm-space
        run: cd nm-space && yarn workspace nm-space run build
      - name: Download, Build third-party
        run: |
          export PATH=/opt/homebrew/bin:/usr/local/Cellar/cmake/3.27.0/bin:/usr/local/go/bin:$PATH && yarn third-party:download:build
      - name: Run e2e
        run: |
          yarn workspace e2e run util:install-chromedriver && yarn workspace e2e run start:ci
        record: true
  e2e-windows:
    runs-on:
      group:
        - e2e-windows
    steps:
      - name: Printenv
        run: set
      - name: Checkout
        uses: dogu-actions/checkout
        with:
          clean: true
      - name: create dotEnv
        run: |
          $HOME/.dogu_bin/env-generator gen-all e2e
      - name: Run newbie
        run: yarn newbie:cicd
      - name: Run newbie:python
        run: yarn newbie:python
      - name: Build Projects
        run: set && yarn workspace dogu run build
      - name: Run influx, pgsql, redis, nexus, turn-server
        run: |
          yarn workspace console-web-server run start:e2e-background
      - name: Newbie nm-space
        run: yarn workspace dogu run newbie:nm-space
      - name: Build nm-space
        run: cd nm-space && yarn workspace nm-space run build
      - name: Download, Build third-party
        run: |
          yarn third-party:download:build
      - name: Run e2e
        run: |
          yarn workspace e2e run util:install-chromedriver && yarn workspace e2e run start:ci
        record: true
Integration with CI
Now we were prepared! All that remained was to execute the Dogu routine after each commit. To do that, we needed a way to run the Dogu routine from within GitHub Actions, so we created a custom GitHub Action. This led to the creation of dogu-github-action!
While it currently doesn't retrieve logs or similar information from Dogu, it waits until the routine is finished and then provides the routine result URL, and it meets the minimal requirements for handling cancellation of either the routine or the GitHub Actions run.
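At its core, such an action boils down to "trigger the routine, then poll until it reaches a terminal state." The TypeScript below is only a minimal sketch of that idea, not the actual dogu-github-action implementation; the input names, Dogu API endpoints, and response fields are hypothetical placeholders.

// Minimal sketch of a "run routine and wait" custom action.
// NOTE: the endpoints, inputs, and response shapes below are hypothetical,
// not the real dogu-github-action interface. Requires Node 18+ for global fetch.
import * as core from '@actions/core';

async function run(): Promise<void> {
  const apiUrl = core.getInput('api-url');        // hypothetical input
  const token = core.getInput('token');           // hypothetical input
  const routineId = core.getInput('routine-id');  // hypothetical input

  // Trigger the routine (hypothetical endpoint).
  const createRes = await fetch(`${apiUrl}/routines/${routineId}/pipelines`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}` },
  });
  const { pipelineId, resultUrl } = (await createRes.json()) as {
    pipelineId: string;
    resultUrl: string;
  };

  // Poll until the routine pipeline finishes.
  for (;;) {
    const statusRes = await fetch(`${apiUrl}/pipelines/${pipelineId}`, {
      headers: { Authorization: `Bearer ${token}` },
    });
    const { state } = (await statusRes.json()) as { state: string };
    if (state === 'SUCCESS') break;
    if (state === 'FAILURE' || state === 'CANCELLED') {
      core.setFailed(`Routine finished with state ${state}: ${resultUrl}`);
      return;
    }
    await new Promise((resolve) => setTimeout(resolve, 10_000)); // wait 10s between polls
  }

  // Expose the result URL so the workflow log links back to Dogu.
  core.setOutput('result-url', resultUrl);
}

run().catch((error) => core.setFailed(String(error)));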
Writing Test Scripts
Dogu's E2E test scripts are written with Dest, our team's internally developed testing framework, together with Selenium and Playwright. The Dogu web console is tested with Selenium, while Dogu Agent is a desktop app built with Electron, so we tried Playwright for scripting it. (We plan to switch to Jest or Pytest, as Dest won't be developed further!)
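For reference, Playwright can attach to an Electron app through its Electron support, which is what makes scripting a desktop client like Dogu Agent feasible. Below is a simplified sketch of that approach; the executable path and UI selector are hypothetical, and this is not our actual Dogu Agent test code.

// Sketch: driving an Electron desktop app with Playwright's Electron support.
// The executable path and selector below are hypothetical placeholders.
import { _electron as electron } from 'playwright';

async function main(): Promise<void> {
  // Launch the packaged Electron app (example path, not Dogu Agent's real path).
  const app = await electron.launch({
    executablePath: '/path/to/dogu-agent',
  });

  // Grab the first BrowserWindow and interact with it like a web page.
  const window = await app.firstWindow();
  await window.click('text=Connect'); // hypothetical UI element
  console.log(await window.title());

  await app.close();
}

main();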
The existing E2E scripts had been written against earlier specifications and covered around 30 test cases; we expanded them to cover more than 130. Rather than relying solely on full XPath, we designed the XPath expressions around unique or reliably identifiable attributes such as id, so that the scripts tolerate web UI changes and newly added elements (see the sketch below).
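To illustrate the difference, here is a minimal Selenium sketch contrasting a brittle full XPath with an attribute-based XPath. The access-id attribute mirrors the one that appears in the logs later in this post; the URL and input value are hypothetical.

// Sketch: locating an element by a stable attribute instead of a brittle full XPath.
// The URL and input value are hypothetical placeholders.
import { Builder, By, until } from 'selenium-webdriver';

async function main(): Promise<void> {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('https://dogu.example.com'); // hypothetical URL

    // Brittle: breaks whenever the surrounding layout changes.
    // const nameInput = await driver.findElement(
    //   By.xpath('/html/body/div[1]/div[2]/form/div[3]/input'),
    // );

    // Robust: keyed to an attribute the UI code sets explicitly.
    const nameInput = await driver.wait(
      until.elementLocated(By.xpath('//*[@access-id="add-host-form-name"]')),
      10_000,
    );
    await nameInput.sendKeys('my-e2e-host');
  } finally {
    await driver.quit();
  }
}

main();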
Now, many of Dogu's features are tested through E2E, with the tests executed on every commit! This gives us more confidence during development, since the E2E tests can catch abnormal behavior caused by code changes.
Happy Testing!
Let's take a look at some recent E2E test results. All tests passed except on the macOS ARM64 device, so we need to investigate the cause of the failure on that Mac mini.
From the logs, it seems we couldn't find an element matching the //*[@access-id="add-host-form-name"] XPath. That information alone isn't enough to determine what went wrong or which actions were performed. To address this, we recorded the device's operations during the routine using record: true, allowing us to review the recorded video and see exactly which actions were taken.
This recording capability greatly facilitates debugging: even when E2E tests fail, it significantly reduces the time needed to identify and fix the issue.
In Conclusion...
Now, E2E tests are performed not just on release days but with every commit or PR, reducing the need for last-minute preparations on release days. With Dogu routines and test script writing, tasks that previously required manual repetition, from user registration to feature testing and testing across platforms, have become much easier!
However, E2E testing still doesn't cover all cases, which is one of the downsides. Some functions, such as iOS streaming and remote testing, are still tested manually, but E2E now handles many of the previously repetitive tests (e.g., user registration, member addition, streaming), making development more streamlined.
Are you still manually repeating tests? It might be worthwhile to explore setting up an automated testing environment to improve testing efficiency!