Technical testing at Qxf2: Q1 of 2022

At Qxf2, we often brand our type of testing as technical testing. We do not care for the labels “manual testing” or “automated testing”, and we struggle to explain what we mean to folks who are stuck in the “manual vs automation” mindset. Technical testing is one of those terms that is easier to experience than to explain. Given how often we need to explain it, we thought it best to document some of the work we do. This way, you get a flavour of what we mean by technical testing. In this post, we will go through the new practices, technologies, tools and types of testing that Qxf2 testers implemented in the first three months of 2022.

Why this post?

Many testers tell us that they know Qxf2 does a “different kind of testing”, but they do not really know what that means. We also come across talented testers who are stuck doing routine work because their projects have no scope for creative work. We thought we would share a bit of what we are doing. This might give you ideas to make your work more technical. It might also motivate you to become a technical tester.

Some examples of technical testing

We will keep this post in story form since no single tester can know about all these different areas. We will focus on the problems being solved and the creative twists our testers introduced to seemingly mundane tasks. Even if you do not follow a solution, pay attention to the problem being solved. Then see if you are facing a similar problem within your team.

Containerizing some complex tests

One of the teams we work with was introducing many simple errors. Our tester noticed that this was happening because the developers found certain tests cumbersome to run on their local setup, given the amount of setup and maintenance the tests required. So he built a lightweight Docker container that included the test setup. Now developers are able to run the tests easily. Any maintenance to the tests and the test setup is done simply by updating the container. This has greatly improved the quality of the product reaching the QA team. If you are in a team that avoids running some automated tests on local environments, do consider containerization as a way to distribute your more complex tests.
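The real container is specific to that client, so here is only a minimal sketch of the idea: a Dockerfile that bakes a Python test suite and its dependencies into an image developers can run without any local setup. The file names and the pytest command are assumptions for illustration.

```dockerfile
# Hypothetical sketch: bundle a test suite and its dependencies into one image
FROM python:3.10-slim
WORKDIR /tests
# Install the test dependencies once, inside the image
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the test code itself
COPY tests/ ./tests/
# Run the suite by default; developers can override the command if needed
CMD ["python", "-m", "pytest", "tests/"]
```

A developer would then run something like `docker build -t team-tests .` followed by `docker run --rm team-tests`, and updating the tests for everyone means rebuilding and pushing one image.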

It is one thing to know about Docker as a tester. But it is another to be able to solve a real-world testing problem with it. When we talk about technical testing, we mean coming up with solutions like these to improve the testing within the team.

Proactive health checks

A Qxf2 tester is working on a highly distributed product with about 40 different teams. He quickly realized that his team had several dependencies on other services. He dug through Jira and spotted that many bugs reported against his team were due to other services being unavailable. Having been curious about DevOps before, he designed some simple health check scripts in Python using the Paramiko library. These health checks have been useful even to the non-technical folks on his team. They help the team distinguish between software problems specific to the team and ones caused by other services. This has greatly reduced the time spent on unnecessary triage. This is an example of a tester writing code to solve a technical testing problem without writing “automated tests”.
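His actual scripts SSH into hosts with Paramiko to check services; as a self-contained sketch of the same idea using only the standard library, here is the shape a simple health check can take. The service names and endpoints below are invented for illustration.

```python
import socket

# Hypothetical dependency endpoints -- the real scripts check real services
SERVICES = {
    "auth-service": ("auth.internal.example.com", 443),
    "billing-service": ("billing.internal.example.com", 8080),
}

def check_service(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def health_report(services):
    """Map each dependency name to its current reachability."""
    return {name: check_service(host, port) for name, (host, port) in services.items()}
```

Even a report this simple lets non-technical teammates answer "is it us or is it them?" before filing a bug.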

Testing AWS lambdas using batch messages

One of our testers discovered a really cool bug and opened doors for all of us to test lambdas better. She was testing a lambda that behaved fine when messages were sent one after the other. But when she sent several batched messages and played around with the timing, all sorts of errors started to happen. Potentially, the end users of that product could receive multiple emails, many of which would be obsolete by the time they arrived. This was a cool learning for us at Qxf2. She was able to implement this test because she was comfortable with the AWS SAM CLI. This is another example of being technical. Had the tester been non-technical, she could not have designed the batch messages well, nor played around with the timing. Yet another example where technical testing does not mean writing test automation.
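SAM CLI lets you invoke a lambda locally against a JSON event file (`sam local invoke -e event.json`). A sketch of how one might generate batched SQS-style events with staggered timestamps to probe timing behaviour — the record fields here are a simplified subset of the real SQS event shape, and the payloads are invented:

```python
import json
import time
import uuid

def make_sqs_batch_event(bodies, base_ts=None, gap_ms=50):
    """Build a simplified SQS batch event whose records have staggered
    SentTimestamp values, so the lambda sees messages 'sent' close together."""
    base_ts = base_ts if base_ts is not None else int(time.time() * 1000)
    records = []
    for i, body in enumerate(bodies):
        records.append({
            "messageId": str(uuid.uuid4()),
            "body": json.dumps(body),
            "attributes": {"SentTimestamp": str(base_ts + i * gap_ms)},
            "eventSource": "aws:sqs",
        })
    return {"Records": records}
```

Writing the result to `event.json` and replaying it with different `gap_ms` values is one cheap way to reproduce the kind of timing-sensitive errors described above.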

TestOps and CrowdStrike

One of our testers, who works on a TestOps team, has spent a bit of his time getting them set up with CrowdStrike Falcon. This helps his TestOps team identify breaches and monitor endpoint security. Credit to our client – they chose the tool. It just happened that our tester had the bandwidth and the mindset to try new things. So he went ahead and got the tool running. He has also documented his work and learnings. This will eventually help the support teams when customers start using CrowdStrike with their deployments.

Introducing contract testing

A Qxf2 tester introduced contract testing for his team, using Pact and Python. This has helped his team deploy independently of the other teams. After introducing contract testing for this client, we identified several new problems that we need to address. Most are socio-technical in nature. The other teams (the other sides of the contract) did not want contract testing because they could not see its value immediately and thought the effort would be too high. For now, our tester has implemented a minimal set of tests for all sides of the contract. We need to get better at selling the idea and getting buy-in from all teams. If you have implemented contract tests successfully at your company, we would love to learn how you got past the resistance from within your teams.
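The real implementation uses pact-python, whose API and matching rules are much richer than we can show here. As a dependency-free sketch of the underlying idea only: the consumer records what it expects from the provider, and the provider's build verifies its actual responses against that record. All service and field names below are invented.

```python
# A consumer-driven contract, at its simplest, is the consumer's recorded
# expectation of a provider response. (Invented fields; Pact's real format
# and matchers are far more expressive.)
CONSUMER_CONTRACT = {
    "request": {"method": "GET", "path": "/users/42"},
    "response": {"status": 200, "body_fields": {"id": int, "email": str}},
}

def verify_provider_response(contract, status, body):
    """Run on the provider side: check an actual response against the
    consumer's expectations -- status code plus field names and types."""
    expected = contract["response"]
    if status != expected["status"]:
        return False
    return all(
        field in body and isinstance(body[field], ftype)
        for field, ftype in expected["body_fields"].items()
    )
```

The payoff is exactly the independent deployment described above: the provider can ship whenever this check passes, without coordinating a full end-to-end run with every consumer.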

Using mocking in integration tests

One of the applications we test lives on AWS infrastructure. The automated tests we were running were taking too long. Our tester used moto to mock out certain portions of AWS and make some tests go faster, at least locally. We still need to run the long-running tests on some of the integration environments. But locally and on CI environments, this one change has made an enormous difference. Moto is a Python module typically used for unit testing. But since our tester was technical enough, she was able to spot its potential and use it in specific instances of an integration test.
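Moto itself patches boto3 at the AWS-API level, so tests run against an in-memory AWS. To illustrate the same idea without pulling in moto or boto3, here is a minimal stand-in using the standard library's `unittest.mock`: the code under test talks to an S3-like client, and the test swaps a fake in. The function, bucket and key names are invented.

```python
import io
from unittest import mock

def fetch_report(s3_client, bucket, key):
    """Hypothetical application code under test: read a report from S3.
    boto3's get_object returns a dict whose 'Body' is a readable stream."""
    obj = s3_client.get_object(Bucket=bucket, Key=key)
    return obj["Body"].read().decode("utf-8")

def test_fetch_report_with_fake_s3():
    # Fake client stands in for boto3; moto would intercept the real calls
    fake_s3 = mock.Mock()
    fake_s3.get_object.return_value = {"Body": io.BytesIO(b"ok: 3 rows")}
    assert fetch_report(fake_s3, "reports-bucket", "daily.txt") == "ok: 3 rows"
    fake_s3.get_object.assert_called_once_with(Bucket="reports-bucket", Key="daily.txt")
```

With moto the test body stays almost identical but exercises real boto3 calls, which is what makes it usable inside an integration test rather than only a unit test.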

UI test automation frameworks

While the above examples are pretty technical and do not involve UI automation, we do not shy away from implementing UI test automation. We work with a variety of frameworks and languages. The choice really depends on the client, the team composition and the application being tested.


One of our testers implemented TestCafe with a BDD layer over it to test a React application. The project itself was to support a new version of the UI being rewritten in React. While TestCafe is an older tool, we chose it because it suited the makeup of the team. The BDD layer was brought in to help non-technical folks participate in the test-writing process. Now that all team members can write tests on their own, the dependency on the automation team has been reduced. This work is an example of a deliberate engineering tradeoff – we went with a slightly older tool because it suited the context of the team.

Karate and ReportPortal

Another Qxf2 tester got a unicorn started with test automation. The client did not have any automation tools or frameworks. Our tester interviewed the team, figured out what might have stopped their previous efforts, understood their technical levels and helped introduce a new automation framework. She chose the Karate framework and integrated it with ReportPortal for reporting.
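Part of why Karate suits a team of mixed technical levels is its readable, Gherkin-flavoured DSL for API tests. A tiny illustrative feature (the endpoint and response fields here are made up, not the client's):

```gherkin
Feature: users endpoint smoke test

  Scenario: fetch a user
    Given url 'https://example.com/api/users/1'
    When method get
    Then status 200
    And match response.id == 1
```

Tests like this read almost like the ticket that requested them, which mattered a great deal for the adoption work described next.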

But implementing a framework is the relatively easy part. Our tester also worked on the adoption of the framework. She helped everyone with the local setup of Karate, wrote up tickets for test automation and made sure the existing QA team was working on these tasks every sprint. She supported the code review process and stepped in whenever more complex tests needed her expertise. To boost adoption, she also wrote incomplete tests and asked folks to complete them – that way everyone on the team could get familiar with the framework without having to learn it from scratch. This is an example where we realized that creating a framework alone is not enough – you want the whole team to write and maintain it.


Somewhat related to UI test automation frameworks is our work with mobile UI testing. Our client had an urgent need to support their product on mobile browsers. They wanted to test their application on all mobile device browsers, but setting up a simulator for every OS version, every browser and every mobile model was challenging. Our tester suggested BrowserStack to test the application across the various browsers, mobile models and OS versions, helping the client test quickly and fix issues as they arose. She could not have made that suggestion spontaneously had she not been technical. Further, the client would not have agreed unless they already had a high opinion of her technical skills.

Hopefully, this small write-up gives you some examples of what we mean by “technical testing”. It would be fantastic if some of our anecdotes gave you ideas on how to improve testing at your company. We hope to continue this series and document more examples every so often. Please share stories of your testers and how they solved complex problems for their clients. We would like to learn from your experiences as well.
