Tcases: Auto-generating API Tests

Most development teams that Qxf2 works with describe and document their APIs using the OpenAPI specification. This means there is a set structure for folks to write tests against. Handcrafting the simple test cases can be cumbersome and time-consuming. Given this set structure, we started to look for solutions that could create API tests based on an OpenAPI specification. We were not looking for a “complete” solution – just something that automatically produced the easy test cases: verifying response codes, validating the response schema structure, exercising all valid request types for an endpoint, etc. In this post, we will cover Tcases – a tool we found useful for our use case.

Note: You can skip to the end of this article for some caveats regarding using these auto-generated tests.

Tools for Auto-generating API Tests from an OpenAPI Definition

In our hunt for a cool and budget-friendly way to auto-generate API test cases, we checked out a bunch of tools. We were after something easy to use, not crazy expensive, well supported, and frequently updated. Here are the contenders:

IBM API Connect: Yup, it’s user-friendly and can crank out test cases like a champ. But it comes with a price tag.

EvoMaster: It’s sleek, user-friendly, and the best part – it’s free. The catch, however, is that the test cases it generated in black-box mode were more about testing the data retrieved from the API requests than the structure of the received data.

Swagger_meqa: It’s free, which is sweet, but the support seems to be non-existent, and the last time it got an update was six years ago.

Schemathesis: Now, this one is intriguing. It’s user-friendly, absolutely free, and stays fresh with regular updates. The only hitch is that it needs the OpenAPI specification every time it runs the tests. We have used this tool while writing some contract tests and we like it.

Tcases: Here’s another interesting player. It’s user-friendly, absolutely free, boasts good support, and the developers are keeping it spicy with regular updates as well.

In the end, the showdown came down to Schemathesis and Tcases as our top picks, considering user-friendliness, regular updates, and cost. However, for the deep dive in this blog, we’ll be shining the spotlight on Tcases.

Setting up Tcases

Setting up Tcases for command line usage is a straightforward process that involves downloading the binary distribution file from the Maven Central Repository. Follow these simple steps to get started:

1. Navigate to the Central Repository page for tcases-shell:
Visit the Maven Central Repository and locate the page for tcases-shell.

2. Select the Latest Version:
Once on the tcases-shell page, identify the entry for the latest version. Click on “Browse” to explore the available files for that version.

3. Download the Distribution ZIP or TAR file:
Choose the desired distribution format by clicking on either “tcases-shell-${version}.zip” for ZIP or “tcases-shell-${version}.tar.gz” for TAR. Download the selected file to your machine.

4. Extract the Contents:
Unpack the downloaded distribution file to a directory of your choice. This directory will now be referred to as your “Tcases home directory.” The extraction process creates a “Tcases release directory” (e.g., tcases-m.n.r) containing all the necessary files for this release.

5. Explore the Release Directory:
Inside the release directory, you’ll find several subdirectories:

  • bin: Contains executable shell scripts used to run Tcases.
  • docs: Houses the user guide, examples, and Javadoc.
  • lib: Holds all the JAR files essential for running Tcases.

6. Adjust the PATH Environment Variable:
To finalize the setup, add the path to the bin subdirectory to the PATH environment variable on your system. This step ensures that you can execute Tcases from any location in your command line.
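For example, on a Linux or macOS shell, steps 3–6 might look like the following sketch. The version number is illustrative – substitute the latest release from Maven Central:

# Download and extract the binary distribution (version shown is illustrative)
curl -LO https://repo1.maven.org/maven2/org/cornutum/tcases/tcases-shell/4.0.2/tcases-shell-4.0.2.tar.gz
tar -xzf tcases-shell-4.0.2.tar.gz

# The release directory follows the tcases-m.n.r pattern; add its bin subdirectory to PATH
export PATH="$PWD/tcases-4.0.2/bin:$PATH"

# Verify the setup
tcases-api-test -help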

By following these steps, you’ve successfully set up Tcases on your system. Now, you’re ready to leverage its powerful command line capabilities for test case generation.

Generating Executable Tests

Tcases provides a powerful command-line interface, tcases-api-test, allowing users to easily generate executable tests for their OpenAPI specifications.

1. To run the tcases-api-test command, navigate to your shell command line and execute the following command. In this example, we will be generating test cases for Qxf2’s Survey application:

tcases-api-test -o src/test/java/org/examples survey-api.yaml

Make sure to replace survey-api.yaml with the actual path to your OpenAPI definition file. For further details on command syntax, you can run:

tcases-api-test -help

2. Upon execution, tcases-api-test performs the following steps:

  • Reading API Definition: The tool reads the API definition from the specified YAML file (in this example, survey-api.yaml).
  • Generating Test Cases: It then generates test cases for each defined request. The generation process involves creating both valid and failure test cases.
  • Writing API Tests: The tool then writes JUnit tests for the generated cases. By default, it uses REST Assured to execute requests. The resulting test class is named based on the title of the OpenAPI definition and is stored in the specified destination directory (src/test/java/org/examples in this example).

3. After running tcases-api-test, a summary of the generation process can be viewed in the tcases-api-test.log file. Additionally, the generated source code for the test class, such as HelpSurveyTest.java, can be inspected. This class includes methods for testing each defined API request, incorporating REST Assured for request execution and validation.

package org.examples;
import org.junit.Test;
import java.util.Map;
import static java.util.stream.Collectors.toMap;
import io.restassured.http.Header;
import io.restassured.response.Response;
import org.cornutum.tcases.openapi.test.ResponseValidator;
import org.hamcrest.Matcher;
import static io.restassured.RestAssured.*;
import static org.hamcrest.Matchers.*;
 
public class HelpSurveyTest {
 
    private ResponseValidator responseValidator = new ResponseValidator( getClass());
 
    @Test
    public void postSurveyAdminQElo_filter_response_BodyDefined_Is_Yes() {
        Response response =
            given()
                .baseUri( forTestServer( "http://localhost:8000"))
                .header( "User", tcasesApiKey())
                .contentType( "application/json")
                .request().body( "{\"end_date\":\"2022-10-10\",\"start_date\":\"2023-01-20\"}")
            .when()
                .request( "POST", "/survey/admin/QElo_filter_response")
            .then()
                .statusCode( isSuccess())
            .extract()
                .response()
                ;
 
        responseValidator.assertBodyValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), response.getContentType(), response.asString());
        responseValidator.assertHeadersValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), responseHeaders( response));
    }
 
    @Test
    public void postSurveyAdminQElo_filter_response_BodyDefined_Is_No() {
        Response response =
            given()
                .baseUri( forTestServer( "http://localhost:8000"))
                .header( "User", tcasesApiKey())
            .when()
                .request( "POST", "/survey/admin/QElo_filter_response")
            .then()
                // Body.Defined=No
                .statusCode( isBadRequest())
            .extract()
                .response()
                ;
 
        responseValidator.assertBodyValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), response.getContentType(), response.asString());
        responseValidator.assertHeadersValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), responseHeaders( response));
    }
 
    @Test
    public void postSurveyAdminQElo_filter_response_BodyMediaType_Is_Other() {
        Response response =
            given()
                .baseUri( forTestServer( "http://localhost:8000"))
                .header( "User", tcasesApiKey())
                .contentType( "text/xml")
                .request().body( "-260.8")
            .when()
                .request( "POST", "/survey/admin/QElo_filter_response")
            .then()
                // Body.Media-Type=Other
                .statusCode( isBadRequest())
            .extract()
                .response()
                ;
 
        responseValidator.assertBodyValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), response.getContentType(), response.asString());
        responseValidator.assertHeadersValid( "POST", "/survey/admin/QElo_filter_response", response.statusCode(), responseHeaders( response));
    }
.... (rest of the generated test class – including additional tests and the helper methods used above, such as forTestServer() and isSuccess() – omitted for brevity)

By following these steps, Tcases simplifies the process of generating executable tests for your OpenAPI specifications.

How Does It Work?

So, wondering how this magic happens, right? Well, let me break it down for you.

When Tcases gets down to business, it cooks up an executable test faster than you can say “API testing.” What you end up with is a bunch of files that are basically your test program. Build it, run it – it’s that simple.

How does it conjure up this wizardry, you ask? Enter the TestWriter API, the maestro behind the scenes. It brings together three key elements:

  1. Request Test Definition: The foundation of API test generation is laid by a request test definition. This definition outlines the inputs for request test cases and is automatically crafted from an OpenAPI definition.
  2. TestWriter: TestWriter plays a pivotal role by crafting the necessary code tailored to a specific test framework. It translates the abstract test definitions into executable test code. Popular test frameworks like JUnit and TestNG in the Java world fall under this category.
  3. TestCaseWriter: TestCaseWriter is the final piece of the puzzle, which produces code to build and submit API requests using a specific request execution interface. It translates the abstract test case information into API requests using interfaces such as HttpClient or REST Assured in the Java ecosystem.
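To make the flow concrete, here is a minimal, hypothetical sketch of how these three elements could fit together. None of the type or method names below come from the actual Tcases TestWriter API – they are purely illustrative of the pipeline from request test definition, through TestWriter, to TestCaseWriter:

// Hypothetical sketch of the TestWriter pipeline. These type and method
// names are illustrative only -- they are NOT the actual Tcases API.
import java.util.List;

public class TestWriterPipelineSketch {

    // 1. Request test definition: abstract inputs for one request test case,
    //    normally derived automatically from the OpenAPI definition.
    record RequestTestDef(String method, String path, boolean expectSuccess) {}

    // 3. TestCaseWriter: produces the code that builds and submits one API
    //    request (here, a REST Assured-style call chain).
    interface TestCaseWriter {
        String writeTestCase(RequestTestDef testDef);
    }

    // 2. TestWriter: wraps the test cases in framework-specific scaffolding
    //    (here, a JUnit-style class with one test method per case).
    interface TestWriter {
        String writeTestClass(String className, List<RequestTestDef> testDefs, TestCaseWriter caseWriter);
    }

    public static void main(String[] args) {
        TestCaseWriter restAssuredStyle = testDef ->
            "given().when().request(\"" + testDef.method() + "\", \"" + testDef.path()
            + "\").then().statusCode(" + (testDef.expectSuccess() ? "isSuccess()" : "isBadRequest()") + ");";

        TestWriter junitStyle = (className, testDefs, caseWriter) -> {
            StringBuilder src = new StringBuilder("public class " + className + " {\n");
            int testIndex = 0;
            for (RequestTestDef testDef : testDefs) {
                src.append("    @Test public void test").append(testIndex++).append("() { ")
                   .append(caseWriter.writeTestCase(testDef)).append(" }\n");
            }
            return src.append("}\n").toString();
        };

        // Request test definitions like these would come from the OpenAPI definition.
        List<RequestTestDef> testDefs = List.of(
            new RequestTestDef("POST", "/survey/admin/QElo_filter_response", true),
            new RequestTestDef("POST", "/survey/admin/QElo_filter_response", false));

        System.out.println(junitStyle.writeTestClass("HelpSurveyTest", testDefs, restAssuredStyle));
    }
}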

Exploring some of the options to generate tests

Now let’s take a look at some of the options we can use when generating the tests:

1. Select the API server used by generated tests:
When generating tests, it is crucial to specify the API server that will receive requests from the test cases. By default, Tcases utilizes the first API server listed in the OpenAPI definition. However, there are various options available to customize this behavior.

  • Option 1: Select API Server by Index
    To choose a specific API server from the list, you can use the -B index option. In the following example, tests will be generated for the API server at index 2 in the servers list defined in survey-api.yaml:

    tcases-api-test -p org.examples -B index=2 survey-api.yaml
  • Option 2: Select API Server by Description
    Alternatively, you can use the -B contains option to select an API server based on a specific string in its ‘description’:

    tcases-api-test -p org.examples -B contains=Test survey-api.yaml
  • Option 3: Specify API Server URI
    To use a specific API server URI that may not be present in the OpenAPI definition, you can utilize the -B uri option:

    tcases-api-test -p org.examples -B uri=http://localhost:8000 survey-api.yaml

2. Handle an Untrusted API Server:
In scenarios where the API server’s certificate verification is not feasible during testing, Tcases offers the -V option to generate tests that connect to the API without checking the server certificate.

tcases-api-test -p org.examples -V -B uri=http://localhost survey-api.yaml

3. Organize Tests by API Resource Path:
For APIs with numerous endpoints, organizing test cases by resource path can improve manageability. Use the -S option to generate separate test source files for each API resource path:

tcases-api-test -S -n org.examples.Survey survey-api.yaml

4. Exclude Response Validation:
By default, Tcases validates API responses against the OpenAPI definition. To exclude response validation, use the -d false option:

tcases-api-test -p org.examples -d false survey-api.yaml

This flexibility in configuring Tcases for OpenAPI ensures that generated tests align with specific testing needs and server configurations.

Running the Generated Tests

Once you have generated tests using Tcases, it’s essential to understand how to run them. The process involves setting up dependencies, managing test resources, and configuring runtime settings. Let’s break down each step.

1. Set up Test Dependencies
If your generated tests include response validation (which is the default), you’ll need to ensure that the tcases-openapi-test JAR is accessible during compilation and execution. If you are using Maven, you can include the dependency in your project as follows:

<dependency>
    <groupId>io.rest-assured</groupId>
    <artifactId>rest-assured</artifactId>
    <version>4.4.0</version>
</dependency>
<dependency>
    <groupId>org.hamcrest</groupId>
    <artifactId>hamcrest-junit</artifactId>
    <version>2.0.0.0</version>
    <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.cornutum.tcases</groupId>
  <artifactId>tcases-openapi-test</artifactId>
  <version>4.0.2</version>
</dependency>

Alternatively, you can download the JAR directly from the Maven Central Repository.

2. Manage Test Resources
For tests that validate response requirements, Tcases generates response definition files corresponding to each test class. These files are crucial for the tests to run successfully. By default, the response definition files are created in a resource directory determined by the location of the test source file. However, you can specify a different directory using the -d option or, in Maven, -DresourceDir=path.

If using Maven with default settings, the response definition files should be automatically copied to the classpath. Otherwise, adjust command options or settings in your build project to ensure accessibility.
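For example, if the response definition files live in a non-default directory, a sketch of the Maven configuration to put them on the test classpath might look like this (the customResources path is illustrative):

<build>
    <testResources>
        <!-- Keep the default test resources on the classpath -->
        <testResource>
            <directory>${project.basedir}/src/test/resources</directory>
        </testResource>
        <!-- Illustrative path: wherever the response definition files were written -->
        <testResource>
            <directory>${project.basedir}/customResources</directory>
        </testResource>
    </testResources>
</build>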

For greater flexibility, you can override the resource directory setting by using the tcasesApiResourceDir system property. For example:

# Specify a custom resource directory for response definitions
mvn test -Dtest=HelpSurveyTest -DtcasesApiResourceDir=/customResources

3. Override the Default API Server
Generated tests default to using the API server specified during code generation. However, you can override this setting during runtime.
To override the default API server, use the tcasesApiServer system property:

# Specify a custom API server for all test requests
mvn test -Dtest=HelpSurveyTest -DtcasesApiServer=http://localhost:8000

4. Define Credentials for Request Authorization
If your OpenAPI definition specifies security requirements, you need to provide the necessary authorization credentials during test execution. Different security schemes require different settings:

API Key:

mvn test -Dtest=HelpSurveyTest -DtcasesApiKey=YOUR_API_KEY

HTTP Basic Authentication:

mvn test -Dtest=HelpSurveyTest -DtcasesApiUser=yourUsername -DtcasesApiPassword=yourPassword

HTTP Bearer Authentication:

mvn test -Dtest=HelpSurveyTest -DtcasesApiBearer=yourBearerToken

By following these steps, you can run the generated tests. Next, let’s look at some handy tips for using Tcases.

Some handy tips when using Tcases:

1. Tcases is a useful tool for generating API tests to cover the simple cases.

2. While most generated tests are solid, some can be a bit too rigid. For instance, Tcases might insist on a 10-character date format (like 2023-02-02) for successful validation, disregarding shorter formats the API also accepts (e.g., the same date passed as ‘2023-2-2’). Human review is key!

3. The better your OpenAPI specification, the better Tcases performs. Testers can enhance it by adding missing schema definitions or setting examples for request bodies, improving the overall test suite (see the sketch after this list).

4. Not all generated tests are gold. Testers may need to sift through and remove some redundant or overly strict cases that Tcases produces.
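To illustrate tip 3, here is a sketch of how an example could be set for a request body in the OpenAPI definition. The endpoint and field names are taken from the survey API tests shown earlier; the exact schema is an assumption for illustration:

paths:
  /survey/admin/QElo_filter_response:
    post:
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                start_date:
                  type: string
                  format: date
                end_date:
                  type: string
                  format: date
            example:
              start_date: "2022-10-10"
              end_date: "2023-01-20"

With richer schemas and examples like this in place, the generated tests can exercise more realistic request data.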

While Tcases is a nifty tool, it’s not foolproof. By understanding its strengths and limitations, testers can harness its capabilities effectively to generate high quality API tests.

Hire testers from Qxf2

As you explore the power of Tcases for API test generation, remember that Qxf2 offers a team of skilled testers who excel in implementing advanced testing techniques. Our expertise extends beyond test automation, ensuring comprehensive and high-quality testing for your software projects. Reach out to us today for testing that goes beyond the ordinary!

