{"id":20547,"date":"2023-12-19T07:33:22","date_gmt":"2023-12-19T12:33:22","guid":{"rendered":"https:\/\/qxf2.com\/blog\/?p=20547"},"modified":"2023-12-19T07:33:22","modified_gmt":"2023-12-19T12:33:22","slug":"tcases-auto-generating-api-tests","status":"publish","type":"post","link":"https:\/\/qxf2.com\/blog\/tcases-auto-generating-api-tests\/","title":{"rendered":"Tcases: Auto-generating API Tests"},"content":{"rendered":"<p>Most development teams that <a href=\"https:\/\/qxf2.com\/?utm_source=Tcases&#038;utm_medium=click&#038;utm_campaign=From%20blog\">Qxf2<\/a> works with, describe and document their APIs using Open API specification. This means, there is a set structure for folks to write tests. Handcrafting the simple test cases can be cumbersome and time consuming. Given there is a set structure, we started to look out for solutions that could create API tests based on an Open API specification. We were not looking for a &#8220;complete&#8221; solution but just something that automatically produced the easy test cases &#8211; verifying response codes, validating the response schema structure, exercising all valid request types for an endpoint, etc. In this post, we will cover <a href=\"https:\/\/github.com\/Cornutum\/tcases\" rel=\"noopener\" target=\"_blank\">Tcases<\/a> &#8211; a tool we found useful for our usecase.<\/p>\n<p>Note: You can skip to the end of this article for some caveats regarding using these auto-generated tests.<\/p>\n<h3>Tools for Auto-generating API Tests using OpenAPI Definition<\/h3>\n<p>In our hunt for a cool and budget-friendly way to auto-generate API test cases, we checked out a bunch of tools. We were after something easy to use, not crazy expensive, with good support, and updated frequently. 
Here are the contenders:<\/p>\n<p><strong><a href=\"https:\/\/www.ibm.com\/products\/api-connect\/api-testing\" target=\"_blank\" rel=\"noopener\">IBM API Connect<\/a>:<\/strong> Yup, it&#8217;s user-friendly and can crank out test cases like a champ. But it comes with a price tag.<\/p>\n<p><strong><a href=\"https:\/\/github.com\/EMResearch\/EvoMaster\" target=\"_blank\" rel=\"noopener\">Evomaster<\/a>:<\/strong> It&#8217;s sleek, user-friendly, and the best part: it&#8217;s free. However, the catch is that the test cases generated in black-box mode were more about testing the data retrieved from the API requests than about the structure of the received data.<\/p>\n<p><strong><a href=\"https:\/\/github.com\/meqaio\/swagger_meqa\" target=\"_blank\" rel=\"noopener\">Swagger_meqa<\/a>:<\/strong> It&#8217;s free, which is sweet, but the support seems to be non-existent, and the last time it got an update was six years ago.<\/p>\n<p><strong><a href=\"https:\/\/github.com\/schemathesis\/schemathesis\" target=\"_blank\" rel=\"noopener\">Schemathesis<\/a>:<\/strong> Now, this one is intriguing. It&#8217;s user-friendly, absolutely free, and stays fresh with regular updates. The only hitch is that it needs the OpenAPI specification every time it runs the tests. We have used this tool while writing some contract tests, and we like it.<\/p>\n<p><strong><a href=\"https:\/\/github.com\/Cornutum\/tcases\" target=\"_blank\" rel=\"noopener\">Tcases<\/a>:<\/strong> Here&#8217;s another interesting player. It&#8217;s user-friendly, absolutely free, boasts good support, and the developers are keeping it spicy with regular updates as well.<\/p>\n<p>In the end, the showdown came down to Schemathesis and Tcases as our top picks, considering user-friendliness, regular updates, and cost. 
However, for the deep dive in this blog, we&#8217;ll be shining the spotlight on Tcases.<\/p>\n<h3>Setting up Tcases<\/h3>\n<p>Setting up Tcases for your command line usage is a straightforward process that involves downloading the binary distribution file from the Maven Central Repository. Follow these simple steps to get started:<\/p>\n<p>1. Navigate to the Central Repository page for tcases-shell:<br \/>\nVisit the Maven Central Repository and locate the page for <a href=\"https:\/\/central.sonatype.com\/artifact\/org.cornutum.tcases\/tcases-shell\/4.0.2\/versions\" target=\"_blank\" rel=\"noopener\">tcases-shell<\/a>.<\/p>\n<p>2. Select the Latest Version:<br \/>\nOnce on the tcases-shell page, identify the entry for the latest version. Click on &#8220;Browse&#8221; to explore the available files for that version.<\/p>\n<p>3. Download the Distribution ZIP or TAR file:<br \/>\nChoose the desired distribution format by clicking on either &#8220;tcases-shell-${version}.zip&#8221; for ZIP or &#8220;tcases-shell-${version}.tar.gz&#8221; for TAR. Download the selected file to your machine.<\/p>\n<p>4. Extract the Contents:<br \/>\nUnpack the downloaded distribution file to a directory of your choice. This directory will now be referred to as your &#8220;Tcases home directory.&#8221; The extraction process creates a &#8220;Tcases release directory&#8221; (e.g., tcases-m.n.r) containing all the necessary files for this release.<\/p>\n<p>5. Explore the Release Directory:<br \/>\nInside the release directory, you&#8217;ll find several subdirectories:<\/p>\n<ul>\n<li><strong>bin:<\/strong> Contains executable shell scripts used to run Tcases.<\/li>\n<li><strong>docs:<\/strong> Houses the user guide, examples, and Javadoc.<\/li>\n<li><strong>lib:<\/strong> Holds all the JAR files essential for running Tcases.<\/li>\n<\/ul>\n<p>6. 
Adjust the PATH Environment Variable:<br \/>\nTo finalize the setup, add the path to the bin subdirectory to the PATH environment variable on your system. This step ensures that you can execute Tcases from any location on your command line.<\/p>\n<p>By following these steps, you&#8217;ve successfully set up Tcases on your system. Now, you&#8217;re ready to use its command-line capabilities for test case generation.<\/p>\n<h3>Generating Executable Tests<\/h3>\n<p>Tcases provides a powerful command-line interface, <code>tcases-api-test<\/code>, allowing users to easily generate executable tests for their OpenAPI specifications.<\/p>\n<p>1. To run the <code>tcases-api-test<\/code> command, navigate to your shell command line and execute the following command. In this example, we will generate test cases for Qxf2&#8217;s <a href=\"https:\/\/github.com\/qxf2\/qxf2-survey\" rel=\"noopener\" target=\"_blank\">Survey Application<\/a>:<\/p>\n<pre lang=\"bash\">tcases-api-test -o src\/test\/java\/org\/examples survey-api.yaml\r\n<\/pre>\n<p>Make sure to replace <code>survey-api.yaml<\/code> with the actual path to your OpenAPI definition file. For further details on command syntax, you can run:<\/p>\n<pre lang=\"bash\">tcases-api-test -help<\/pre>\n<p>2. Upon execution, <code>tcases-api-test<\/code> performs the following steps:<\/p>\n<ul>\n<li>Reading the API Definition: The tool reads the API definition from the specified YAML file (in this example, survey-api.yaml).<\/li>\n<li>Generating Test Cases: It then generates test cases for each defined request. The generation process involves creating both valid and failure test cases.<\/li>\n<li>Writing API Tests: The tool then writes JUnit tests for the generated cases. By default, it uses REST Assured to execute requests. 
The resulting test class is named based on the title of the OpenAPI definition and is stored in the specified destination directory (src\/test\/java\/org\/examples in this example).<\/li>\n<\/ul>\n<p>3. After running <code>tcases-api-test<\/code>, a summary of the generation process can be viewed in the <code>tcases-api-test.log<\/code> file. Additionally, the generated source code for the test class, such as <code>HelpSurveyTest.java<\/code>, can be inspected. This class includes methods for testing each defined API request, incorporating REST Assured for request execution and validation.<\/p>\n<pre lang=\"java\">\r\npackage org.examples;\r\nimport org.junit.Test;\r\nimport java.util.Map;\r\nimport static java.util.stream.Collectors.toMap;\r\nimport io.restassured.http.Header;\r\nimport io.restassured.response.Response;\r\nimport org.cornutum.tcases.openapi.test.ResponseValidator;\r\nimport org.hamcrest.Matcher;\r\nimport static io.restassured.RestAssured.*;\r\nimport static org.hamcrest.Matchers.*;\r\n\r\npublic class HelpSurveyTest {\r\n\r\n    private ResponseValidator responseValidator = new ResponseValidator( getClass());\r\n\r\n    @Test\r\n    public void postSurveyAdminQElo_filter_response_BodyDefined_Is_Yes() {\r\n        Response response =\r\n            given()\r\n                .baseUri( forTestServer( \"http:\/\/localhost:8000\"))\r\n                .header( \"User\", tcasesApiKey())\r\n                .contentType( \"application\/json\")\r\n                .request().body( \"{\\\"end_date\\\":\\\"2022-10-10\\\",\\\"start_date\\\":\\\"2023-01-20\\\"}\")\r\n            .when()\r\n                .request( \"POST\", \"\/survey\/admin\/QElo_filter_response\")\r\n            .then()\r\n                .statusCode( isSuccess())\r\n            .extract()\r\n                .response()\r\n                ;\r\n\r\n        responseValidator.assertBodyValid( \"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), response.getContentType(), 
response.asString());\r\n        responseValidator.assertHeadersValid( \"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), responseHeaders( response));\r\n    }\r\n\r\n    @Test\r\n    public void postSurveyAdminQElo_filter_response_BodyDefined_Is_No() {\r\n        Response response =\r\n            given()\r\n                .baseUri( forTestServer( \"http:\/\/localhost:8000\"))\r\n                .header( \"User\", tcasesApiKey())\r\n            .when()\r\n                .request( \"POST\", \"\/survey\/admin\/QElo_filter_response\")\r\n            .then()\r\n                \/\/ Body.Defined=No\r\n                .statusCode( isBadRequest())\r\n            .extract()\r\n                .response()\r\n                ;\r\n\r\n        responseValidator.assertBodyValid( \"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), response.getContentType(), response.asString());\r\n        responseValidator.assertHeadersValid( \"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), responseHeaders( response));\r\n    }\r\n\r\n    @Test\r\n    public void postSurveyAdminQElo_filter_response_BodyMediaType_Is_Other() {\r\n        Response response =\r\n            given()\r\n                .baseUri( forTestServer( \"http:\/\/localhost:8000\"))\r\n                .header( \"User\", tcasesApiKey())\r\n                .contentType( \"text\/xml\")\r\n                .request().body( \"-260.8\")\r\n            .when()\r\n                .request( \"POST\", \"\/survey\/admin\/QElo_filter_response\")\r\n            .then()\r\n                \/\/ Body.Media-Type=Other\r\n                .statusCode( isBadRequest())\r\n            .extract()\r\n                .response()\r\n                ;\r\n\r\n        responseValidator.assertBodyValid( \"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), response.getContentType(), response.asString());\r\n        responseValidator.assertHeadersValid( 
\"POST\", \"\/survey\/admin\/QElo_filter_response\", response.statusCode(), responseHeaders( response));\r\n    }\r\n....\r\n....\r\n....\r\n<\/pre>\n<p>By following these steps, Tcases simplifies the process of generating executable tests for your OpenAPI specifications.<\/p>\n<h3>How Does It Work?<\/h3>\n<p>So, wondering how this magic happens, right? Well, let me break it down for you.<\/p>\n<p>When Tcases gets down to business, it cooks up an executable test faster than you can say &#8220;API testing.&#8221; What you end up with is a bunch of files that are basically your test program. Build it, run it &#8211; it&#8217;s that simple.<\/p>\n<p>How does it conjure up this wizardry, you ask? Enter the TestWriter API, the maestro behind the scenes. It brings together three key elements.<\/p>\n<ol>\n<li><strong>Request Test Definition:<\/strong> The foundation of API test generation is laid by a request test definition. This definition outlines the inputs for request test cases and is automatically crafted from an OpenAPI definition.<\/li>\n<li><strong>TestWriter:<\/strong> TestWriter plays a pivotal role by crafting the necessary code tailored to a specific test framework. It translates the abstract test definitions into executable test code. Popular test frameworks like JUnit and TestNG in the Java world fall under this category.<\/li>\n<li><strong>TestCaseWriter:<\/strong> TestCaseWriter is the final piece of the puzzle, which produces code to build and submit API requests using a specific request execution interface. It translates the abstract test case information into API requests using interfaces such as HttpClient or REST Assured in the Java ecosystem.<\/li>\n<\/ol>\n<h3>Exploring some of the options to generate tests<\/h3>\n<p>Now let&#8217;s take a look at some of the options we can use when generating the tests<\/p>\n<p>1. 
<strong>Select the API server used by generated tests:<\/strong><br \/>\nWhen generating tests, it is crucial to specify the API server that will receive requests from the test cases. By default, Tcases uses the first API server listed in the OpenAPI definition. However, there are various options available to customize this behavior.<\/p>\n<ul>\n<li>Option 1: Select API Server by Index<br \/>\nTo choose a specific API server from the list, you can use the <code>-B index<\/code> option. In the following example, tests will be generated for the API server at index 2 in the servers list defined in <code>survey-api.yaml<\/code>:<\/p>\n<pre lang=\"bash\">tcases-api-test -p org.examples -B index=2 survey-api.yaml\r\n<\/pre>\n<\/li>\n<li>Option 2: Select API Server by Description<br \/>\nAlternatively, you can use the <code>-B contains<\/code> option to select an API server based on a specific string in its <code>description<\/code>:<\/p>\n<pre lang=\"bash\">tcases-api-test -p org.examples -B contains=Test survey-api.yaml\r\n<\/pre>\n<\/li>\n<li>Option 3: Specify API Server URI<br \/>\nTo use a specific API server URI that may not be present in the OpenAPI definition, you can use the <code>-B uri<\/code> option:<\/p>\n<pre lang=\"bash\">tcases-api-test -p org.examples -B uri=http:\/\/localhost:8000 survey-api.yaml\r\n<\/pre>\n<\/li>\n<\/ul>\n<p>2. <strong>Handle an Untrusted API Server:<\/strong><br \/>\nIn scenarios where the API server&#8217;s certificate verification is not feasible during testing, Tcases offers the <code>-V<\/code> option to generate tests that connect to the API without checking the server certificate.<\/p>\n<pre lang=\"bash\">tcases-api-test -p org.examples -V -B uri=http:\/\/localhost survey-api.yaml\r\n<\/pre>\n<p>3. <strong>Organize Tests by API Resource Path:<\/strong><br \/>\nFor APIs with numerous endpoints, organizing test cases by resource path can improve manageability. 
Use the <code>-S<\/code> option to generate separate test source files for each API resource path:<\/p>\n<pre lang=\"bash\">tcases-api-test -S -n org.examples.Survey survey-api.yaml\r\n<\/pre>\n<p>4. <strong>Exclude Response Validation:<\/strong><br \/>\nBy default, Tcases validates API responses against the OpenAPI definition. To exclude response validation, use the <code>-d false<\/code> option:<\/p>\n<pre lang=\"bash\">tcases-api-test -p org.examples -d false survey-api.yaml\r\n<\/pre>\n<p>This flexibility in configuring Tcases ensures that generated tests align with specific testing needs and server configurations.<\/p>\n<h3>Running the Generated Tests<\/h3>\n<p>Once you have generated tests using Tcases, it&#8217;s essential to understand how to run them. The process involves setting up dependencies, managing test resources, and configuring runtime settings. Let&#8217;s break down each step.<\/p>\n<p>1. Set up Test Dependencies<br \/>\nIf your generated tests include response validation (which is the default), you&#8217;ll need to ensure that the <code>tcases-openapi-test<\/code> JAR is accessible during compilation and execution. If you are using Maven, you can include the dependencies in your project as follows:<\/p>\n<pre lang=\"xml\">\r\n<dependency>\r\n    <groupId>io.rest-assured<\/groupId>\r\n    <artifactId>rest-assured<\/artifactId>\r\n    <version>4.4.0<\/version>\r\n<\/dependency>\r\n<dependency>\r\n    <groupId>org.hamcrest<\/groupId>\r\n    <artifactId>hamcrest-junit<\/artifactId>\r\n    <version>2.0.0.0<\/version>\r\n    <scope>test<\/scope>\r\n<\/dependency>\r\n<dependency>\r\n    <groupId>org.cornutum.tcases<\/groupId>\r\n    <artifactId>tcases-openapi-test<\/artifactId>\r\n    <version>4.0.2<\/version>\r\n<\/dependency>\r\n<\/pre>\n<p>Alternatively, you can download the JAR directly from the Maven Central Repository.<\/p>\n<p>2. Manage Test Resources<br \/>\nFor tests that validate response requirements, Tcases generates response definition files corresponding to each test class. 
These files are crucial for the tests to run successfully. By default, the response definition files are created in a resource directory determined by the location of the test source file. However, you can specify a different directory using the <code>-d<\/code> option or, in Maven, <code>-DresourceDir=path<\/code>.<\/p>\n<p>If you are using Maven with default settings, the response definition files should be automatically copied to the classpath. Otherwise, adjust command options or settings in your build project to ensure accessibility.<\/p>\n<p>For greater flexibility, you can override the resource directory setting by using the <code>tcasesApiResourceDir<\/code> system property. For example:<\/p>\n<pre lang=\"bash\"># Specify a custom resource directory for response definitions\r\nmvn test -Dtest=HelpSurveyTest -DtcasesApiResourceDir=\/customResources\r\n<\/pre>\n<p>3. Override the Default API Server<br \/>\nGenerated tests default to using the API server specified during code generation. However, you can override this setting at runtime.<br \/>\nTo override the default API server, use the <code>tcasesApiServer<\/code> system property:<\/p>\n<pre lang=\"bash\"># Specify a custom API server for all test requests\r\nmvn test -Dtest=HelpSurveyTest -DtcasesApiServer=http:\/\/localhost:8000\r\n<\/pre>\n<p>4. Define Credentials for Request Authorization<br \/>\nIf your OpenAPI definition specifies security requirements, you need to provide the necessary authorization credentials during test execution. 
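The generated tests shown earlier obtain this value through a <code>tcasesApiKey()<\/code> helper. Conceptually, such a helper simply resolves the credential from a JVM system property; the following is a simplified, illustrative sketch (not the exact code that Tcases generates):<\/p>\n<pre lang=\"java\">\r\n\/\/ Illustrative sketch only: resolve an API key from the \"tcasesApiKey\"\r\n\/\/ system property, as supplied via -DtcasesApiKey=... on the mvn command line.\r\npublic class ApiKeyResolver {\r\n\r\n    public static String tcasesApiKey() {\r\n        String key = System.getProperty( \"tcasesApiKey\");\r\n        if( key == null) {\r\n            \/\/ Fail fast when the credential was not supplied at runtime\r\n            throw new IllegalStateException( \"No value defined for the tcasesApiKey system property\");\r\n        }\r\n        return key;\r\n    }\r\n}\r\n<\/pre>\n<p>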
Different security schemes require different settings:<\/p>\n<p>API Key:<\/p>\n<pre lang=\"bash\">mvn test -Dtest=HelpSurveyTest -DtcasesApiKey=YOUR_API_KEY\r\n<\/pre>\n<p>HTTP Basic Authentication:<\/p>\n<pre lang=\"bash\">mvn test -Dtest=HelpSurveyTest -DtcasesApiUser=yourUsername -DtcasesApiPassword=yourPassword\r\n<\/pre>\n<p>HTTP Bearer Authentication:<\/p>\n<pre lang=\"bash\">mvn test -Dtest=HelpSurveyTest -DtcasesApiBearer=yourBearerToken\r\n<\/pre>\n<p>By following these steps, you can run the generated tests. Next, let&#8217;s look at some tips for using Tcases.<\/p>\n<h3>Some Handy Tips When Using Tcases<\/h3>\n<p>1. Tcases is a useful tool for generating API tests to cover the simple cases.<\/p>\n<p>2. While most generated tests are solid, some can be a bit too rigid. For instance, Tcases might insist on a 10-character date format (like 2023-02-02) for successful validation, disregarding the flexibility of shorter formats (e.g., the same date can also be passed as &#8216;2023-2-2&#8217;). Human review is key!<\/p>\n<p>3. The better your OpenAPI specification, the better Tcases performs. Testers can enhance it by adding missing schema definitions or setting examples for request bodies, improving the overall test suite.<\/p>\n<p>4. Not all generated tests are gold. Testers may need to sift through and remove some redundant or overly strict cases that Tcases produces.<\/p>\n<p>While Tcases is a nifty tool, it&#8217;s not foolproof. By understanding its strengths and limitations, testers can harness its capabilities effectively to generate high-quality API tests.<\/p>\n<h3>Hire testers from Qxf2<\/h3>\n<p>As you explore the power of Tcases for API test generation, remember that Qxf2 offers a team of skilled testers who excel in implementing advanced testing techniques. Our expertise extends beyond test automation, ensuring comprehensive and high-quality testing for your software projects. 
<a href=\"https:\/\/qxf2.com\/contact?utm_source=Tcases&#038;utm_medium=click&#038;utm_campaign=From%20blog\">Reach out to us<\/a> today for testing that goes beyond the ordinary!<\/p>\n<hr>\n","protected":false},"excerpt":{"rendered":"<p>Most development teams that Qxf2 works with, describe and document their APIs using Open API specification. This means, there is a set structure for folks to write tests. Handcrafting the simple test cases can be cumbersome and time consuming. Given there is a set structure, we started to look out for solutions that could create API tests based on an [&hellip;]<\/p>\n","protected":false},"author":29,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[43,395,394],"tags":[],"class_list":["post-20547","post","type-post","status-publish","format-standard","hentry","category-api-testing","category-openapi","category-tcases"],"_links":{"self":[{"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/posts\/20547","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/users\/29"}],"replies":[{"embeddable":true,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/comments?post=20547"}],"version-history":[{"count":28,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/posts\/20547\/revisions"}],"predecessor-version":[{"id":20689,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/posts\/20547\/revisions\/20689"}],"wp:attachment":[{"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/media?parent=20547"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/categories?post=20547"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/qxf2.com\/blog\/wp-json\/wp\/v2\/tags?post=20547"}],"curies":[
{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}