Using Streaming Data Generator to Create Complex Data Streams
In our previous blog post, I introduced the Streaming Data Generator, developed at Qxf2, and explored its capabilities in creating diverse data streams. Today, I want to delve into a specific, practical application: creating complex data streams by combining data from multiple endpoints. This method can help simulate real-world scenarios that are challenging to replicate with simple data […]
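The excerpt above describes combining data from multiple endpoints into one complex stream. As a rough, stdlib-only sketch of that idea (the generators below are hypothetical stand-ins for the app's HTTP endpoints, not its actual code):

```python
import math
import random

def sine_stream(amplitude=10, frequency=0.1):
    """Yield an endless sine waveform, one sample at a time."""
    t = 0
    while True:
        yield amplitude * math.sin(2 * math.pi * frequency * t)
        t += 1

def noise_stream(low=-1.0, high=1.0):
    """Yield uniform random noise, standing in for a second endpoint."""
    while True:
        yield random.uniform(low, high)

def combined_stream(*streams):
    """Merge several streams by summing their samples pointwise."""
    while True:
        yield sum(next(s) for s in streams)

# Take the first five samples of the combined stream.
combo = combined_stream(sine_stream(), noise_stream())
samples = [next(combo) for _ in range(5)]
```

In the real app the two inputs would come from separate HTTP endpoints; summing (or interleaving) them is one simple way to build a stream that is harder to reproduce with a single generator.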
Enhance Test Reproducibility With Streaming Data Generator App
Welcome to Qxf2’s Streaming Data Generator application! This tool makes it easy to generate and stream various kinds of synthetic data, helping with reproducible testing. Built with FastAPI, our app produces streaming data in different waveforms and distributions. It can also generate data with anomalies such as random, random-square, and clustered patterns. Access this data through HTTP endpoints, making it simple to […]
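To illustrate the kind of data the app streams, here is a minimal stdlib-only sketch of a sine waveform with randomly injected anomalies. This is an illustration of the concept, not the app's actual implementation; the function name and parameters are made up:

```python
import math
import random

def sine_with_anomalies(num_points=100, amplitude=10, frequency=0.05,
                        anomaly_rate=0.05, anomaly_scale=5):
    """Generate a sine waveform, replacing a random fraction of points
    with outsized spikes to mimic 'random' anomalies."""
    data = []
    for t in range(num_points):
        if random.random() < anomaly_rate:
            # Replace this sample with an out-of-range spike.
            value = random.choice([-1, 1]) * amplitude * anomaly_scale
        else:
            value = amplitude * math.sin(2 * math.pi * frequency * t)
        data.append(value)
    return data

stream = sine_with_anomalies()
```

A consumer testing anomaly detection can then check how many of the injected spikes its detector recovers.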
Simplifying Fake Data Generation in Rust using fake crate
Generating realistic test data is often a challenging task in the software industry. However, Rust’s fake crate provides a simple solution to this problem. With the fake crate, data generation in Rust becomes effortless, letting users generate mock data for testing or any other requirement. This blog post delves into the use of the fake crate in […]
Insights from Git Logs for Testing teams
As testers, we are always looking for tools that can help us enhance our testing. Our go-to aids are mostly defect tracking tools, exploratory testing of the product, and documentation. While these are important, what if we could gain a fresh perspective by exploring the development activity? How might this aid us in aligning our testing efforts more […]
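One concrete insight git logs can give testers is file churn: files that change most often are frequently the riskiest places to focus testing. A small sketch of the idea, parsing output shaped like `git log --pretty=format:%h --name-only` (the hashes and file names below are made up for illustration):

```python
from collections import Counter

# Sample output shaped like `git log --pretty=format:%h --name-only`:
# each block is a commit hash followed by the files it touched.
sample_log = """abc123
src/app.py
tests/test_app.py

def456
src/app.py

ghi789
src/utils.py
src/app.py
"""

def churn_by_file(log_text):
    """Count how often each file appears across commits."""
    counts = Counter()
    for block in log_text.strip().split("\n\n"):
        lines = block.strip().splitlines()
        # First line is the commit hash; the rest are touched files.
        for path in lines[1:]:
            counts[path.strip()] += 1
    return counts

hotspots = churn_by_file(sample_log)
```

Here `src/app.py` surfaces as the hotspot, suggesting where regression coverage matters most.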
Data Validation Using Assistants API: Exploring AI-driven approach
This post extends my previous exploration of conducting data validation tasks using Large Language Models like ChatGPT. To provide context, at Qxf2, we execute a series of data quality tests using Great Expectations. Initially, we explored the possibility of employing ChatGPT for these validations, but it faced challenges in performing them effectively. Now, with the recent release of more advanced […]
Testing Charts using GPT-4 with Vision model
This post builds upon my prior exploration of testing charts with Transformers using the Visual Question Answering approach. I had presented charts to Transformers models like Pix2Struct and MatCha from Google (which were specifically trained on charts) and then queried them with questions. The outcomes proved satisfactory when the charts were well-defined with clearly labeled data points. Now, with the recent […]
Testing Charts with Transformers using Visual Question Answering (VQA)
I tried testing charts using VQA. What that means is that I showed several charts to an AI model and had it answer questions about them. My idea was to use these answers as part of test automation. This post will show you what (sort of) worked for me and which techniques did not. I hope people use this […]
Harnessing the Power of Pix2Struct for Testing Images
Machines struggle with verifying things that humans find easy to check. Images, like charts, graphs, diagrams, and maps, are especially challenging for machines to evaluate. In this post, we introduce a method to automatically check important parts of generated images using Pix2Struct, an advanced model from the Transformers library. We were amazed by the model’s effectiveness in our testing of […]
Data Validation with ChatGPT: Trials and Insights
We conducted a study to explore the feasibility of using large language models like ChatGPT to perform validation on numerical data. At Qxf2, we execute a set of data quality tests using Great Expectations. Our goal was to assess how efficiently ChatGPT could carry out these validations instead. To achieve this, I selected two specific scenarios. […]
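For a sense of what such a validation looks like, here is a hand-rolled, pure-Python stand-in for a Great Expectations-style range check. This is not Great Expectations' actual API, and the numeric column is made up for illustration; it only shows the shape of check the study handed to ChatGPT:

```python
def expect_values_between(values, min_value, max_value):
    """Stand-in for a Great Expectations-style range expectation:
    report success plus any values falling outside the range."""
    unexpected = [v for v in values if not (min_value <= v <= max_value)]
    return {"success": not unexpected, "unexpected_values": unexpected}

# Made-up numeric column for illustration.
ages = [23, 35, 41, 29, 132, 38]
result = expect_values_between(ages, 0, 120)
```

The structured result (pass/fail plus the offending values) is exactly the kind of output that is easy to assert on in a test suite, and hard for a free-form LLM response to match reliably.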
Real-time Data Streaming: Neo4j to Flask using Kafka Connect
This post is meant for testers who have to work with Apache Kafka. In this blog, I will show how to build a real-time data streaming pipeline that captures data from Neo4j and streams it to a Flask app using Kafka Connect. The post has a lot of information, so even folks with only superficial knowledge of Apache Kafka can […]