
Robotics for Verification and Validation (V&V) Testing

The technology behind Digital Transformation takes many forms. One of them is robotics. While many companies use robotics to complete tasks in manufacturing or customer service (known as Robotic Process Automation), YSoft Labs is looking at robotics to advance the capabilities of quality assurance.
 
As you can imagine, with the increasing applications of IoT, human interactions with devices will be on the rise. These devices need to be tested, and traditional test scripts are no longer adequate. So, let’s look at one way robotics can help advance traditional quality assurance.

Most of us think of robots as doing repetitive tasks: taking an instruction and performing the same action over and over. Robots are well suited to this kind of work, whereas humans need breaks and make mistakes out of boredom. Robots are therefore often used in quality assurance scenarios where features need to be verified again and again.

In verification testing, a robot can test whether a feature works according to specification or not. It is binary: the feature either did what it was specified to do or it didn’t. For example, if I press the power button, does the device power on? If I press it twice, does it power on and then off, or does it do something else? What happens if I hold the button down for a long time?
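
To make the binary nature of verification concrete, here is a minimal sketch of such a check in Python. The DeviceUnderTest driver and its press_button() and read_power_state() methods are hypothetical stand-ins for a real robot and device, not an actual YSoft API.

```python
# A minimal sketch of a binary verification check. DeviceUnderTest is a
# stand-in for a real robot/device driver; its methods are hypothetical.
from dataclasses import dataclass


@dataclass
class DeviceUnderTest:
    powered: bool = False

    def press_button(self, name: str, hold_seconds: float = 0.2) -> None:
        # Simulated spec: a long press always powers off, a short press toggles.
        if name == "power":
            self.powered = False if hold_seconds >= 3.0 else not self.powered

    def read_power_state(self) -> bool:
        return self.powered


def verify_power_button(dut: DeviceUnderTest) -> bool:
    """Verification is binary: the device either follows the spec or it does not."""
    dut.press_button("power")                  # single press -> expect ON
    if not dut.read_power_state():
        return False
    dut.press_button("power")                  # second press -> expect OFF
    if dut.read_power_state():
        return False
    dut.press_button("power")                  # power on again
    dut.press_button("power", hold_seconds=5)  # long press -> expect OFF
    return not dut.read_power_state()


if __name__ == "__main__":
    print("PASS" if verify_power_button(DeviceUnderTest()) else "FAIL")
```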

However, with recent computer vision capabilities and declining sensor costs, robots have become very good at validation testing. In validation testing, the question is not simply whether something happened; it is whether what happened was expected and whether the results meet users’ needs. In some ways, this is akin to using robots for usability testing, but robotics has also evolved to do even more.

Let’s start with usability testing. In its human form, users may be asked to perform a task on a computer or device while being timed, to measure how responsive the user interface is and whether the user can complete the task quickly.
 
Another example would be checking whether the user understood the result of an action and could take the next step in a process. To some extent, regular robots can do this kind of usability testing very well. Humans, however, can typically test only one variable at a time with any accuracy.

When paired with computer vision, sensors, and artificial intelligence, robots can perform many kinds of validation and verification (V&V) tests simultaneously. When a test fails, the robot can recognize the “fail state” and continue with the proper action, capture the situation, reboot and continue testing another area, or stay in the fail state and call for assistance.
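
As a rough illustration of that decision logic, the sketch below maps a detected fail state to one of those recovery actions. The severity categories and the mapping are assumptions for illustration, not part of any real test framework.

```python
# Sketch of fail-state handling: map what the robot observed to a recovery
# action. The severity labels here are illustrative assumptions.
from enum import Enum, auto


class FailAction(Enum):
    RETRY = auto()                   # transient glitch: repeat the step
    CAPTURE_AND_CONTINUE = auto()    # record evidence, move to the next test
    REBOOT_AND_CONTINUE = auto()     # recover the device, test another area
    HOLD_AND_CALL_FOR_HELP = auto()  # stay in the fail state, alert a human


def choose_recovery(severity: str) -> FailAction:
    if severity == "transient":
        return FailAction.RETRY
    if severity == "ui_mismatch":
        return FailAction.CAPTURE_AND_CONTINUE
    if severity == "device_hang":
        return FailAction.REBOOT_AND_CONTINUE
    return FailAction.HOLD_AND_CALL_FOR_HELP


if __name__ == "__main__":
    print(choose_recovery("ui_mismatch"))
```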
 
For example, a robot may be scripted to recognize a screen icon to test the launch of an application on a touch screen. Typically, the script would include the exact position on the screen where the icon could be found. However, if the icon is in a different location, the robot cannot perform the test. In this case, computer vision technology allows the robot to find whether the correct icon is there and where it is. Similarly, if the user interface changes or looks different on another device, the robot must be able to find the icon to continue the test.
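
A common way to implement this kind of icon search is template matching. The sketch below uses OpenCV to locate an icon anywhere in a screenshot rather than at a hard-coded position; the file names and the 0.8 confidence threshold are illustrative assumptions.

```python
# Locate an icon anywhere on the screen with template matching (OpenCV),
# instead of hard-coding its x/y position in the test script.
import cv2


def find_icon(screenshot_path: str, icon_path: str, threshold: float = 0.8):
    screen = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
    icon = cv2.imread(icon_path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(screen, icon, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < threshold:
        return None  # icon not found: report this step as failed
    h, w = icon.shape
    # Return the centre of the match so the robot can tap the icon wherever it is.
    return (top_left[0] + w // 2, top_left[1] + h // 2)


if __name__ == "__main__":
    hit = find_icon("screen.png", "app_icon.png")
    print("Icon centre:", hit if hit else "not found")
```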
 

The real industry challenge is combining verification and validation at the same time. This combined approach is typically impossible to carry out manually, yet with the increasing complexity of software and hardware systems it matters more than ever. In today’s fast-moving environment, there is no room to rely on functionality checks alone (verification) or to split functionality and software and hardware quality into separate areas. When a robotic system can perform validation and verification at the same moment, it provides a faster feedback loop for developers while also supporting greater quality.
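
As a small sketch of what combining the two can look like, the function below records a binary verification result and a timing-based validation result from a single run. The launch_app callable and the two-second expectation are assumptions for illustration, not a real API or specification.

```python
# One pass, two answers: did the action work at all (verification), and did
# it meet the user's expectation for speed (validation)?
import time
from typing import Callable


def launch_and_check(launch_app: Callable[[], bool], max_launch_s: float = 2.0) -> dict:
    start = time.monotonic()
    launched = launch_app()  # verification: binary, against the specification
    elapsed = time.monotonic() - start
    return {
        "verified": launched,
        "validated": launched and elapsed <= max_launch_s,  # meets user expectation
        "launch_time_s": round(elapsed, 3),
    }


if __name__ == "__main__":
    # Placeholder task: in practice the robot would tap the icon and watch
    # the screen with computer vision to confirm the application opened.
    print(launch_and_check(lambda: True))
```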

Using the same scenario, when a human user encounters an unexpected response, such as a screen taking too long to load, the user knows what to do to continue. With computer vision and artificial intelligence, a robot can also learn to take the necessary steps to continue.

Sensors can also help robots in validation testing in other ways. Temperature sensors enable robots to test device behavior under various hot and cold conditions. Proximity, motion, weight, and level sensors, as well as simple counters, can also aid in validation testing.
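
As one sketch of how a temperature sensor might feed a validation test, the function below sweeps a set of chamber temperatures and re-runs a functional check at each one. The set_temperature, read_temperature, and run_functional_test hooks are hypothetical, and chamber settling time is omitted for brevity.

```python
# Sweep a set of temperatures and re-run a functional check at each one.
# All three callables are hypothetical hooks into the test rig.
from typing import Callable, Dict, Iterable


def thermal_validation(
    set_temperature: Callable[[float], None],
    read_temperature: Callable[[], float],
    run_functional_test: Callable[[], bool],
    setpoints_c: Iterable[float] = (-10.0, 25.0, 50.0),
    tolerance_c: float = 1.0,
) -> Dict[float, bool]:
    results = {}
    for target in setpoints_c:
        set_temperature(target)  # in practice, wait for the chamber to settle
        at_temperature = abs(read_temperature() - target) <= tolerance_c
        results[target] = at_temperature and run_functional_test()
    return results
```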
 
Verification and validation testing only scratch the surface of robotic abilities. With YSoft AIVA robotics, you can also do integration and regression testing as well as quality assurance tests.
 
Robotics for Quality Card Reader Testing