
How to choose what tests to automate

Today I want to discuss the basic automation question: what exactly do we want to automate?
We can of course be extreme and say "everything" or "nothing", but both of those options are very costly choices: the first takes a lot of resources upfront, while the second makes you pay for it in the end.
I once consulted for a client that had such a complex ROI calculation process that they ended up not automating anything; the process of choosing what to automate was time-consuming by itself and, ironically, was not part of the Return on Investment formula.

With this client we had to find a streamlined process for making quick and easy decisions about automation candidates, and I want to share that process with you.

To better understand the process we will need to answer two simple questions - Why and What:
  1. Why we automate
  2. What we automate

Why do we automate?

Imagine yourself in an epic competition, Human against Machine, in a battle of hammering nails into wood.
You prepare yourself by performing a set of complex stretches, warming your hands and feeling the weight of the hammer; the pure, fresh air of a spring morning gives you the burst of energy that you need. Meanwhile, the other team, a crew of lab folks, fine-tunes the shining robot and adjusts its configuration before the start signal. Then the announcement comes in, but not the one everyone expects.
"Instead of a long, straight board with nails to hammer," says the voice through the loudspeakers, "we have decided to introduce a small change: the board will be snake-shaped."
The crowd murmurs, the lab team scrambles in frantic panic, and you smile to yourself.
Then the boards are replaced with super long, snake-shaped boards with hundreds of nails to hammer, which will probably take you hours to finish, and the gong announces the beginning of the competition.
You jump on the first nail to the crowd's supportive cheers, but your competitor stands still. What happened? Your competitor, it seems, cannot start, as all the tuning and calculations have to be adjusted to the new terrain of the game.
By the time you are halfway through the nails, the lab team has finally gotten its act together, finished the new calculations, and applied the configuration, and the robot is on its way.
"If the boards were shorter, I would have won by now," you think to yourself with a sigh, but unfortunately things are not that easy. In the blink of an eye the robot catches up, passes you, and in another few minutes finishes the competition as the absolute winner.
"Rough start but an amazing finish," announces the voice. And the crowd roars.

So from this story we can clearly see why we want to automate:  


We Automate To Save Time In The Long Term


If we have a short-term project, should we automate? Probably not. If we are in a long-term, multi-release project, should we automate? Absolutely yes.


What do we automate?

Imagine that in the previous story the competition was not about hammering nails but about inventing a story based on 10 randomly requested subjects. While you could come up with some sort of answer easily enough, the lab team would probably need hours, if not days, to program the robot to tell a new story each time.

So we can easily see that the best candidate for automation is:


Repetitive tasks


Back to my client: once we were able to narrow our interest down to "repetitive tests that we will need to execute for many releases in the future", we easily arrived at the right answer every time, keeping the decision-making short and simple.
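The heuristic above can be sketched as a tiny decision helper. This is an illustrative sketch only, not the client's actual tool: the function name, parameters, and thresholds are all hypothetical examples of how you might encode "repetitive and long-term".

```python
# Illustrative sketch of the "repetitive + long-term" automation heuristic.
# Function name, parameters, and thresholds are hypothetical examples.

def should_automate(runs_per_release: int, expected_future_releases: int) -> bool:
    """A test is a good automation candidate if it is repetitive
    (executed on every release) and the project is long-term
    (many releases still ahead)."""
    is_repetitive = runs_per_release >= 1
    is_long_term = expected_future_releases >= 3  # arbitrary example threshold
    return is_repetitive and is_long_term

# A smoke test run on every release of a multi-year product: automate.
print(should_automate(runs_per_release=1, expected_future_releases=10))  # True
# A one-off check for a short-lived demo project: don't automate.
print(should_automate(runs_per_release=1, expected_future_releases=1))   # False
```

The point is not the specific numbers but that the whole decision fits in a few lines, in contrast to the heavyweight ROI process the client started with.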

