
Importance of Unit Tests

I'm writing a post about creating my first set of unit tests, by myself, for a small console app that a colleague created. To lead up to it, I thought I'd write a brief post explaining what unit tests are, why they're important, and how they can make our lives as QA easier.

Wikipedia defines unit testing as "a method by which individual units of source code, sets of one or more computer program modules together with associated control data, usage procedures, and operating procedures, are tested to determine if they are fit for use". So essentially, in everyday terms, it's testing the smallest possible piece of testable code, in isolation.
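To make that concrete, here's a minimal sketch using Python's standard `unittest` module (the `apply_discount` function is just a made-up example): the "unit" is one small function, and each test checks a single behaviour of it in isolation.

```python
import unittest


def apply_discount(price, percent):
    """Apply a percentage discount to a price - the 'unit' under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class ApplyDiscountTests(unittest.TestCase):
    """Each test exercises one small, isolated behaviour of the unit."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)


if __name__ == "__main__":
    unittest.main()
```

Note how each test is tiny and names the exact behaviour it checks, so when one fails you know immediately which rule was broken.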

Some people would argue that unit tests are a developer's task, but since they're the first form of QA on the code, I feel it's only right that as a QA you play some role in coming up with them. You don't have to be able to write the unit tests yourself, but at the very least you should sit down and come up with the scenarios, and maybe even pair program on writing the tests, if that's what floats your boat.

The overall driving factor behind unit testing is that it's cheaper to fail fast and early: the cost of fixing a bug rises sharply the further down the development cycle it's found, and unit tests give you instant feedback the moment a test fails.


If a unit test fails, it's often easy to debug and quick to see why. Better still, run the unit tests at check-in and fail the check-in if any don't pass. This encourages developers to write better code and keep the unit tests passing, testers only ever see good-quality builds being deployed, and no time is wasted sitting around waiting for a working build.
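As a rough sketch of what "fail the check-in" can look like, here's a hypothetical git pre-commit hook. The `tests/` directory and the Python test runner are assumptions for illustration; substitute `dotnet test`, `mvn test`, or whatever runner your stack actually uses.

```shell
#!/bin/sh
# Hypothetical pre-commit hook: save as .git/hooks/pre-commit and mark it
# executable. It runs the unit tests and rejects the check-in on any failure.
# Assumes a Python project with tests under tests/; swap in your own runner.
if python3 -m unittest discover -s tests; then
    echo "All unit tests passed - check-in accepted."
else
    echo "Unit tests failed - check-in rejected." >&2
    exit 1
fi
```

Most CI servers offer the same gate server-side (a build that runs the tests on every push), which catches the case where someone bypasses local hooks.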

Unit tests are quick and easy to run and maintain. Unlike UI tests, they aren't brittle, because they don't rely on interacting with a user interface. Unit tests ensure that the code does what it is expected to do, whereas acceptance tests ensure that the application does what the business and users expect it to.
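A sketch of that distinction, again in Python with `unittest` (`password_is_strong` is an invented example): the unit test exercises the rule directly, with no browser involved, while the equivalent acceptance test would drive the real sign-up form through the UI.

```python
import unittest


def password_is_strong(password):
    """Business rule under test: at least 8 characters and one digit."""
    return len(password) >= 8 and any(ch.isdigit() for ch in password)


class PasswordRuleUnitTests(unittest.TestCase):
    # Unit level: calls the code directly, runs in milliseconds, and can't
    # break because a button moved or a locator changed.
    def test_short_password_is_rejected(self):
        self.assertFalse(password_is_strong("abc1"))

    def test_password_without_digit_is_rejected(self):
        self.assertFalse(password_is_strong("abcdefgh"))

    def test_long_password_with_digit_is_accepted(self):
        self.assertTrue(password_is_strong("abcdefg1"))


# The acceptance-level check of the same rule would instead open the
# application, fill in the sign-up form, and assert on what the user sees -
# valuable, but slower and far more brittle.

if __name__ == "__main__":
    unittest.main()
```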

It's also important, as a QA, to get involved in coming up with scenarios for unit tests: if a scenario is already covered by a unit test, there may be no need to write and run an acceptance test for it (obviously this is very much specific to the codebase, etc.).



Comments

  1. What if programmers don't want to write unit tests?

    1. Good point. I think you should try to educate them, and the business. Often it's not that developers don't want to write unit tests, but that time constraints stop them from writing them. If you can convince the business and the developers of the value in spending that little bit longer writing unit tests, the benefits will be tenfold.

