Saturday, 8 October 2011

Who tests the unit tests?

With unit tests having fast become an accepted way of automating testing and reducing test time, the question has to be asked: how are unit tests themselves tested?

A good developer can often be described as lazy, because good developers understand that a little effort invested up front will pay off plenty in the future. It's this philosophy that has driven the wide adoption of unit testing and, specifically, test-driven development (TDD).

The problem is that, like all code, unit tests can themselves contain bugs. A prime example in the .NET Framework:

[TestMethod]
public void TestMethod1()
{
    //
    // TODO: I'm not doing anything!
    //
}

Even though this test clearly doesn't test anything, when executed it passes!
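For contrast, a minimal sketch of what that test might look like once it has something to assert. The `Calculator` class and its `Add` method are hypothetical, assumed purely for illustration and not from the original post:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test, assumed for illustration only.
public class Calculator
{
    public int Add(int a, int b) => a + b;
}

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_TwoPlusTwo_ReturnsFour()
    {
        var calculator = new Calculator();

        // Unlike the empty test, this one can actually fail:
        // break Add (e.g. return a - b) and the test goes red.
        Assert.AreEqual(4, calculator.Add(2, 2));
    }
}
```

Of course, the assertion itself is still code written by a developer, so a wrong expected value (say, `Assert.AreEqual(5, ...)`) simply moves the bug into the test.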

There are plenty of guides out there on how to craft a good unit test, focusing on the properties and structure of a quality test, and of course standard refactoring and quality-control practices still apply. And here lies the problem...

How can unit tests be completely trusted when they are code themselves? Are unit tests simply "passing the buck" of responsibility?

Since unit tests are produced by developers, how can a developer absolutely know that a unit test accurately represents a requirement? Do we test it? How? And if so, who tests the unit tests?

2 comments:

  1. Unit test should fail first before it passes.

    ReplyDelete
  2. I agree, for some reason the logic has been turned on its head when designing the testing framework. Perhaps it's just a throwback from NUnit which the .NET Framework inherited.

    ReplyDelete