Thursday, April 18, 2019

Improve Your Unit Testing 12 Ways

Everyone these days seems to understand that unit tests are an important part of the work of developing software. Yet people struggle with the practice. Here are 12 ways you can immediately improve your unit tests (and, at the bottom, a few ways to go well beyond these 12 rules). 
  1. Select assertions to provide clear error messages.
    Assert.That(x.Equals(y), Is.True) is not it.
    Try Assert.That(x, Is.EqualTo(y)).
    See the difference in how the test tool explains the failure.
    If the failures aren't clear, add a text message to the assertion.
  2. Go for Maximum Clarity
    The rule here is "you read the tests to understand the code, you should never have to read the code in order to understand the tests."
    The tests need to be better documentation than the code or its comments.
    When this is not true, it's generally because the code is doing too many things in a given function. When code is clear and simple, the tests can be also.
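    What "tests as documentation" looks like, sketched in Python with a hypothetical pricing rule (both names are invented for illustration): a reader should learn the rule from the test names and assertions alone.

```python
def discounted_price(price, loyalty_years):
    """Customers with five or more loyalty years get 10% off."""
    return price * 0.9 if loyalty_years >= 5 else price

def test_loyal_customers_get_ten_percent_off():
    assert discounted_price(100.0, loyalty_years=5) == 90.0

def test_new_customers_pay_full_price():
    assert discounted_price(100.0, loyalty_years=0) == 100.0
```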
  3. Watch Out For Side Effects
    There's nothing more puzzling than finding a test that sets a flag to false, calls a function with no parameters and then checks the text of some string that was never mentioned in the fixture or test setup.
    This indicates that your test is testing side-effects.
    This, in turn, indicates that your functions are being used for their side effects rather than their direct effects -- a sign your signal-to-noise ratio and your cohesion are low.
    When these tests are hard to write, it's because the code is hard to understand. 
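    A minimal Python sketch of the smell (the legacy-style function and names are hypothetical): the first test checks a string the reader never saw being set, while the refactored version tests the direct effect.

```python
# Legacy style: the function is used for its side effect on a global.
status_text = ""

def recalculate(flags):
    global status_text
    status_text = "ok" if flags["ready"] else "not ready"

def test_recalculate_by_side_effect():
    recalculate({"ready": False})
    assert status_text == "not ready"   # a string never mentioned in the setup

# The same logic as a direct effect is trivial to test and to read.
def status_for(ready):
    return "ok" if ready else "not ready"

def test_status_for_directly():
    assert status_for(False) == "not ready"
```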
  4. Let namespaces work for you.
    Module.namespace.class.function is the full name of your test. If you can introduce a new namespace to group a bunch of related tests, and doing so will make the test method names stand out more clearly, then do so.
  5. Disconnect your database.
    If the logic you're testing is not written in SQL, then don't test SQL.
    Tests that run against shared databases are notorious for spurious failures when data changes in ways the tests didn't anticipate.
    It is better not to use a database at all, or to ensure the test has its own unique and isolated instance if a database MUST be used.
    Better yet, disconnect tests from databases altogether. Mocks are your friend. You can create interfaces, or mock existing interfaces, so that the tests run faster and the intended results are obvious.
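    For instance, with Python's unittest.mock (the repository interface and function under test are hypothetical): the test exercises the calculation without any database at all.

```python
from unittest import mock

# The code under test depends on a repository interface, not on SQL.
def average_order_value(order_repository):
    totals = order_repository.fetch_order_totals()
    return sum(totals) / len(totals)

def test_average_order_value_without_a_database():
    repo = mock.Mock()
    repo.fetch_order_totals.return_value = [10.0, 20.0, 30.0]
    assert average_order_value(repo) == 20.0
    repo.fetch_order_totals.assert_called_once()
```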
  6. Avoid magic strings, magic numbers, other magic. It is often in writing tests that I realize I need an enum, const, class, or what-have-you.
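    A small before-and-after sketch in Python (the tax rate and names are assumed purely for illustration): writing the test is exactly the moment the unexplained 0.0825 demands a name.

```python
# Before: what does 0.0825 mean?
def total_with_tax_before(subtotal):
    return subtotal * (1 + 0.0825)

# After: the test made the need for a named constant obvious.
SALES_TAX_RATE = 0.0825

def total_with_tax(subtotal):
    return subtotal * (1 + SALES_TAX_RATE)

def test_total_applies_sales_tax():
    assert total_with_tax(200.0) == 200.0 * (1 + SALES_TAX_RATE)
```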
  7. Single Responsibility Principle applies to tests.
    You want tests to isolate failures, not conglomerate them. A test should fail for one reason only, and that reason is the purpose of the test.
    Sometimes you can't avoid some collateral influence, but most of the time you can avoid most of it. The more single-focused the test is, the easier it is to understand and maintain.
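    Sketched in Python with a hypothetical parser: one test per reason to fail, so a broken behavior points at itself instead of hiding inside a conglomerate test with five assertions.

```python
def parse_money(text):
    # Hypothetical function under test, assumed for illustration.
    return float(text.strip().lstrip("$"))

def test_parses_a_plain_amount():
    assert parse_money("5.25") == 5.25

def test_strips_the_dollar_sign():
    assert parse_money("$5.25") == 5.25

def test_ignores_surrounding_whitespace():
    assert parse_money("  5.25 ") == 5.25
```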
  8. Refactor your tests.
     Eliminating duplication and structuring the tests so you can find them is not a waste of time. Test utility packages are very helpful and can reduce the size and complexity of your tests.
  9. Ease up on test-class-per-production-class.
    Try writing one test class per testing scenario, even if it means many test classes per production function.
    A test class provides a shared setup. That suggests that a test class should correspond to some initial condition in which various activities are expected to have context-specific results.
    Therefore a test class describes some system state, from the point of view of the code that is being tested.
    Maybe many test classes per production class can be better understood this way, and trying to force that into test class per production class is a mistake.
    Perhaps test classes should have Friends-style names (as in the sitcom's episode titles), such as "TheOneWhereRossBuysADuck."
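    A sketch of scenario-per-class in Python's unittest (the Cart class is a hypothetical stand-in for production code): each test class is one initial condition, and its setUp establishes that state for every test in it.

```python
import unittest

class Cart:
    # Hypothetical production class, assumed for illustration.
    def __init__(self, items=None):
        self.items = list(items or [])
    def add(self, price):
        self.items.append(price)
    def total(self):
        return sum(self.items)

class TheOneWhereTheCartIsEmpty(unittest.TestCase):
    def setUp(self):
        self.cart = Cart()

    def test_total_is_zero(self):
        self.assertEqual(self.cart.total(), 0)

    def test_adding_one_item_sets_the_total(self):
        self.cart.add(7)
        self.assertEqual(self.cart.total(), 7)

class TheOneWhereTheCartHasTwoItems(unittest.TestCase):
    def setUp(self):
        self.cart = Cart([3, 4])

    def test_total_sums_both_items(self):
        self.assertEqual(self.cart.total(), 7)
```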
  10. Consider the system of names used in tests.
    Compare test names to each other, to the class that contains them, to the context in which they operate. Try to make the names reflect the domain and the situation.
    And, particularly, make the name of the test match well with the assertion, because when the test fails people will typically see the name of the test and the assertion message first; they probably shouldn't have to read the file and the text of the test to know what mistake they just made!
  11. Write the Test First
    If you have written the code first, you will write the tests according to what the code does, from the point of view of the coder who just wrote the code.
    If you write the tests first, then you are beginning as a user of the code, and the tests are helping you define a usable and reasonable API. This results in tests and code that are more readable and more usable.
    This outside-in progression tends to produce better tests and better code, where "better" means that it is harder to misunderstand. Misunderstanding the purpose and effect of code in maintenance tends to introduce defects.
  12. Remember the High-Fidelity Rule
    A critical rule of testing is that the code being tested has no idea whatsoever that it is being tested. If the code checks to see whether it is running in a test, even once, even in a deeply buried conditional corner case, then the code is not reliable. You haven't checked what it will do in production; you've written code that merely passes the test, not code that works regardless of whether it is under test.
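    A Python sketch of the violation and the fix (all names are hypothetical): the first function peeks at its environment, so its production branch is never exercised by the suite; the second takes its collaborator as a parameter and never knows a test double is standing in.

```python
import os

# Violation: the code detects the test and takes a different path.
def charge_card_unreliable(amount):
    if os.environ.get("UNDER_TEST"):
        return "fake-receipt"
    return real_gateway_charge(amount)   # hypothetical, not defined here

# High fidelity: the gateway is injected; the code never knows.
def charge_card(amount, gateway):
    return gateway.charge(amount)

def test_charge_card_with_a_test_double():
    class FakeGateway:
        def charge(self, amount):
            return "receipt-%s" % amount
    assert charge_card(25, FakeGateway()) == "receipt-25"
```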
If you are trying TDD, you should probably skip right to the FIRST qualities of microtests (Fast, Isolated, Repeatable, Self-validating, Timely).


  1. Agreed, my only comment is the example for #1.

    I personally can't stand that Hamcrest syntax anymore. I much prefer the AssertJ format.


  2. Which is fine, of course, Ehel. The concern is about the error message. When we get "expected true but was false" while checking that the result of a calculation was a number less than 2,000,000, it's not very helpful. Anything that says "expected value to be less than 2,000,000 but was 14,232,222" is *better*.