Occasionally referred to as Test First Development, Test First Programming is a beautiful concept that radically changed the way I approach software development.

The ideas of Test First Programming and Test Driven Development are often muddled together.
However, Test First is powerful enough to stand on its own.
I think it’s important to present the concepts separately.

TDD and many other agile practices build on Test First.
This isn’t just about remembering the past.
The lessons learned from Test First are still very important.

Here’s what I’ll cover in this post:

  • Simple Concept
  • Steps for Test First Programming
  • Steps Expanded
  • Why Test First?
  • Blurred lines between TDD and Test First

Simple Concept

At its core, the idea is very simple.

Write the tests before you write the code.

That’s it really.

I’ve seen the steps written out something like this before:

  1. Write the specification
  2. Write the tests to the specification
  3. Write the code until all of the tests pass

That’s essentially it.
But I don’t find these 3 steps that helpful in actually telling me how to do it.

The next bit will present the steps in a way that I find useful to actually get work accomplished.
After that, I’ll talk a bit about why test first is a good idea.

Steps for Test First Programming

  1. Think about what you want to do.
  2. Think about what it looks like from the customer’s perspective, including the API.
  3. Think about how to test it.
  4. Write the happy path test cases.
  5. Write the production code.
  6. Expand the tests for more complete behavior coverage.
  7. Write the production code to make the tests pass.

Each of these is way more complex than the one line implies.
Each of these is done on every feature, every behavior change, every fix.

The steps are in order. It is a progression.
There are also loops, iterations, revisits.

Parallelism is also possible for some steps, and welcome.

More on this as I talk about each step.

Steps Expanded

1. Think about what you want to do.

Collect requirements.
Define the feature.
Try to understand what customer problem the feature is solving.
What are the real core requirements to solve the problem?

Features often expand in scope as development proceeds.
Now is a great time to nail down what the minimum viable scope for this feature is.
If others have input as to extensions for the feature, or “it would be cool if” ideas, great. Go ahead and capture those ideas.

But the first iteration should stick to a minimal scope. You can expand iteratively after you have something working that you could sell.

2. Think about what it looks like from the customer’s perspective, including the API.

If the feature is exposed through an API:

  • What’s the API going to look like?
  • Does this solve the customer problem?
  • Is it clean?
  • Is it awkward?
  • Can you imagine using it?
  • Is it easy to explain?

If the feature has a user interface:

  • What is it going to look like?
  • What input elements are needed?
  • Are they really needed?
  • Is it reasonable for the user to know the information you are asking for?
This is a great time to get a usability engineer involved if you have one available.

Verbally tell someone what the purpose of the feature is, how to use it, and what they should see.
Listen to their questions and feedback.

Anything you want to change after all of this?
Now is the easiest time to change, since there isn’t any production code yet.
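A quick throwaway sketch of the imagined API can make awkwardness obvious before any real code exists. Here the `ShoppingCart` class and its methods are entirely hypothetical, invented for illustration; a stub implementation is included only so the sketch runs:

```python
# A throwaway stub of an imagined shopping-cart API, written only to
# judge how the interface feels from the customer's perspective.

class ShoppingCart:
    def __init__(self):
        self._items = {}

    def add_item(self, name, quantity=1, price=0.0):
        # Store (quantity, price) keyed by item name.
        self._items[name] = (quantity, price)

    def total(self):
        return sum(qty * price for qty, price in self._items.values())


# Imagined usage: is this clean? Is it easy to explain?
cart = ShoppingCart()
cart.add_item("apple", quantity=3, price=0.5)
print(cart.total())  # → 1.5
```

If writing even this small usage snippet feels clumsy, that’s a signal to change the API now, while it’s still cheap.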

3. Think about how to test it.

  • How do we know if the feature works?
  • How do we know if it is not working?
  • What can go wrong? Can we check for that?
  • What is the mission critical part of the feature? How can this be tested?
  • Is any API missing to allow automated tests to interrogate the system for error conditions?
  • What is the riskiest part of the feature? How can this be tested?

For complex features, this step can be something semi-formal.
Write down the information gathered from steps 1-3.
Pass it around to people who might have good input.

Or do this as a quick meeting if you have all the appropriate people in one place.

4. Write the happy path test cases.

Writing the tests allows you to use the API.
If the API is cumbersome, the tests will be a pain to write. If so, change the API. Make it easier to use.

Write enough tests to satisfy the following:

  • The full feature is being tested the way a customer would use it.
  • Every function of the API is being used by at least one test.

Are there functions left over that you didn’t need to use? Can you remove them without limiting the user? Then do so.

Take note of things you know you’ll need to test that weren’t captured in step 3.
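Continuing the hypothetical shopping-cart sketch (the names are invented for illustration), happy-path tests in pytest style are just plain functions with bare asserts. A minimal stand-in implementation is included here only so the example runs; in real Test First work, the tests come before the class exists:

```python
# Happy-path tests, pytest-style: plain functions with bare asserts.

class ShoppingCart:
    """Minimal stand-in implementation so the sketch is runnable."""
    def __init__(self):
        self._items = {}

    def add_item(self, name, quantity=1, price=0.0):
        self._items[name] = (quantity, price)

    def total(self):
        return sum(qty * price for qty, price in self._items.values())


def test_empty_cart_totals_zero():
    # The simplest customer-visible behavior.
    assert ShoppingCart().total() == 0


def test_total_reflects_quantity_and_price():
    cart = ShoppingCart()
    cart.add_item("apple", quantity=3, price=0.5)
    assert cart.total() == 1.5
```

Note that both tests go through the public API exactly the way a customer would, which is the point of step 4.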

5. Write the production code.

Write the production code to make one test pass.
Keep going until all of the tests pass.

If your test framework supports xfail, you can mark all of the tests as xfail at first and remove the mark one at a time.
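With pytest, for example, that looks like the following; the feature and test names are hypothetical, and the stub class exists only to make the sketch self-contained:

```python
import pytest


class ShoppingCart:
    """Minimal stub; coupon support is intentionally unimplemented."""
    def apply_coupon(self, code):
        raise NotImplementedError("coupons not built yet")


@pytest.mark.xfail(reason="coupon support not implemented yet")
def test_coupon_reduces_total():
    # Expected to fail until the feature is built; remove the
    # xfail mark once you start implementing this behavior.
    ShoppingCart().apply_coupon("SPRING10")
```

Running pytest reports this as `xfail` rather than a failure, so your suite stays green while still tracking the unfinished work.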

Take note of things you know you’ll need to test that weren’t captured in steps 3 or 4.

Take note of features you want to add but aren’t needed to make the tests pass. It’s natural to want to add bells and whistles. Just don’t do it now. Capture the ideas somewhere and let that be for a future iteration after the minimal feature is done. Then later you can prioritize these ideas with the rest of the feature expansion ideas.

6. Expand the tests for more complete behavior coverage.

Look at the set of tests you have, the feature, the API.

Be as behavior complete as you can be.
Are there error conditions that haven’t been tested for yet?
Make sure the mission critical and the risk area parts identified in step 3 are fully tested.

Many of these new tests will pass. Great. However, it can be tricky to make sure a passing test is actually exercising the code.
Some people like to sabotage the code to force the test to fail, just to make sure the test is really working.

Of course, some of the new tests will fail. Also great. You’ve discovered a weakness in your code.
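As one sketch of what expanded coverage can look like, here are error-condition tests for the hypothetical shopping cart (names invented for illustration), using `pytest.raises` to assert that bad input is rejected:

```python
import pytest


class ShoppingCart:
    """Minimal stand-in so the sketch runs; rejects non-positive quantities."""
    def __init__(self):
        self._items = {}

    def add_item(self, name, quantity=1, price=0.0):
        if quantity <= 0:
            raise ValueError("quantity must be positive")
        self._items[name] = (quantity, price)


def test_zero_quantity_is_rejected():
    with pytest.raises(ValueError):
        ShoppingCart().add_item("apple", quantity=0)


def test_negative_quantity_is_rejected():
    with pytest.raises(ValueError):
        ShoppingCart().add_item("apple", quantity=-1)
```

These are exactly the kinds of tests that tend to surface weaknesses: if the validation doesn’t exist yet, the tests fail and tell you what to write next.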

Spend the appropriate amount of time on this. And then move on.
I know this is vague. But how much time to spend here really is “it depends”.

If the feature is small, and not crucial to the success of your product, then over testing it is a waste of time.
If the feature is the most important feature of your product that no other competitor can do, then test the snot out of it.

This step is really the difficult one to get right.
Test writing and test case design is a bit of an art form.
I am going to write about some test case design theory and strategy in a future post.

7. Write the production code to make the tests pass.

Pretty self explanatory. Go fix the code.

Why Test First?

For complex software, regardless of whether TDD or Test First is used, thorough tests are essential before you can call your software done.

You can write the tests after you write the code.
But there are really good reasons to write the tests first.

  1. You can use the tests to guide you in what to code next.
  2. You can use the tests to tell you when you are done coding, and help avoid feature creep.
  3. If it’s the same person writing the tests and the production code, then the assumptions you made while writing the code will color how you write the tests and you’ll miss ways to use your code. Writing the tests before you write the code will mitigate this somewhat.
  4. Thinking about and writing tests help you understand the specification.
  5. Cumbersome parts of the API can be fixed before the API is implemented.
  6. Holes and inconsistencies in the specification are found out early.
  7. The tests will get written. If you put off the tests until after your code is “done”, there is a reasonable chance that you will ship that code before the tests are done. And then your customers will find your bugs that your testing could have caught.

Be honest: you’re going to do some testing to make sure your code works anyway. Why not write the functional tests first and use them to help you develop and debug the code as you write it?

Blurred lines between TDD and Test First

This description does sound quite a lot like Test Driven Development, and I do think the line is blurry.
To be honest, my version of TDD resembles what’s presented here more closely than some other versions of TDD do.

I’ll write about many of the flavors of TDD in future posts, including a description of what I use as TDD.

And then I’ll try to remember to swing back around and compare all of them.

As always, let me know what you think, and keep in touch. I love getting email.
