Have you ever had a conversation like this with a developer on your team?
Mara: So…is the IRC meme generator done?
Beck: Yup! All done.
Mara: Ok cool, I’ll take a look at the pull request.
Beck: Oh, wait, I haven’t committed it yet.
If you have, your team probably doesn’t have a consistent definition of the word “done”. At Wattpad, we recently realized that we had this problem. We were going to just write up a checklist that all our developers could consult before declaring “Yup! All done,” but we quickly realized just how important the definition of done is to a company.
To be considered done, anything we do (whether it’s a new feature or a bug fix) should be in the hands of our users (live) and it should make them happy. Yes, this means that a feature could be in production but still not be done. We might, for example, launch a feature in stealth mode at first in order to collect data on performance. Collecting that data is still part of the work for that feature, so it’s not done yet. But that’s not all. Another developer, new to the code for a particular feature or bug fix, should also be able to change it easily and be happy in the process. So we’re done if our users are happy and our developers are happy. Makes sense, right?
We hope to achieve this by practicing the following.
Code Quality Standards. You’ve followed the team guidelines for code quality. Things like always leaving the campground cleaner than the way you found it, making sure there are no new compiler warnings, not repeating code, etc.
Manual Testing. You’ve done some thorough manual testing of your own work (on a real device if it’s mobile). You put on your QA hat and really tried hard to break what you just wrote. With millions of users, if you can think of an edge case, it will happen.
Automated Testing. You’ve written unit and integration tests to help future maintainers of your code understand it more easily. You’re also ensuring that regressions on that code will get caught by your tests later and you’re building a safety net to make refactoring less risky in the future.
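The post doesn’t share Wattpad’s code, but a minimal sketch of what a regression-catching unit test looks like might help. The `slugify` helper and its behavior are invented for illustration:

```python
import unittest


def slugify(title):
    """Hypothetical helper: turn a story title into a URL slug."""
    return "-".join(title.lower().split())


class SlugifyTest(unittest.TestCase):
    def test_basic_title(self):
        self.assertEqual(slugify("My First Story"), "my-first-story")

    def test_collapses_extra_whitespace(self):
        # Guards against repeated spaces producing empty "--" segments,
        # the kind of regression a future refactor could reintroduce.
        self.assertEqual(slugify("My  First   Story"), "my-first-story")


if __name__ == "__main__":
    unittest.main(exit=False)
```

Tests like these double as documentation: a new maintainer can read the assertions to learn what the function promises.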
Performance Testing. Don’t assume your feature will just work when millions of people are using it at the same time. Test it first. Prove that it will work.
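One lightweight way to “prove it first” is a small concurrency harness that reports latency percentiles before real traffic arrives. This sketch uses a stand-in handler and made-up numbers; in practice you’d point it at the real code path or endpoint:

```python
import time
from concurrent.futures import ThreadPoolExecutor


def handle_request(i):
    """Stand-in for the code under test; replace with a real call."""
    time.sleep(0.001)  # simulate ~1 ms of work
    return i


def load_test(workers=50, requests=500):
    """Fire `requests` calls across `workers` threads and report p95 latency."""
    latencies = []

    def timed(i):
        start = time.perf_counter()
        handle_request(i)
        # list.append is thread-safe in CPython, so no lock needed here
        latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(timed, range(requests)))

    latencies.sort()
    p95 = latencies[int(len(latencies) * 0.95)]
    print("p95 latency: %.1f ms" % (p95 * 1000))
    return p95


if __name__ == "__main__":
    load_test()
```

A harness this simple won’t replace a real load-testing tool, but it turns “it’ll probably be fine” into a number you can compare before and after a change.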
Documentation. Yeah, you know how it works. But will another developer six months from now? Document it.
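Documentation doesn’t have to mean a wiki page; often a docstring that states intent (not mechanics) is what saves that developer six months from now. This is an invented example, not Wattpad’s code:

```python
def trending_stories(stories, limit=10):
    """Return the `limit` stories with the most reads in the last day.

    `stories` is a list of (title, reads_last_24h) pairs. Ties keep
    their original order so the list stays stable between refreshes.
    (Invented example for illustration.)
    """
    # sorted() is stable, which is what makes the tie-breaking claim true
    return sorted(stories, key=lambda s: s[1], reverse=True)[:limit]
```

Note the docstring explains the *why* (stable ordering between refreshes), which is exactly the part you forget in six months.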
Code Review. Two brains are better than one. Get a teammate to look at your code; they’ll almost always find something you missed.
Rollout Plan. This doesn’t always apply but if, for example, a schema change is required, you’re not done. Plan how to make that schema change and how to roll back if code hits the fan.
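The post doesn’t prescribe a migration tool, but the core idea is that every change ships alongside the statement that undoes it. A minimal hand-rolled sketch (table and column names invented, SQLite used just to keep it self-contained):

```python
import sqlite3

# Each step pairs the change with the statement that undoes it, so the
# rollback is planned before anything ships.
MIGRATIONS = [
    {
        "up": "CREATE TABLE story_stats (story_id INTEGER, read_count INTEGER)",
        "down": "DROP TABLE story_stats",
    },
]


def migrate(conn, direction="up"):
    """Apply all migrations forward, or unwind them in reverse order."""
    steps = MIGRATIONS if direction == "up" else list(reversed(MIGRATIONS))
    for step in steps:
        conn.execute(step[direction])
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    migrate(conn, "up")    # roll out
    migrate(conn, "down")  # roll back if code hits the fan
```

Real migration frameworks add versioning and bookkeeping on top, but the discipline is the same: you’re not done planning the rollout until the rollback is written too.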
Ship It! You’ve dotted your I’s and crossed your T’s, so you’re probably feeling pretty confident you can actually deliver your work into the hands of your users. Go for it.
Following Up. As mentioned earlier, it’s not done until users are happy. Now is the time to collect feedback and data to determine if they really are happy. You probably also want to monitor error rates and performance to make sure you didn’t break something else. If all checks out, you’re done!
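“Monitor error rates” can start as something very small: a threshold check against a pre-launch baseline. The 1% threshold below is an invented placeholder; you’d pick one from your own baseline data:

```python
def error_rate(errors, requests):
    """Fraction of requests that failed; 0.0 when there was no traffic."""
    return errors / requests if requests else 0.0


def launch_looks_healthy(errors, requests, threshold=0.01):
    """Post-launch gate: flag the rollout if errors exceed the threshold.

    The 1% default is a made-up placeholder, not a recommendation;
    derive the real number from pre-launch measurements.
    """
    return error_rate(errors, requests) <= threshold
```

A check this simple can run on a schedule after every launch, turning “did we break something?” from a feeling into a yes/no answer.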
A company’s definition of done is directly related to the quality of its product. If, for example, your definition of done doesn’t include testing, your product will be buggier than that of a company whose definition does. Thinking you’re done when you’re not is also a great way to incur technical debt. Technical debt won’t necessarily affect the quality of your product, but it will certainly affect your team’s velocity (how much stuff they can get done in a given period of time).
More importantly, defining done helps us decide what to do and when. Without a definition, you really can’t estimate how long something will take. One developer may estimate based on coding the feature, testing it, and writing documentation for it, while another may estimate based solely on coding it. Without a consistent way to estimate, it’s impossible to pick the right number of features to fit into your next sprint. If everyone is on the same page about what it means to be done, you can reliably determine that feature A won’t fit into the next sprint but feature B will. Otherwise you’re just guessing.
It’s also important to have a consistent definition of done across teams. Without that consistency, it’s really difficult to gauge how the teams are performing. If team A writes unit tests but team B doesn’t, team B will look, on the surface, like they’re working faster. But in reality, their code is probably buggier and less maintainable.
We’re really looking forward to seeing how this improves the quality of our product and how much more consistent our schedules will get.