
Software Development Maturity Model

Re-writing the buggiest function in the system offers much more benefit than working on a relatively stable area. The same goes for organizations: if the salespeople close dozens of deals but there are no programmers to create the vaporware, spending more money training salespeople won't help. Fix the development team. This is obvious, but the notion of identifying a software organization's weakest area was the germ of the idea outlined below.

I've worked with dozens of organizations through the years, and all have their strengths and weaknesses. I've been asked numerous times to assess "how's our software development process going?" I can certainly dig in and see their weaknesses, but having an external grading system, like the table below, allows for more reasoned recommendations.

The following table summarizes categories of software development maturity within an organization. It was inspired by the CMM (Capability Maturity Model), but in no way tries to emulate it.

To use this table, first assess where the organization stands in each area. In my experience, most organizations won't fit neatly into a single maturity level; they sit at different levels in different areas, better at some things than others. For example, it's common to have source code control working well but no bug tracking system.

Once you have done this assessment, the organization's strengths and weaknesses will be more apparent. The CMM echoes this idea: before an organization can become mature in one area, it should address the other areas first. In practice this simply means focusing on improving the areas where you are weakest, because that is where the investment in organizational change pays off most. That was the idea behind putting this table together.
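
To make the mechanics concrete, here is a minimal sketch in Python of recording an assessment on the 1-3 scale and listing the weakest areas first. The scores are made up for illustration; nothing in the snippet is part of the model itself.

    # Hypothetical assessment: each area from the table, scored 1-3.
    assessment = {
        "requirements, product and project management": 2,
        "programming team": 2,
        "code": 3,
        "quality control": 1,   # e.g. post-its instead of a defect tracker
        "programming tools": 2,
        "source code management and builds": 3,
        "release process": 1,
    }

    # The weakest areas come first; they are where improvement pays off most.
    for area, level in sorted(assessment.items(), key=lambda item: item[1]):
        print(f"Level {level}: {area}")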

Level 1 is the lowest level, and it is often characterized by "surprises". Even so, it describes a software organization that has actually delivered a piece of software, and unfortunately, many organizations are not even at this level.

Level 2 organizations have delivered software repeatedly, but not without significant hiccups; this level is often characterized by "disappointment". There are misfires and abandoned projects, but the software is going out.

Level 3 is the mature software organization that has its people and projects aligned. Opportunities are identified, software is created, and customers are satisfied. There will still be mistakes, but they are identified as early as possible.

requirements, product and project management
Level 1: the working software outlines the requirements; future requirements may be identified, but they are un-prioritized and incomplete; the process is chaotic or ad hoc; quickly changing product vision; success may require heroics; failures; abandoned projects
Level 2: active project management and consistent successes, combined with fire drills and late or buggy releases; published, prioritized requirements (or user stories); the product development process is defined, although not necessarily followed or effective
Level 3: predictable success; agile, efficient, maximizing value; institutionalized and optimizing; both strategic and tactical plans, with the ability to be opportunistic; seldom builds low-value features; shared understanding of the process; ability to re-prioritize requirements efficiently; reliable estimates; risk management

programming team
Level 1: individuals build the products
Level 2: functional team; code ownership; code style guide; shared designs as needed
Level 3: shared code ownership; shared software designs, patterns and strategy; code style compliance; code reviews; professional development and cross-training

code
Level 1: "Hey, it works!"; silo-ed knowledge; code mine-fields and buggy areas; home-grown solutions; unmodifiable legacy areas
Level 2: works well; some legacy problem areas; core business concepts validated and documented; duplicate solutions to the same problems
Level 3: collective ownership; consistency; organized; modern patterns and integrated, modern tools

quality control
Level 1: defect tracking system consists of post-its and to-do lists; for QA, everyone chips in, or "the sales team looks it over"
Level 2: defect tracking system and QA plan; functional tests; may have some automated testing and unit testing; most changes go through system testing, but emergency fixes may skip the process
Level 3: quality control integrated into development; continuous integration, code coverage metrics, and other software metrics as needed; QA plan in place and executed

programming tools
Level 1: tools (editors, compilers, etc.)
Level 2: modern, professional tools
Level 3: unified, modern and professional tools

source code management and builds (configuration management)
Level 1: have the source code in hand; backups; numbered releases
Level 2: a modern tool such as svn, cvs, git, p4, etc.; mostly scriptable builds
Level 3: code changes linked to requirements and bugs; visibility and metrics into the code; fully-scripted and automated builds (see the sketch after this table)

release process
Level 1: determined (or delayed) by software quality and feature completion
Level 2: scheduled based on feature completion; early planning sacrifices features; late sacrifices of quality or ship date
Level 3: releases at regular intervals; always at releasable quality; quality and release date are maintained by sacrificing low-value features
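
As one hedged illustration of the "fully-scripted and automated builds" entry above, a Level 3 build might be reduced to a single entry point like the hypothetical script below. The specific commands (git, pytest, and the Python "build" package) are assumptions for the sketch, not a prescription.

    #!/usr/bin/env python3
    # Hypothetical sketch of a fully scripted build: every step runs from one
    # entry point, so any developer or CI server can produce the same result.
    import subprocess
    import sys

    STEPS = [
        ["git", "pull", "--ff-only"],   # update to the latest code
        ["python3", "-m", "pytest"],    # run the automated test suite
        ["python3", "-m", "build"],     # package a release artifact (assumes the 'build' package)
    ]

    def main() -> int:
        for step in STEPS:
            print("running:", " ".join(step))
            if subprocess.run(step).returncode != 0:
                print("build failed at:", " ".join(step), file=sys.stderr)
                return 1
        print("build succeeded")
        return 0

    if __name__ == "__main__":
        sys.exit(main())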

Copyright (c) 2007-2016 NDP Software. All Rights Reserved.