Sequel: how methods can help us to assure quality

Our last post centered on the variety of tools available to assist engineers in controlling and improving product quality. Let’s now see how methods can help us to assure quality.

The methods

Development starts with a requirements analysis, or with writing good user stories if you prefer the agile style. A fixed scheme for capturing requirements helps to pin down their essentials and serves as a reminder of what one tends to forget, such as motivations, priorities, prerequisites, and dependencies, as well as non-functional requirements and non-requirements. Incomplete or even wrong requirements are a sure recipe for disaster and guarantee bad quality, no matter how good the subsequent process is.
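Such a capture scheme can be as lightweight as a structured record. The sketch below is a hypothetical illustration (not the actual PTV scheme): a small data class whose fields force the author to think about the details that tend to be forgotten.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """Hypothetical capture scheme for one requirement."""
    identifier: str
    description: str
    motivation: str          # why the stakeholder needs this
    priority: str            # e.g. "must", "should", "could"
    prerequisites: list = field(default_factory=list)
    dependencies: list = field(default_factory=list)
    non_functional: list = field(default_factory=list)   # e.g. latency limits
    non_requirements: list = field(default_factory=list) # explicitly out of scope

    def is_complete(self) -> bool:
        # A requirement without motivation or priority is a warning sign.
        return bool(self.description and self.motivation and self.priority)

req = Requirement(
    identifier="REQ-42",
    description="Compute a route between two coordinates",
    motivation="Core use case of the routing service",
    priority="must",
    non_functional=["response within 500 ms for typical requests"],
)
print(req.is_complete())  # True
```

The point is not the particular fields but that every requirement is checked against the same checklist before development starts.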

The design of the solution is the first step done exclusively by developers. The design quality can have a tremendous impact on the product quality, or at least on how easy it becomes to assure it. This begins with the very basics, such as appropriate modularity. The divide-and-conquer approach not only enables teamwork but also makes it possible to test, maintain, debug, and possibly replace parts independently.
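A minimal sketch of what modularity buys you, using invented names rather than real PTV xServer interfaces: a component that depends only on a narrow interface can be tested with a stand-in module and later replaced without touching its callers.

```python
from abc import ABC, abstractmethod

class Geocoder(ABC):
    """Narrow interface: callers depend on this, not on a concrete module."""
    @abstractmethod
    def locate(self, address: str) -> tuple: ...

class RoutePlanner:
    # Depends only on the Geocoder interface, so the geocoding module
    # can be tested, debugged, or swapped out independently.
    def __init__(self, geocoder: Geocoder):
        self.geocoder = geocoder

    def plan(self, start: str, dest: str) -> tuple:
        return (self.geocoder.locate(start), self.geocoder.locate(dest))

class FakeGeocoder(Geocoder):
    """Stand-in for unit tests; no network access, no map data."""
    def locate(self, address: str) -> tuple:
        return {"Karlsruhe": (49.0, 8.4), "Berlin": (52.5, 13.4)}[address]

planner = RoutePlanner(FakeGeocoder())
print(planner.plan("Karlsruhe", "Berlin"))  # ((49.0, 8.4), (52.5, 13.4))
```

The same `RoutePlanner` runs unchanged against a production geocoder; only the module behind the interface differs.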

In a model-driven architecture, lots of repetitive code can be generated automatically, guaranteeing consistency and reducing the chance of manual errors, typically of the copy and paste variety. In fact, large parts of the PTV xServer data model and communication infrastructure are automatically generated from UML.
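As a toy illustration of the idea (not the actual UML toolchain used for the PTV xServer), a declarative model can be turned into repetitive code mechanically, so every generated class follows the same pattern by construction:

```python
# Hypothetical mini-model: class names mapped to (field, type) pairs.
MODEL = {
    "Waypoint": [("latitude", "float"), ("longitude", "float")],
    "Route": [("distance_m", "int"), ("duration_s", "int")],
}

def generate_class(name, fields):
    """Emit consistent boilerplate for one model class."""
    params = ", ".join(f"{f}: {t}" for f, t in fields)
    body = "\n".join(f"        self.{f} = {f}" for f, _ in fields)
    return f"class {name}:\n    def __init__(self, {params}):\n{body}\n"

source = "\n".join(generate_class(n, fs) for n, fs in MODEL.items())
namespace = {}
exec(source, namespace)  # compile the generated code

wp = namespace["Waypoint"](49.0, 8.4)
print(wp.latitude)  # 49.0
```

Changing the model regenerates all classes at once, which is exactly where manual copy-and-paste errors would otherwise creep in.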

In design as in coding, the “keep it simple, stupid” (KISS) principle is key to creating a maintainable and testable solution. The fewer pieces you have, the better. If you cannot reduce the number of pieces, at least minimize the “moving” ones: do things statically whenever feasible. Related to the KISS principle is the principle of postponing optimizations until they are actually needed. Quoting the famous Donald E. Knuth: “Premature optimization is the root of all evil.”
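“Doing things statically” can be as simple as the following sketch (a hypothetical example): a fixed, immutable table decided at build time has fewer failure modes than configuration loaded and parsed at runtime.

```python
# Static: the set of supported routing profiles is fixed at build time.
# No config file to parse, no reload logic, nothing that can change
# underneath a running test. (Hypothetical example, not PTV code.)
SUPPORTED_PROFILES = frozenset({"car", "truck", "pedestrian"})

def validate_profile(profile: str) -> str:
    """Reject unknown profiles early, with a clear error."""
    if profile not in SUPPORTED_PROFILES:
        raise ValueError(f"unknown routing profile: {profile}")
    return profile

print(validate_profile("truck"))  # truck
```

The dynamic alternative (profiles read from a mutable config store) is only worth its moving parts once there is a real requirement for it.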

Reuse of existing solutions is another principle. While it is often necessary to reinvent some wheels (after all, we no longer use wooden wheels, do we?), you should actively try to reuse previous solutions. What you gain is the time usually required for these solutions to mature. If you have no previous solution, it is a good idea to buy one. Of course, you often don’t even need to buy one: if licenses permit, we make liberal use of free open source software. In the PTV xServer, over fifty carefully chosen open source tools and libraries are used for development or are integrated in the final product.

Preventing errors is only one part of the design work. Other quality aspects include appropriate error detection, fault tolerance, as well as failover and recovery mechanisms. Useful error logs, watchdogs, decoupled processes, and automatic restarts usually do not come for free but must be explicitly designed.
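A minimal sketch of such an explicitly designed mechanism, assuming a hypothetical worker command rather than any real PTV component: a watchdog that logs each failure of a decoupled worker process and restarts it a bounded number of times.

```python
import logging
import subprocess
import sys
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("watchdog")

def supervise(cmd, max_restarts=3, delay=1.0):
    """Minimal watchdog sketch: run a worker process, restart it on
    unexpected exit, and log every failure for later diagnosis."""
    for attempt in range(1, max_restarts + 2):  # initial run + restarts
        proc = subprocess.Popen(cmd)
        if proc.wait() == 0:
            log.info("worker finished cleanly")
            return True
        log.warning("worker died with nonzero exit (attempt %d)", attempt)
        time.sleep(delay)
    log.error("giving up after %d restarts", max_restarts)
    return False

# A worker that exits cleanly needs no restart:
print(supervise([sys.executable, "-c", "raise SystemExit(0)"]))  # True
```

None of this appears by accident: the restart limit, the delay, and what gets logged are design decisions that have to be made up front.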

Extreme programming methods benefit coding and testing quality. We often program in pairs, benefitting from their built-in quality control and exchange of know-how. Of course, we change pairs depending on the skills needed, or split them up when a task becomes straightforward. High test coverage even allows us to change large parts of the software without having to fear catastrophic side-effects – a process known as refactoring. Regular refactoring is the only way to counter software aging – the decay of your software’s fitness as the environments, technologies, and expectations around it change.
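A tiny illustration of how tests make refactoring safe (an invented example, not PTV code): the old and the refactored implementation must behave identically, and the tests pin that down before the old version is deleted.

```python
def mean_old(values):
    """Original, hand-rolled implementation."""
    total, count = 0, 0
    for v in values:
        total += v
        count += 1
    return total / count

def mean_new(values):
    """Behavior-preserving refactoring: simpler, same results."""
    return sum(values) / len(values)

# The safety net: both versions must agree on every test case
# before the old one is removed.
for data in ([1, 2, 3], [10], [2.5, 7.5]):
    assert mean_old(data) == mean_new(data)
print("refactoring kept behavior intact")
```

With coverage like this in place for the real code, large-scale restructuring stops being a leap of faith.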

Tools, methods and principles are a sound basis for creating a solid product, but they need to be embedded into a good process. Next week’s post will therefore deal with the processes.

Previous posts:

  1. Software Quality Assurance for the PTV xServer Product Line – a Report from the Trenches
  2. Software Quality Assurance for the PTV xServer
  3. How do we achieve quality?
  4. Tools for quality assurance
