Quality Assurance Driven Development - And The Resulting Damage
Recently, QA departments have discovered unit-test coverage as a metric, which makes the situation even worse. Because writing unit tests takes time, it is often far easier to test obvious code just to raise the coverage number. These days I get suspicious when an average corporate project shines with test coverage above 80%. The question then is: what do the remaining 20% consist of?
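To illustrate what "testing obvious code" looks like, here is a sketch of a coverage-boosting but worthless test (the class and names are hypothetical, not from any real project):

```java
public class CoverageTheater {
    // Hypothetical class under "test": a plain data holder with no logic.
    static class Customer {
        private String name;
        void setName(String n) { name = n; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        // The "unit test": it executes every line of Customer, so a coverage
        // tool reports the class as fully covered -- yet nothing of value
        // is actually verified, because there is no logic to get wrong.
        Customer c = new Customer();
        c.setName("Alice");
        if (!"Alice".equals(c.getName())) {
            throw new AssertionError("getter/setter broken");
        }
        System.out.println("trivial test passed");
    }
}
```

Ten minutes of such tests can move the coverage dial far more than one honest test of a hard algorithm, which is exactly the perverse incentive.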
I encountered an even more serious problem in an "agile" project with a huge number of unit tests: considerable coverage of CRUD cases (such as master-data management), while some genuinely hard-to-test algorithms were simply skipped and later caused problems in production. All because testing them wouldn't have contributed enough to the code-coverage numbers...
On top of that, you can even generate your tests to increase the coverage. Even System.out.println can be tested, and the acronym would be cool as well: Code Driven Test Generation (CDTG).
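And indeed it can: as a sketch (class name hypothetical), a "unit test" for System.out.println only needs to redirect stdout into a buffer and assert on the captured bytes. Coverage goes up; confidence does not.

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

public class PrintlnTest {
    public static void main(String[] args) {
        // Redirect stdout into a buffer so the "test" can assert on it.
        PrintStream original = System.out;
        ByteArrayOutputStream captured = new ByteArrayOutputStream();
        System.setOut(new PrintStream(captured));

        System.out.println("hello");   // the "code under test"

        System.setOut(original);       // restore stdout
        String expected = "hello" + System.lineSeparator();
        if (!captured.toString().equals(expected)) {
            throw new AssertionError("println is broken?");
        }
        System.out.println("println test passed");
    }
}
```

A generator could churn out thousands of tests of this caliber, which is precisely why the raw coverage percentage says so little on its own.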
In summary, instead of believing in numbers, a dose of common sense could sometimes really streamline your development. The problem is that you would then have to get rid of many buzzwords, acronyms, processes, and tools...