Throughout my work with clients, I have taken on different roles from project to project - I've sat on both sides of the table. I have been in the "ivory towers", considering how business strategy affects technology strategy, and I've been down in the trenches writing software. I've been part of bid teams trying to sell solutions, and I've helped evaluate options for purchasing services and solutions.
I believe this has given me an interesting perspective on how both "pointy-haired bosses" and developers think.
What is interesting is that both sides tend to lament the lack of a "scientific" approach to creating solutions. However, what they mean by "scientific" varies markedly depending on perspective and on the power balance of the culture: developer-centric organizations tend to believe that Agile is the one true way to produce solutions, whereas management-centric organizations tend to believe in Big Up-Front Planning (AKA some variety of "Waterfall").

What Science Is Not: Religion, Beliefs & Dogma
I have already implied where I think both of these predominant perspectives have gone wrong: they are management- or developer-centric, and hence they miss out on a large part of the picture.
Let's start with the management-centric perspective and "Waterfall". This mode of solution building rests largely on management beliefs about what constitutes "scientific" software development, held without any practical knowledge of how the work works or what its nature is. For a person with little or no software experience, or with a limited variety of experience (not knowing any other way), it makes perfect rational sense, given the missing knowledge and experience, to believe that software can be created sequentially: plan, design, write the code, then test the solution.
The problem is that this rational-seeming belief is still only a belief, and a massively deficient one at that, based on missing information and experience. To paraphrase John Seddon: it doesn't work, because the people who design the work don't know how the work works.

Agile Anecdote Driven Development
Culturally more developer-centric organizations tend to be more likely to adopt some shape or form of Agile. However, what Agile "is" varies markedly from organization to organization (I have never seen any two alike). While I believe Agile has a lot to offer, particularly when its mindsets and values are fully adopted, I do believe its practical implementation quite often fails in a variety of ways:
- Practices become unquestionable dogmas, religious beliefs rather than continuously improved engineering practices.
- Attempts are made to codify and make checklists out of what should fundamentally be mindsets and cultural values.
- People miss out on the greater purpose and goal as they get lost in User Stories.
- Self-organized teams become uncoordinated, isolated islands, sometimes even fiefdoms run by the individual(s) most adept at playing politics.
To sum up the Agile dilemma, it really is one of autonomy vs. coordination towards the bigger goals - what is the right mix of direction while still allowing people to design their own work?

Being Scientific: Knowledge of the Details, Understanding of the Big Picture, Questioning Everything
I'm going to disappoint you by not giving a prescription of what "being scientific" entails; I'm simply going to conclude that the existing predominant perspectives are severely lacking in a number of ways. Most specifically, they fail to take into account the whole, the totality of what creating and running a solution entails. On top of that, both of the perspectives we have looked at often miss out one major component: operations. The capital expenditure of creating a solution generates operational expenditure that goes on long beyond the initial creation, yet this part of the puzzle is rarely even considered.
The answer is neither more top-down nor more bottom-up: it is a greater understanding of the big picture AND the details. The small parts of how the work works affect the shape of the bigger picture, but the hard thing is to understand where and why.
For instance, advances in programming languages and technology may radically alter the shape of both how you automate testing and architect solutions, which in turn could affect the bigger principles on which an organization relies to make their solutions and systems work together towards the greater organizational goals.
As a corollary, the greater goals, and ultimately the purpose for which an organization exists - the deeper "why's" of the work existing in the first place - affect everything in a top-down manner. That purpose needs to be translated into a set of principles for how solutions fit together towards organizational goals, and those principles in turn need to be understood to make technical decisions that fit them. Furthermore, without understanding the bigger-picture "why's", it is easy to get side-tracked and distracted by work that is irrelevant scope creep.
The big picture feeds into the detail, the detail feeds into the big
picture. You cannot optimise parts without sub-optimising the whole.
There are no answers other than trying to build an adaptive
organizational system with a clear sense of purpose that is neither
top-down, nor bottom-up, but questions everything and is prepared to
change at a moment's notice.
Question everything, learn at every opportunity, uncover implicit assumptions and make them explicit so that they may be questioned as well; rinse and repeat. Yesterday's science is today's pseudo-science.