What are Non-Obvious Systemic Constraints?
In Theory Of Constraints (TOC), Eli Goldratt explains how “constraints” in a system, for example a manufacturing plant, limit the throughput of the whole system (plant). In his later books, he talks more about “policy constraints” in non-manufacturing situations, and how these can be even more pernicious and deleterious to the throughput (effectiveness) of a system or organisation.
Since constraints, by the TOC definition, affect whole systems, we may reasonably regard all constraints as “systemic constraints”.
This post is about those kinds of systemic constraints – particularly those that dramatically impact knowledge-work organisations – that are not so obvious. Not the capacities of machines, teams or people. Not the policies of the organisation – explicit or implicit. Not even the individual attitudes and beliefs of the folks within an organisation.
Some of these “non-obvious systemic constraints” have limited (or maybe more accurately, invisible) impact on the day-to-day work of the organisation, but remarkable impact on the ability of the organisation to evolve, improve, and raise its effectiveness (what I refer to as Rightshifting).
Examples of Non-Obvious Systemic Constraints
So what kinds of “non-obvious systemic constraints” am I talking about? Here are some examples:
- Business As Usual – being busy “getting stuff done”, to the detriment of spending time, attention, etc. on improving the capabilities of the organisation.
- “Work ethic” – paying people (and promoting or demoting them) based on how “willing” or “busy” they appear, rather than on how much of a contribution (often in very intangible ways) they actually make to the success of the organisation (some may describe this as “bureaucracy”).
- Fundamental Attribution Error – e.g. ignorance of Deming’s 95% rule – a lack of an appreciation of the role of “the system” in constraining the productivity of individuals.
- Hierarchy – the way in which conflating reporting lines, reward systems, coordination mechanisms and responsibility systems contributes to what Roger L Martin describes as “The Responsibility Virus”.
- Low Trust – when trust is absent or at a low level, this constrains the degree to which people talk to each other about things that matter (and thus contributes to the festering of those issues, with concomitant “temporary” work-arounds).
- “Not-Invented-Here” – collective arrogance or hubris constraining folks’ enthusiasm and opportunities for looking outside their own immediate surroundings to learn about how others have studied and maybe even solved issues.
- Isolated learning, hard limits on exploration – many organisations (generally, unintentionally) make it difficult or impossible for folks to get together and mutually learn about and explore issues and opportunities.
- Fear of conflict – when people try to be “nice” to each other, often to compensate for the inhumanity of their working conditions, the value of conflict gets misunderstood – and even productive, positive conflict becomes something to avoid rather than embrace.
- Pigeonholing – a prevailing, unspoken belief that everyone has their own work to do, their own (non-overlapping) sets of responsibilities, and that there’s negative value in folks getting involved in, or taking over, elements of others’ work.
- Poor appreciation of value-add – a limited understanding (or none at all) of where it’s best to focus time and effort, for optimum value-add.
- Professional detachment – particularly the kind of professional detachment that constrains people’s willingness and ability to relate to each other as fallible, emotional, joyful, inconstant, messy human beings.
- Homogeneity – the belief that all situations, all people, all clients, must be treated in the same way, to be “fair” and “consistent”.
- Conventional thinking – adopting “work-arounds” for problems by borrowing from popular culture, previous experience, or “industry best practice”, rather than getting to the heart of an issue and finding an apposite solution (preferably, a dissolution). Typical work-arounds include the concept of managers (see also: pigeonholing, hierarchy, Theory-X); command-and-control; bonuses; and functional decomposition of organisational structure (silos).
- Mandatory optimism – aka “Don’t rock the boat”, or “wilful ignorance”. When people realise that any doubts, negative comments or criticisms, however constructive or realistic, are not welcome, this severely constrains any attempts to surface and thus fix troublesome issues. See also: “Smile or Die” animation; and “All Executives are Unethical”.
- Disjoint purpose – different people working disjointly or at cross-purposes for lack of a clear understanding of a common, shared purpose for the system. This constrains people’s ability to constructively resolve issues, as well as constraining their day-to-day work activities.
“A system is a network of interdependent components that work together to try to accomplish the [purpose] of the system. A system must have a [purpose]. Without the [purpose], there is no system.” ~ W E Deming
- Theory X – and authoritarianism in general. When an organisation believes it better to have some folks do the work and other folks tell them what work to do (Scientific Management, aka Taylorism), with “process police” ensuring homogeneity and conformity, and even dictating how to do the work (micro-managing), this constrains the natural motivation and engagement of the people.
I’m sure you can think of others. If you do, please let me know – I’d love to add them to this list.
Which of these “non-obvious systemic constraints” are limiting the effectiveness, and the rate of change, in your organisation? (Note: following Goldratt, I posit that only one of these is the key constraint in any one organisation, at any given time).
I’m open to suggestions about how we might refer to these kinds of “non-obvious systemic constraints”. What do you think?
It Starts With Awareness
Are you aware of these non-obvious systemic constraints and the role they play in your organisation? How about other folks? Given how some of these constraints reduce or prevent discussion, would you even know what other folks think, whether they’re aware or not?
And when awareness dawns, what then? Even though just one constraint may be key, the web these constraints weave, collectively, may seem as intractable as the Tholian Web. Personally, I’d consider applying the TOC tools known as “Current Reality Tree” and “Future Reality Tree” to identify the key point(s) of leverage (key constraint(s)), and the future state that we’d like to see happen. Ackoff might suggest “Reference Projection”. What approach would you favour in this situation?