What are the fundamental concepts of the Theory of Constraints?

By Eli Schragenheim and Dave Updegrove

Eli.S note: this article is the result of a collaboration between Dave Updegrove and me on the topic of defining the essence of the Theory of Constraints (TOC).

We claim that every beneficial insight removes a current limitation that prevents us from achieving value.  The limitations TOC deals with are usually caused by flawed paradigms, or assumptions. 

With this basic insight, Dave and I looked into the TOC body of knowledge and tried to understand, for every insight, concept, or tool, what limitation that particular insight had removed: that is, what value could not be generated before and, using the new insight, can be generated now.

The next step is to understand the flawed paradigm behind the limitation, so we can clearly see the scope of new potential value that can be reached.

We have chosen what we believe to be the most generic concepts of TOC, the limitations they address, the flawed assumptions behind those limitations, and the somewhat lower-level insights/concepts/tools they impact.

Three key concepts that address the three fears of every manager: complexity, uncertainty, and conflicts.

Concept 1: Inherent Simplicity.

All systems (for instance, organizations) are inherently simple, despite their apparent complexity.

In systems, a few or even one point (Constraint[s]) controls the performance of the whole system, and a few or even one root cause (Core Conflict[s]) generates the vast majority of problems.

Limitation addressed:

Being unable to predict, in a good enough way, the consequences of an action or imposed change.  This failing to predict consequences vastly reduces the quality of decisions.

Flawed assumption:

Treating our current reality as complex, and thus failing to make the effort to identify the few variables that significantly impact the consequences of any action or change.

We can find and manage the few points controlling the system.

Example of following the flawed assumption:

Dividing a complex system into subsystems, assuming (1) that they are much less complex, and (2) that this helps predict the local impact of any change; further hoping that optimizing the local systems will yield a good enough prediction of the impact on the whole.

This expectation is most damaging.

Resulting/affected applications:

The Five Focusing Steps, the Four Concepts of Flow, the Three Questions

Concept 2: Inherent Consistency (Harmony):  

There are no conflicts (or inconsistencies) in reality.

All conflicts (or inconsistencies) exist only in our minds. One or more invalid assumptions are behind every perceived conflict (or inconsistency).

Limitation addressed:

Having to compromise between two conflicting actions, where each action is necessary to satisfy a necessary condition for achieving a desired common objective.

By compromising we get significantly less value for the desired objective.

Flawed assumption:

We know and accept our perception of “reality.”  Actually, every perception of reality is based on many (hidden) assumptions.  It is possible that challenging just one assumption, meaning creating a situation where that assumption is not valid, opens the way to get much more of the common objective.

The perception of a “conflict” should trigger us to reveal our assumptions and then look actively for a valid (realistic) way to challenge them.

Example of following the flawed assumption:

The seesaw conflict of holding less inventory, to lower investment and carrying costs, versus holding more inventory, to ensure availability to the system and allow generating more value.

Resulting/affected applications:

The Evaporating Cloud (Conflict resolution diagram), The Change Matrix / Procon cloud


Two Key Tools

Concept 4: Inherent Causality:

Systems are subject to cause-and-effect dynamics.

To understand and manage a system, apply rigorous cause-and-effect logic, governed by the Categories of Legitimate Reservation.

Limitation addressed:

Being unable, even when accepting the inherent simplicity, to answer the three questions:

  • What to Change?
  • What to change to?
  • How to cause the change?

Flawed assumption:

Using logic is too cumbersome, subjective, and difficult to quantify to be effective in finding answers.

A few simple, learnable logical tools can greatly enhance analysis and provide answers to important questions.

Example of following the flawed assumption:

Attempting to independently solve any undesirable effects (symptoms) in the organization without considering root cause(s).

Resulting/affected applications:

The TOC Thinking Processes, The Three Questions

Concept 5:  Inherent Valuation

By dividing expenses into truly variable costs and the cost of capacity, an entire system and each of its parts may be properly valued. The focus is on Throughput: the pace of generating goal-units. For commercial organizations, Throughput is the periodic revenues minus the truly variable expenses.

Limitation addressed: 

It is very complicated to predict the financial outcomes of a suggested action by trying to evaluate directly its impact on revenues and expenses. Without a well-accepted procedure for making such decisions, managers are afraid to use such a complicated analysis.

Flawed assumption:

Not distinguishing between linear and non-linear behavior.

Subtracting truly variable expenses from revenues (resulting in Throughput), and considering non-truly-variable expenses to be part of Operating Expense, allows us to reliably assess the system performance and the contribution of each of its parts.

Example of following the flawed assumptions:

Believing in and utilizing cost-per-unit, which assumes expenses behave linearly: if the cost-per-unit is $1, then the cost of 25 units is $25. This dramatically distorts the real performance of the whole system.
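The distortion can be sketched with hypothetical numbers. The plant, its operating expense, material cost, and order sizes below are all invented for illustration; the point is only the contrast between a "full cost" per unit and the Throughput view.

```python
# Hypothetical plant: operating expense (capacity cost) of $80,000/month,
# truly variable cost (materials) of $4/unit, selling price $10/unit.

def throughput(units_sold, price, truly_variable_cost):
    """Throughput = revenue minus truly variable expenses."""
    return units_sold * (price - truly_variable_cost)

def net_profit(units_sold, price, truly_variable_cost, operating_expense):
    """Net profit = Throughput minus Operating Expense (not allocated per unit)."""
    return throughput(units_sold, price, truly_variable_cost) - operating_expense

OE, TVC, PRICE = 80_000, 4, 10

# Cost accounting at a volume of 10,000 units allocates OE to each unit:
cost_per_unit = TVC + OE / 10_000   # $12 "full cost" per unit,
# which suggests that selling at $10 loses $2 on every unit.

# Throughput view: each extra unit adds $6 of Throughput, so an extra
# order of 5,000 units (within spare capacity, OE unchanged) adds $30,000:
extra = throughput(5_000, PRICE, TVC)
print(cost_per_unit)  # 12.0
print(extra)          # 30000
```

The "linear" cost-per-unit figure would reject an order that, in Throughput terms, is clearly profitable as long as capacity is available.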

Resulting/affected applications:

Throughput Accounting, Throughput Economics, Operations, Project, and Replenishment Planning, The Six Questions.

Two Beneficial Beliefs

Concept 6:  Inherent Goodness – People are Good.

The reasons for negative outcomes or events in our systems do not come from people's nature (good or bad), but from their assumptions and circumstances.

Limitation addressed:

Failing to achieve a desired objective due to contradictory behavior of other people, which wasn’t anticipated or understood.

Flawed assumption:

It is impossible to understand the behavior of other people. Thus, we cannot find the right way to convince them to behave in a way that would contribute to what we want to achieve.

The previous pillar of resolving conflicts also highlights the case where other people act to achieve something that clashes with what we are trying to achieve.

This pillar is wider than direct conflict with other people; it highlights our inability (limitation) to understand the motivation, or resistance, of other people to our initiatives. From a business perspective, it is especially important to understand our clients, the clients of our clients, our suppliers, and our employees.

It is difficult to use cause-and-effect logic alone to describe the motivation of another person. It should be possible, however, based on some known effects and generic assumptions about human behavior, to reduce the overall impression of complexity. In other words, there are a few critical variables that should be considered, including practical gain, ego, and fear.

Example of following the flawed assumption:

Blame and finger pointing – “I did my part. ‘So-and-so’ is the problem…”

Resulting/affected applications:

The Engines of Harmony.

I suggest you read Eli Schragenheim's article that deals with the extreme, yet realistic, cases of EVIL, and how this insight should be treated.

https://elischragenheim.com/2023/10/18/goldratt-claimed-that-people-are-good-how-can-we-understand-that/

Concept 7: Inherent Potential:  Never say “I Know.”

The more solid the base, the higher the jump. Any situation can be substantially improved by identifying new opportunities with significant added-value.  Thus, added-value potential is unlimited.

Limitation(s) addressed:

Being successful makes it difficult, and seemingly very risky, to identify new, big opportunities.

Success can lead to recency bias and inertia that limit searching for more opportunities.

Flawed assumptions:

I’ve made great improvements and am successful enough – no need for more, and there is no secure way to achieve more.

Actually, the opportunity is in fact huge and can be achieved safely.

If all you have is a hammer, everything looks like a nail.

New opportunities should always be looked at from a fresh perspective, not assuming the solution a priori.

Examples of following the flawed assumptions:

Thinking that since you are better than you used to be, there is no need to continue improving.

Thinking that new, big ways to generate more value do not exist.

Assuming, without careful examination, that a new opportunity will respond to the solution applied to the last opportunity.

Resulting/affected applications:

The Evaporating Cloud, The Three Questions, The S&T Trees, Decisive Competitive Edge (DCE), The Six Questions of Technology.

Three resulting, breakthrough, insights

One result from Inherent Simplicity is:

Inherent Focus:  All systems have one or very few constraints that determine their overall performance. We can maximize the performance of any system by identifying its constraint(s), deciding how we can best exploit them, subordinating everything else to these decisions, and getting more constraint capacity when necessary.

Limitation addressed:

Being unable to focus on what is truly constraining performance prevents very significant leaps of improvement.

Flawed assumptions:

We have many constraints that shift all the time.

Improvements in most areas have very minor impact on overall performance.

If we can make each part of the system more efficient, the entire system will be more efficient.

Improvement at the true constraint greatly improves the performance of the entire system.

Example of following the flawed assumption:

Policies driving local improvements, “peanut butter” spread budget cuts across the entire system.

Resulting/affected applications:

The Five Focusing Steps.
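The claim that improving non-constraints has very minor impact can be sketched for the simplest case, a serial flow, where system throughput equals the capacity of the slowest step. The stage names and capacities below are hypothetical.

```python
# Minimal sketch: in a serial flow, the system's output rate is the
# minimum of its stage capacities (units/hour). Hypothetical stages:
stages = {"cutting": 120, "welding": 60, "painting": 90}

def system_throughput(stages):
    """A serial chain can run no faster than its slowest stage."""
    return min(stages.values())

def constraint(stages):
    """The constraint is the stage with the least capacity."""
    return min(stages, key=stages.get)

# Improving a non-constraint (painting) by 30% changes nothing:
stages_local = dict(stages, painting=117)
# Improving the constraint (welding) by 30% lifts the whole system:
stages_focus = dict(stages, welding=78)

print(system_throughput(stages))        # 60
print(system_throughput(stages_local))  # 60  (no change)
print(system_throughput(stages_focus))  # 78
```

This is why the Five Focusing Steps direct all improvement effort at the identified constraint, and why "peanut butter" spreading of effort across all stages wastes most of it.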

One result from Inherent Tolerance is:

Inherent Control / Buffer Management:  Effective priorities for meeting all our planning objectives can be generated by monitoring the state of the buffers.

Limitations addressed:

Buffers give us only limited protection; we are still exposed to some accumulated fluctuations that disrupt performance.

Our initial buffers are based on guesses.  Continuing to guess doesn’t improve the fitness of the buffers to protect performance from the actual level of uncertainty.

Measuring buffer penetration often provides “early warning” of the potential impact of disruptions.

Flawed assumptions:

When things go wrong it is already too late to react. Too frequent reactions, like expediting, might worsen the overall reliability.

Example of following the flawed assumptions:

Adding more and more status reporting and data analysis to our daily work, thinking that more data yields better information.

Resulting/affected applications

Planned Load, Capacity buffers, Simplified Drum-Buffer-Rope, Critical Chain Project Management, TOC Distribution/Replenishment.
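Generating priorities from buffer state can be sketched with the common three-zone convention (green/yellow/red thirds of the buffer). The orders and buffer sizes below are hypothetical, and this is only an illustration of the priority logic, not a full Drum-Buffer-Rope or CCPM implementation.

```python
# Buffer management sketch: priority comes from how much of each
# order's buffer has already been consumed, not from local measures.

def buffer_penetration(consumed, buffer_size):
    """Fraction of the buffer already consumed (0.0 up to 1.0 or beyond)."""
    return consumed / buffer_size

def zone(penetration):
    """Three-zone convention: green = on track, yellow = plan recovery,
    red = expedite now."""
    if penetration < 1 / 3:
        return "green"
    if penetration < 2 / 3:
        return "yellow"
    return "red"

# Hypothetical open orders: (name, buffer hours consumed, total buffer hours)
orders = [("A", 10, 48), ("B", 40, 48), ("C", 20, 48)]

# Highest penetration first = the shared priority list for everyone:
ranked = sorted(orders, key=lambda o: buffer_penetration(o[1], o[2]),
                reverse=True)
for name, consumed, size in ranked:
    print(name, zone(buffer_penetration(consumed, size)))
# B (red) is expedited first, C (yellow) is watched, A (green) needs nothing.
```

Because every order is measured on the same scale, buffer penetration also supplies the "early warning" mentioned above: an order turns red before it is actually late.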

One result from Inherent Potential is:

Inherent Value from Innovation:  Use Goldratt's Six Questions for assessing the value of a new technology, but expand them to evaluate projects, new strategic moves, and new products.

Limitation addressed:

  • Developing anything new is very risky

Flawed assumptions:

Asking potential customers to evaluate future value, which doesn't exist today, and being disappointed by the confused answers. People need to see the product in order to evaluate its value.

Risk funding: Investing in many innovations, expecting that 1 in 10, or even 1 in 20 will yield very high value – enough to cover all the rest and still leave good profit.

A breakthrough is achieved by analyzing future value without asking potential users, by using the Six Questions of Technology for many different types of seemingly innovative proposals.

Each of the six questions is required to gain the most value from any innovation!
