Between Sophisticated and Simple Solutions and the Role of Software

Smart people like to exploit their special capabilities by finding sophisticated solutions to seemingly complex problems.  Software allows even more sophistication, with better control.

However, two immediate concerns arise.  One is whether the impact on the actual results is significant enough to care about.  In other words: do the results justify the extra effort?  The other is whether such a solution might fail in reality.  In other words: what is the risk of getting inferior results?

Simple solutions focus on only a few variables, use much less data, and their inherent logic can be well understood by all the players involved. However, simple solutions are not optimal, meaning more could have been achieved, at least theoretically.

Here is the basic conflict:

Until recently we could argue that simple solutions have an edge, because of three common problems of sophistication:

  1. Inaccurate data could easily mess up the optimal solution, as most sophisticated solutions are sensitive to the exact values of many variables.
  2. People executing the solution might misunderstand the logic and make major mistakes, which prevent the excellent expected results from being achieved.
  3. Any flawed basic assumption behind the optimal solution disrupts the results.
    • For instance, assuming certain variables, like sales of an item at different locations, are stochastically independent.
    • When the solution is based on software, bugs might easily disrupt the results.  The more sophisticated the algorithm, the higher the chance of bugs that are not easily identified.

The recent penetration of new technologies might push the balance back towards sophistication.  Digitization of the flow of materials through shop-floors and warehouses, including RFID, has improved the accuracy of much of the data.  Artificial Intelligence (AI), coupled with Big Data, is able to consider the combined impact of many variables and also to take into account dependencies and newly discovered correlations.

What are the limitations of sophisticated automation?

There are two different types of potential causes of failure:

  1. Flawed results due to problems in the sophisticated algorithm:
    • Missing information on matters that have a clear impact.
      • Like a change in regulations, a new competitor, etc.
        • In other words, information that humans are naturally aware of, but that is not included in the digital databases.
    • Flawed assumptions, especially regarding the modelling of reality, as well as software bugs.  This includes assessments of the behavior of the uncertainty and of the relevance of past data to the current state.
  2. Misunderstanding the full objective of top management.  Human beings have emotions, desires and values.  There could be results that are in line with the formal objective function but violate certain key values, like being fair and honest with clients and suppliers.  These values are hard to code.

The human mind operates differently from computers, leading to inconsistencies in evaluating what a good solution is.

The human mind uses cause-and-effect logic to predict the future, drawing on informal and intuitive information.  On one hand, intuitive information might be wrong.  On the other hand, ignoring clear causality and truly relevant information could definitely yield inferior results.

Artificial Intelligence uses statistical tools to identify correlations between different variables.  But it refrains from assuming causality, and thus its predictions are often limited to existing data and fail to consider recent changes that have no precedent in the past.  The only way to predict the ramifications of such a new change is through cause-and-effect reasoning.

Human beings are limited in their ability to carry out a great number of calculations.  Human capacity also limits the number of different topics one can deal with at any given time.

Another aspect to consider is the impact of uncertainty.  The common view is that uncertainty adds considerable complexity to predicting the future based on what is known from the past.

Uncertainty significantly limits our ability to predict anything that lies within the ‘noise’.  The noise can be described as the “common and expected uncertainty”: the combined variability of all the relevant variables, focusing on the area covering the vast majority of cases (say 90% of the results) and ignoring rare cases.  So, any outcome that falls within the ‘noise’ should not come as a surprise.  As long as the ‘noise’ stays at about the same level, it represents a limit on the ability to predict the future.  But that is already more than nothing, as it is possible to outline the boundaries of the noise, and outcomes that lie beyond the noise should be the focus of analysis and decisions.
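As an illustration only, here is a minimal sketch of how the boundaries of the noise might be outlined from past outcomes; the data, the 90% coverage, and all the names are assumptions for the sake of the example, not part of the argument itself.

```python
# A minimal sketch of outlining the boundaries of the 'noise' from past
# outcomes: keep the central ~90% of historical results and treat anything
# inside that band as unremarkable.  All data and names are illustrative.
def noise_band(history, coverage=0.90):
    """Return (low, high) bounds containing roughly `coverage` of past outcomes."""
    data = sorted(history)
    tail = (1.0 - coverage) / 2.0
    low = data[int(tail * (len(data) - 1))]
    high = data[int(round((1.0 - tail) * (len(data) - 1)))]
    return low, high

daily_sales = [38, 45, 41, 52, 47, 39, 60, 44, 48, 43, 55, 40, 46, 51, 42]
low, high = noise_band(daily_sales)
print(f"Noise band: {low}..{high} units/day")  # outcomes inside the band are not a signal
```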

Goldratt said: “Don’t optimize within the noise!”

Good statistical analysis of all the known contributors to the noise might be able to reduce it.  According to Goldratt this is often a poor exploitation of management time.  First, in most cases the reduction in the noise is relatively small, while it requires effort to look for the additional data.  Secondly, it takes time to prove that the reduction in the noise is real.  Thirdly, and most importantly, there are other changes that could improve performance well beyond the existing noise.

A potential failing of statistical analyses is considering past data that are no longer relevant due to a major change that impacts the relevant economy.  One can wonder whether forecasts that consider data before Covid-19 have any relevance to the future after Covid.

The realization that a true improvement in performance should be far above the noise greatly simplifies the natural complexity, and can lead to effective simple solutions that are highly adaptive to significant changes beyond the natural noise.

Demonstrating the generic problem:

Inventory management is a critical element of supply chains.  Forecasting the demand for every specific item at every specific location is quite challenging, and human intuition might not be good enough.  The current practice is to hold a period of time of inventory, say two weeks, of item X at location Y, where the quantity of “two weeks of inventory” is determined either through a forecast or by calculating an average sale-day.
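As an illustration only, here is a minimal sketch of that common practice, assuming the “two weeks of inventory” is derived from an average sale-day; the numbers and names are hypothetical.

```python
# A minimal sketch of the period-of-cover practice described above:
# target stock = days of cover x average daily sales.  Illustrative only.
def period_based_target(daily_sales_history, days_of_cover=14):
    """A hypothetical 'two weeks of inventory' target for item X at location Y."""
    average_sale_day = sum(daily_sales_history) / len(daily_sales_history)
    return round(days_of_cover * average_sale_day)

print(period_based_target([38, 45, 41, 52, 47, 39, 60]))  # -> 644 units
```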

Now, with much more sophisticated AI, it is assumed that it is possible to accurately forecast the demand and align it with the supply time, including the fluctuations in supply.  However, a forecast is never one precise number, and neither is the supply time.  Every forecast is a stochastic prediction, meaning the outcome could vary.  Having a more accurate forecast means that the spread of the likely results is narrower than for a less accurate one.  The sophisticated solution could try to assess the damage of shortages versus surpluses; however, part of the information required for such an assessment might not be in the available data.  For instance, the significant damage of a shortage is often the negative response of the customers.  It might be possible to track the actual loss of sales due to shortages, but it is challenging to assess the future behavior of disappointed customers.

The simpler key TOC insight for inventory management is to replenish as fast as possible.  This insight means narrowing down the forecasting horizon.  Actually, TOC assumes, as an initial good-enough forecast, no change in the demand over that horizon, so replenishing what was sold yesterday is good enough.
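A minimal sketch of this replenishment rule, assuming daily replenishment cycles; the function name and numbers are illustrative only.

```python
# A minimal sketch, assuming daily replenishment cycles, of the rule above:
# today's order simply equals yesterday's consumption, so the short horizon
# needs no demand model at all.  Names and numbers are illustrative only.
def daily_replenishment_order(yesterday_sales):
    """Order exactly what was consumed yesterday."""
    return yesterday_sales

for sold in [12, 9, 15, 0, 11]:
    print(f"sold {sold:2d} units -> order {daily_replenishment_order(sold)} units")
```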

Another key insight is to base the target inventory not just on the on-hand stock, but to include the inventory that is already in the pipeline.  This is a more practical definition, as it represents the current commitment to holding inventory, and it makes it straightforward to keep the target level intact.

Defining the target inventory to include both on-hand and pipeline stock makes it possible to issue signals reflecting the current status of the stock at the location.  Normally we’d expect anything between 1/3 and 2/3 of the target level to be available on-hand to represent the “about right” status of inventory, knowing the rest is somewhere on the way.  When less than one-third is on-hand, the stock is at risk and actions to expedite the shipments are required.  It is the duty of the human manager to evaluate the situation and find the best way to respond to it.  Such an occurrence also triggers an evaluation of whether the target level is too low and needs to be increased.  Generally speaking, target levels should be stable most of the time, and frequent re-forecasting usually comes up with only minor changes.
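A minimal sketch of these status signals, assuming the target level counts both on-hand and pipeline stock; the 1/3 and 2/3 thresholds follow the text, while the labels and the treatment of the top zone are illustrative assumptions.

```python
# A minimal sketch of the status signals described above.  The target level
# covers on-hand plus pipeline stock; the 1/3 and 2/3 thresholds follow the
# text, while the labels and the top zone are illustrative assumptions.
def stock_status(on_hand, target_level):
    """Classify the on-hand stock at a location against its target level."""
    ratio = on_hand / target_level
    if ratio < 1 / 3:
        return "AT RISK: expedite shipments and check whether the target is too low"
    if ratio <= 2 / 3:
        return "ABOUT RIGHT: the rest of the target is somewhere on the way"
    return "HIGH ON-HAND: more than expected; possibly the target is too high"

print(stock_status(on_hand=25, target_level=100))   # -> AT RISK: ...
print(stock_status(on_hand=50, target_level=100))   # -> ABOUT RIGHT: ...
```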

The question is: since the target level already includes safety, what is the rationale for introducing frequent changes of 1%-10% to the target level, when such changes merely reflect the regular noise and probably not a real change in the demand?

A sophisticated solution, without the wisdom of these key insights, would try to assess the two uncertain situations: how much demand might show up in the short term, and whether the on-hand stock plus whatever is on the way will arrive in time.  It would also estimate whether the anticipated results fall within the required service level.

Service level is an artificial and misleading concept.  Try to tell a customer that their delivery was subject to the 3-5% of cases that the service level doesn’t cover.  Customers can understand that rare cases happen, but then they like to hear the story that justifies the failure.  It is also practically impossible to target a given service level, say 95%, because even with the most sophisticated statistical analysis there is no way to truly capture the stochastic function.  Assuming that the spread of the combined delivery performance follows the Normal distribution is convenient, but wrong.

The practical need for humans to understand the logic of the solution, and to be able to input important information that isn’t contained in any database, combined with the recognition that computers are superior at following well-built algorithms and carrying out a huge number of calculations, points to the direction of the solution.  It has to include two elements: simple, powerful, and agreed-upon logic enabled by semi-sophisticated software, coupled with interaction with the human manager.  This is definitely not an easy, straightforward mission, but it is an ambitious, yet doable, challenge.

Published by

Eli Schragenheim

My love for challenges makes my life interesting. I'm concerned when I see organizations ignore uncertainty and I cannot understand people blindly following their leader.

8 thoughts on “Between Sophisticated and Simple Solutions and the Role of Software”

  1. Automated sophistication induces a certain state of comfort in the vast majority of people, reducing the alertness necessary to react to a major deviation. Comfort is not healthy as a survival mechanism. The first humans to enjoy comfort were the food eaten by the saber-toothed tiger.
    A simple solution with well-understood logic maintains the necessary alertness to react faster.

  2. A simple (automated or not) solution with well-understood logic maintains the necessary alertness to react faster.
    The more complex the problem, the simpler the solution must be. If the solution is more complex than the problem, it will inevitably sink into its own complexity.
    The greater the dependency, the greater the complexity; variation increases the complexity by several orders of magnitude. In the modern world, dependency and variation increase exponentially.

  3. I am confused. It is my understanding that simplicity is very much a foundation of TOC, so the claim that a working sophisticated solution could be developed is very much an attack on TOC’s foundation. Further, I thought that all TOC solutions are simple not only because complex solutions are difficult to execute but because they do not comply with the underlying reality they seek to improve. All complex solutions have too many faulty assumptions behind them, e.g. MRP assuming unlimited capacity. Yet here you are basically making a case for a sophisticated solution. What am I missing?

    1. You miss a lot. First, it is allowed even to challenge a foundation of TOC. Believing something is “true” without opening the statement to challenges is typical of religion, not of science.
      Inherent Simplicity was not given to us by God. It is a concept that human beings, scientists, came up with. So, anyone may claim it is not true, provided logical evidence is presented and they remain open to possible challenges. Claiming “all complex solutions have too many faulty assumptions behind them” is an empty claim. The example of MRP is also wrong because MRP doesn’t even try to be complex – it says loud and clear that it cannot handle finite capacity considerations.

      Then, the article does NOT make a case for a sophisticated solution. It recognizes all the key simple and valid insights of TOC. It strives to use semi-sophisticated software for narrowing down the noise, and to do it through very simple, even straightforward, solutions. All the semi-sophisticated software does is run very simple algorithms that human beings cannot carry out because they lack the capacity for that many simple calculations. If you read Goldratt’s The Haystack Syndrome you might re-assess what simplicity truly means. When I asked Goldratt whether the last part of the book isn’t “complex”, he answered that relative to the previous solution it is ‘simple’. The direction of the solution I suggest in the article is much simpler.

  4. Yes, I know it is a science and everything is open to challenge. You are misunderstanding my question. By “simplicity is the basis of TOC” I meant that in TOC sophistication and complexity are rejected in favour of simplicity by choice. For some specific reason all solutions in TOC were kept simple and relatively unsophisticated. The reason for this, in my view, was that sophistication required assumptions which are unrealistic in practice. Your point about MRP is exactly what I mean by a faulty assumption. How can something like MRP even be developed when it is obvious that in reality there is no infinite capacity? There is no way such a solution can be executed successfully. But if that is not the cause behind the choice of simplicity over sophistication, then what is the cause? Is it computing power? I personally don’t think so, because humans are most certainly capable of computing to the level of semi-sophisticated software with certain techniques. In finance, tools such as bond tables and interest tables were used to handle math-heavy calculations before the computer era. The word “computer” itself was used for humans tasked exclusively with calculation before machine computers were widely available. The effect of computers is to reduce the time and investment needed for calculations, but large organizations were certainly capable of performing complex calculations. Further, machine computers were fairly powerful even in the 80s, enough to enable sophisticated software, and yet TOC solutions were kept simple.
    Thus, I will ask again: what am I missing? Why, until now, have simple solutions been preferred in TOC over sophistication? Lastly, I have read The Haystack Syndrome and found the last part time-consuming but not complex.

    1. I disagree that TOC favors simplicity by choice. It looks for the absolutely necessary inherent simplicity. I claim that no human system, like an organization, can afford to be complex and thus to require sophisticated tools. Every human organization has to perform in a stable way; otherwise the clients of the organization would look elsewhere. In order for the human managers to perform in a stable way, means to reduce the seeming complexity, combined with uncertainty, have to be built in. These means are, many times, a lot of excess capacity and considerable excess capabilities. This means there have to be very few critical variables to follow closely, while the impact of all the rest is marginal.
      Simplicity is not a choice – it is a necessary condition for performing in an acceptable way.

      1. Thanks for this. Now my understanding of TOC has changed and deepened. After reading this comment I have reread your article and understand it in a completely different way.
