Identifying the Emergence of Threats


Eli Goldratt spoke about the need to notice inconsistencies. He even claimed we need courage to do that. Once you recognize an inconsistency you are urged to reconcile it, even when that means updating one or more of your basic assumptions.

I’m not sure ‘courage’ is the right word, because if we choose to ignore the inconsistency we expose ourselves to the threats that result from failing to understand our reality. What we need is to acknowledge that we don’t know, that we therefore make mistakes, and that we need to fix those mistakes so as never to repeat them.

When I learned programming I realized that the number of mistakes, bugs in software terminology, I had made was enormous. I had to accept the fact that I’m not clever enough to get it right the first time. The good news was that every programmer generated many bugs, and thus the culture of software development was forced to recognize the existence of bugs and to expect every programmer to fix them as soon as possible. I assume this is still the common culture in software, but only there. The frightening realization was that I probably make many mistakes in my “other life” as well, but do not have a computer to throw the mistakes in my face.

Failing to identify a threat means ignoring the signals that such a threat is developing.

It is our responsibility to identify emerging threats as early as possible and to find the appropriate actions to neutralize them.

Some threats can be anticipated before they emerge into our reality and put us, or our organization, in jeopardy. For those we are able to put a “control mechanism” in place. Here is my definition of the term ‘control’:

A reactive mechanism to handle uncertainty by monitoring information that points to a threatening situation and taking corrective actions accordingly

Buffer management, alarm systems, auditing of financial transactions, periodic medical checks and performance measurements are all control mechanisms for a variety of potential threats that we consider, a priori, might emerge.
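To make the definition concrete, here is a minimal sketch, in Python, of such a reactive control mechanism in the spirit of buffer management. The zone thresholds and the corrective actions are hypothetical illustrations, not a prescription; a real implementation would monitor whatever information points to the specific threat.

```python
# A minimal sketch of a reactive control mechanism, in the spirit of
# buffer management. All names, thresholds and actions here are
# hypothetical illustrations.

def buffer_zone(consumed_fraction: float) -> str:
    """Classify how much of the protective buffer has been consumed."""
    if consumed_fraction < 1 / 3:
        return "green"   # on track, no action needed
    if consumed_fraction < 2 / 3:
        return "yellow"  # watch closely, prepare a recovery plan
    return "red"         # the threat is materializing, act now

def control_loop(readings):
    """Monitor information that points to a threat; react accordingly."""
    for consumed in readings:
        zone = buffer_zone(consumed)
        if zone == "red":
            print(f"{consumed:.0%} of buffer consumed -> take corrective action")
        elif zone == "yellow":
            print(f"{consumed:.0%} of buffer consumed -> prepare to intervene")

control_loop([0.10, 0.45, 0.80])
```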

Sometimes a new threat emerges with enough visibility that identifying it is pretty much immediate: for instance, a proposed change in the tax laws that might compromise our profits. Failing to note such a publicized threat signals an even bigger threat, because if such a clear threat is ignored, how many other emerging threats are being ignored?

What can we do to identify the emergence of a new threat to which we haven’t given any thought?

An emerging threat generates signals that are inconsistent with our expectations. We have a name for the strange feeling that something unexpected has just happened: we call it a SURPRISE.

Surprises can be good or bad, but the common effect is that every surprise points to a flawed paradigm that our brain uses automatically. A paradigm is actually a small cause-and-effect branch that we automatically take as true. For example:

If I explain to my people what needs to be done, then my people do what needs to be done.

Does the above paradigm ALWAYS work? It does happen that my people do not do what needs to be done. This is a signal that there is a flaw in the logic. It is not clear, at that stage, what the flaw is.

The kernel of the signal is the gap between expectations and outcomes:

My expectations are: my people do exactly what I’ve asked them to do!

Now I face the following incident: I asked Dave to put in writing the full facts regarding Deal X. What I got is one page with just the most obvious facts; many of the important happenings are missing.

I’m angry with Dave. This is not what I expected from him. What should I do?

We usually deal with the gap between what we want and what we get. Organizations often check the gap between the plan and the execution. I point to the gap between realistic expectations and actual outcomes as a key opportunity to update a flawed paradigm.
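As a small illustration of that idea (and of the point made in the comments below that expectations are a range, not one number), here is a hypothetical sketch: an outcome falling outside the realistic range is the surprise that should trigger a review of the underlying paradigm. The page counts for the Dave example are invented for illustration.

```python
# A sketch of the gap check: expectations are a realistic range, not one
# number. An outcome outside that range is a surprise - a signal that a
# paradigm behind the expectation may be flawed. Numbers are invented.

def check_gap(expected_low: float, expected_high: float, actual: float):
    """Return a signal only when the outcome falls outside the expected range."""
    if actual < expected_low:
        return "negative surprise: look for the flawed paradigm"
    if actual > expected_high:
        return "positive surprise: still worth understanding why"
    return None  # within realistic expectations, no signal

# The Dave incident: say I realistically expected a 3-5 page report and got 1 page.
print(check_gap(3, 5, 1))  # -> negative surprise: look for the flawed paradigm
```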

Let’s look again at the simple case of Dave giving too short a report on Deal X. The gap signals a flawed paradigm, but what is it? I did explain what to do, and I’m still the boss.

Three initial possible explanations (there should be more):

  • Dave did not care about what I asked him to do
  • Dave was too busy with the many other things he had to do
  • Dave did not understand the required level of detail

To understand what truly went wrong in this case, there is a need to validate the facts and then dive into the paradigms that both Dave and I operate under.

Without carrying out the full analysis here, let’s imagine the following realization is revealed:

When I do not clearly answer the “what for?” question behind what I ask to be done, it is possible that the outcome will not be what I think it should be.

If this is a newly learned lesson – can you see the scope of the positive consequences? The possibility that their instructions will not be followed is a threat to every manager. If managers internalize the need to explain why it is necessary to do this-and-that, then a big part of that threat will not materialize. However, a similar gap might still emerge as a result of other flawed paradigms. We should remain open to spotting surprises and going ahead with the learning process.

Published by

Eli Schragenheim

My love for challenges makes my life interesting. I'm concerned when I see organizations ignore uncertainty and I cannot understand people blindly following their leader.

3 thoughts on “Identifying the Emergence of Threats”

  1. Hi Eli,
    (a) In reality we face many gaps between plans (which reflect the expectations) and actual outcomes.
    (b) We tend to ignore signals for inconsistencies in information.

    Are you saying that every deviation of the outcome from the expectations is caused by an ignored signal of inconsistency? Do we always have “early warnings” that, if ignored, will lead to undesired results? This leads me to an additional question – how can we be sure that the plans really express the expectations? It may very well be that the results are good but the plans are wrong.
    Avraham


    1. Hi Avraham,

      First, plans do not represent real expectations. In too many cases they represent “hopes”. Expectations are always a range, not one number. When “our team” in basketball plays a rival that is considered somewhat better, we hope to win, and may even base a plan on that hope, but losing the game, especially when it is not a crushing defeat, should not cause a surprise.

      A deviation from our true expectations is always caused by a flawed paradigm. For instance, we know the coach of the rival and know his tactics, but this paradigm might be flawed because coaches sometimes deviate from their patterns.

      Is there always a warning signal? I cannot be sure, and I am not certain I’d always notice the signal. More, we may notice the signal and choose to ignore it because it did not seem critical to us. We do need to focus on the more meaningful signals, and sometimes we might miss the meaning. Focusing is not a guarantee that you focus on the RIGHT things. I think we should do our best to focus on what we think are the key issues, while looking for meaningful inconsistencies. We should be ready to fail from time to time, and then what we should do is learn the lessons.

      Positive surprises happen as well, and wrong plans might turn out very nicely. We could learn a lesson there as well. There is uncertainty all around, and one option on the list of explanations for the gap is: a rare incident. I suggest including it in the list, but putting it at the very end – it happens only rarely.

