Behavioral biases and their impact on managing organizations

By Eli Schragenheim and Alejandro Céspedes (Simple Solutions)

Most of us, especially TOC practitioners, consider ourselves very good at decision making thanks to our cause-and-effect thinking.  However, behavioral economics, notably the research of two Nobel laureates, Daniel Kahneman and Richard Thaler, has convincingly demonstrated several general biases that deviate from rational economic thinking, pushing most people to make decisions that look logically flawed. The psychology behind how people make decisions is extremely relevant to TOC because organizations are run by people, the same people TOC tries hard to convince to change their decision-making process. TOC usually treats cause-and-effect relationships as based on rational logic.  But cause-and-effect analysis could also consider irrational causes, like having a special negative response to the term "loss" even when it is not a loss, and predict the resulting effect of such a negative response.

Generally speaking, TOC should look for answers to three key questions regarding such biases:

    1. Can we use cause-and-effect logic to map and explain biases? If so, can we eliminate, or significantly reduce, the biases?
    2. Is the impact of the biases on the organization’s decision making the same as on the decisions of an individual? Can we improve the organization’s decision making by treating the cause of the biases?
    3. When numbers are fuzzy, as they usually are in corporate scenarios, what do managers rely on to make decisions?

Understanding loss aversion

In itself loss aversion is not a bias. It is a reasonable way to stay away from trouble.  What has been shown, however, is that it is often applied inconsistently, which practically means that human beings are impacted by irrelevant parameters that should have no impact. To demonstrate the inconsistency we will use two experiments presented in "Thinking, Fast and Slow" by Prof. Kahneman.

In one experiment people were told that they had been given US$1,000 and that they had to choose between a 50% chance of winning an additional US$1,000 or getting US$500 for sure.

It's no surprise that the vast majority of people were risk averse and chose to get the US$500.

What's interesting is that when people were told that they had been given US$2,000 and then had to choose between a 50% chance of losing US$1,000 or losing US$500 for sure, many people suddenly became risk seekers and chose the gamble.

In terms of final state of wealth both cases are exactly the same. Both put the choice between holding US$1,500 for sure and accepting a gamble with equal chances of ending up with US$1,000 or US$2,000. The two cases differ only in their framing of the choice.  In the first case the choice is verbalized as between gaining and taking a risk to gain more. The second case frames the dilemma as between losing and potentially losing more (or not losing).  The fact that many people decided differently between the cases shows a bias based on the framing of a 'loss' versus 'gaining less'. It demonstrates how the wording has a decisive impact.

These two experiments demonstrate two important findings. One is that losses loom much larger than gains; the other is that people become risk seeking when all their options are bad. This also explains why most people turn down a bet with a 50% chance of losing US$100 and a 50% chance of winning US$150, even though on average the result is positive. If the bet offered a 50% chance of winning US$200, a balance between risk seeking and risk aversion would be achieved. That means 'losing' weighs about twice as much as 'winning' in our general value assessment.
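To make the arithmetic concrete, here is a minimal sketch, in Python, using the numbers of the bets above. The loss_weight of 2.0 is an assumption taken from the text (losses weighing about twice as much as gains), not a universal constant.

```python
# Expected value vs. loss-averse valuation of the bets described above.

def expected_value(p_win: float, win: float, loss: float) -> float:
    """Objective average outcome of the bet."""
    return p_win * win - (1 - p_win) * loss

def loss_averse_value(p_win: float, win: float, loss: float,
                      loss_weight: float = 2.0) -> float:
    """Subjective value when a loss weighs loss_weight times a gain."""
    return p_win * win - (1 - p_win) * loss * loss_weight

print(expected_value(0.5, 150, 100))     # +25.0: objectively worth taking
print(loss_averse_value(0.5, 150, 100))  # -25.0: subjectively refused
print(loss_averse_value(0.5, 200, 100))  #  0.0: the indifference point
```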

Should we be surprised by this seemingly irrational behavior?

Losing an existing US$100 might disrupt the short-term plans of a person, while the value of an additional US$150 is less clear. So, even though it is clear to most people that overall this is a good gamble, they resist it based on recognizing the greater negative impact of the loss.

Losing all we have is a huge threat. So, every person sets a survival mode that should not be breached.  As the ability of most people to manage gains and losses in detail is limited, the survival instinct leads to a negative reaction to any potential loss, weighting it more heavily than the equivalent gain. So, taking into account our limited capability to control our exact state, we develop simple, fast rules to make safe decisions. A simple rule could be "don't gamble ever!", or "don't bother with gambles unless you are certain to win much more." These heuristics are definitely very helpful in most situations, but they can be costly in others.

While risk aversion seems rational enough, the framing bias is an irrational element; still, the cause behind it is pretty clear and can be outlined as regular cause-and-effect.

We further assume that a person with a good background in probability theory would be able, most of the time, to resist the framing bias and come up with consistent decisions, especially for significant ones.

Does this hold true for decisions made on behalf of an organization?

Suppose you are the regional sales manager of a big company and have to decide whether or not to launch a new product. Historically it has been statistically shown that there is a fifty-fifty chance that the new product will make a profit of US$2 million in one year, or that it will lose a million dollars and its production will stop at the end of the year.

What would you do?

Our experience says that most seasoned managers will refuse to take the risk. Managers are naturally risk averse regarding any outcomes that will be attributed directly to them. As a matter of fact, every decision on behalf of an organization goes through two different evaluations: one is what is good for the organization, and the other is what is good for the decision maker.

It's common in many organizations that a "success" leads to a modest reward while a "failure" leads to a significant negative result for the manager. What's more, because of hindsight bias, decisions are assessed not by the quality of the decision-making process and the information available at the time, but by their outcome. No wonder loss aversion intensifies in corporate scenarios!

Earlier we mentioned that teaching the basics of probability theory, plus acknowledging the different biases, should reduce their impact. But the unfortunate fact is that in most cases the decision makers face uncertain outcomes for which the probabilities are unknown. The launch of a new product is such a case.  The statistical assessment of a fifty-fifty chance is very broad, and the decision maker cannot assume she knows the real odds.  This fuzzy nature of assessments naturally makes people even more risk averse, because the risk could be bigger than what is formally assessed. On the other hand, managers are expected to make some decisions, so they are sometimes pushed into risky decisions just in order to look as active as expected.

Now suppose that you are the Sales Vice-President and you have to decide whether to launch 20 different new products in 20 different regions. All the product launches carry statistics similar to those presented earlier (50% chance of making US$2M and 50% of losing US$1M). Suppose the company is big enough to be able to overcome several product flops without threatening its solvency.

Would you launch all of them?

Assuming the success or failure of each product is independent of the other products, the simple statistical model would predict, on average, a total profit of US$10M. However, since top management will most probably judge each decision independently, a bias known as narrow framing, the VP of Sales will try her best to minimize the number of failures. She might decide to launch only 8, basing her choice on the best intuition she has, even though she is aware she doesn't really know. What's ironic is that there's a higher overall risk for the company in launching 8 products than 20, because of the aggregation effect, as the sketch below illustrates.
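A minimal Monte Carlo sketch, using the hypothetical fifty-fifty numbers from the example above, shows the aggregation effect: the chance that the whole portfolio loses money is considerably smaller with 20 launches than with 8.

```python
import random

def portfolio_profit(n_products: int) -> float:
    """Total profit ($M): each launch is +2 or -1 with equal probability."""
    return sum(2.0 if random.random() < 0.5 else -1.0 for _ in range(n_products))

def probability_of_overall_loss(n_products: int, trials: int = 100_000) -> float:
    """Monte Carlo estimate of P(total profit < 0)."""
    return sum(portfolio_profit(n_products) < 0 for _ in range(trials)) / trials

for n in (8, 20):
    print(f"{n} launches: P(overall loss) ~ {probability_of_overall_loss(n):.3f}")
# Typical output: ~0.145 for 8 launches vs. ~0.058 for 20 launches,
# while the expected total profit grows from $4M to $10M.
```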

There are many well-known examples of companies that decided to play it safe and paid a huge price for it. Kodak, Blockbuster, Nokia and Atari immediately come to mind. So, if organizations want managers to take more "intelligent" risks, they need to create an environment that doesn't punish managers for the results of their individual decisions, even when the outcome turns out to be negative. This is not to say organizations shouldn't have a certain control over their human decision makers, so that they take potential losses seriously. Otherwise, managers might take huge risks because it is not really their money.  This means understanding how significant decisions under uncertainty have to be taken, and enforcing procedures for making such decisions, including documenting the assumptions and expectations, preferably for both reasonable 'worst case' and 'best case' scenarios, that will later allow a much more objective evaluation of the decisions made.

This balancing act for taking risks is definitely a challenge, but organizations have to recognize that excessive risk aversion favors the status quo, which could eventually be even riskier.

A Spanish translation of this article can be found at: www.simplesolutions.com.co/blog


The value organizations can get from computerized simulations

The power of today's computers opens a new way to assess the impact of a variety of ideas on the performance of an organization, taking into account both complexity and uncertainty. The need stems from the common view of organizations and their links to the environment as inherently complex, while also exposed to high uncertainty. Thus every decision, sensible as it may seem at the time, could easily lead to very negative results.

One of the pillars of TOC is the axiom/belief that every organization is inherently simple. Practically it means that only a few variables truly limit the performance of the organization, even under significant uncertainty.

The use of simulations could bridge the gap between a seemingly complex system and relatively simple rules to manage it well. In other words, simulation can and should be used to reveal the simplicity.  Uncovering the simple rules is especially valuable in times of change, no matter whether the change is the result of an internal initiative or of an external event.

Simulations can be used to achieve two different objectives:

  1. Providing an understanding of the cause-and-effect relationships in certain situations and the impact of uncertainty on these situations.

This understanding is achieved through a series of simulations of a chosen, well-defined environment that show the significant difference in results between various decisions. An effective educational simulator should prove that there is a clear cause-and-effect flow leading from a decision to the result.

Self-discovery of ideas and concepts is a special optional subset of the educational simulator.  It requires the ability to make many different decisions, as long as the logic behind the actual results is clear.


A simple educational simulator for distribution systems

  2. Supporting hard decisions by simulating a specific environment in detail, letting the user dictate a variety of parameters that represent different alternatives and get a reliable picture of the spread of results. The challenge is to model the environment in a way that keeps the basic complexity and represents well all the key variables that truly impact the performance.

I started my career in TOC by creating a computer game (the 'OPT Game') that aimed to "teach managers how to think", and then continued to develop a variety of simulations. While most of the simulators were for TOC education, I developed two simulations of specific environments aimed at answering specific managerial questions.

The power of today's computers is such that developing wide-scope simulators, which can be adjusted to various environments and eventually support very complex decisions, is absolutely feasible. My experience shows that the basic library of functions of such simulators should be developed from scratch, as using general modules provided by others slows the simulations to a degree that makes them unusable.   Managers have to make many of their decisions very fast.  This means the supporting information has to be readily accessible.  Being fast is one of the critical necessary conditions for wide-scope simulations to serve as an effective decision-support tool.

Dr. Alan Barnard, one of the best-known TOC experts, is also the creator of a full supply chain simulator. He defines the managerial need: first, to be convinced that the new general TOC policies governing the flow of products would truly work well. But there is also a need to determine the right parameters, like the appropriate buffers and the replenishment times, and this can be achieved by simulation.

There is a huge variety of other types of decisions that a good wide-scope simulator could support. The basic capability of a simulation is to depict a flow, like the flow of products through the supply chain, the flow of materials through manufacturing, the flow of projects, or the flow of money going in and out.   The simulated flow is characterized by its nodes, policies and uncertainty.  In order to support decisions there is a need to simulate several flows that interact with each other.  Only when the product flow, order flow, money flow and capacity flow (purchasing capacity) are simulated together can the essence of the holistic business be captured.  The simulator should allow easy introduction of new ideas, like new products that compete with existing products, to be simulated fast enough.  The resulting platform for 'what-if' scenarios is then open for checking the impact of an idea on the bottom line.

For many decisions the inherent simplicity, as argued by Dr. Goldratt, provides the ability to predict well enough the impact of a proposed change on the bottom line. Throughput Economics defines the process of checking new ideas by calculating the pessimistic and optimistic impact of each idea on the bottom line of the organization.  It relies on being able to come up with good-enough calculations of the total impact on sales and on capacity consumption to predict the resulting delta-T minus delta-OE (the change in Throughput minus the change in Operating Expense).
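A minimal sketch of this pessimistic/optimistic check, with purely hypothetical numbers, might look like this:

```python
def delta_profit(delta_t: float, delta_oe: float) -> float:
    """Bottom-line impact of an idea: delta-T minus delta-OE."""
    return delta_t - delta_oe

# Hypothetical idea: a new product line that needs some overtime capacity.
pessimistic = delta_profit(delta_t=120_000, delta_oe=90_000)   # weak sales
optimistic  = delta_profit(delta_t=400_000, delta_oe=90_000)   # strong sales

print(f"pessimistic: {pessimistic:+,}   optimistic: {optimistic:+,}")
# If even the pessimistic case is positive, the idea is safe to accept;
# if both are negative, it should be rejected; anything in between calls
# for managerial judgment.
```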

However, sometimes the organization faces events or ideas with wider ramifications, like impacting lead-times or being exposed to the 'domino effect', where a certain random mishap causes a sequence of mishaps, so more sophisticated ways to support decisions have to be in place. Such extra complications in predicting the full ramifications of new ideas can be resolved by simulating the situation with and without the changes due to the new ideas.  The simulation is the ultimate aid when straightforward calculations are too complex.

Suppose a relatively big company, with several manufacturing sites in various locations around the globe, plus its transportation lines, clients and suppliers, is simulated. All the key flows, including the money transactions and their timing, are part of the simulation.  This provides the infrastructure where various ideas regarding the market, operations, engineering and supply can be carefully reviewed and given a predicted impact on the net profit.  When new products are introduced, determining the initial level of stock in the supply chain is tough because of its high reliance on forecast.  Every decision should be tested according to both the pessimistic and the optimistic assumptions; thus management can make a sensible decision that considers several extreme future market behaviors, looking for the decision that minimizes downsides while still capturing potential high gains.
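A minimal Monte Carlo sketch of such a stock decision, with assumed unit economics and hypothetical demand scenarios, illustrates the idea of testing each option against both a pessimistic and an optimistic market:

```python
import random

PRICE, COST, SALVAGE = 100.0, 40.0, 10.0   # assumed unit economics

def average_profit(stock: int, mean_demand: float, trials: int = 20_000) -> float:
    """Average season profit for a stock level under one demand scenario."""
    total = 0.0
    for _ in range(trials):
        demand = max(0, int(random.gauss(mean_demand, 0.3 * mean_demand)))
        sold = min(stock, demand)
        total += sold * PRICE + (stock - sold) * SALVAGE - stock * COST
    return total / trials

for stock in (500, 800, 1200):
    pess = average_profit(stock, mean_demand=400)    # pessimistic market
    opt  = average_profit(stock, mean_demand=1000)   # optimistic market
    print(f"stock={stock}: pessimistic {pess:12,.0f}  optimistic {opt:12,.0f}")
# Management looks for the stock level that limits the downside while
# still capturing most of the upside.
```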

Such a simulation can be of great help when an external event disrupts the usual conduct of the organization. For instance, suppose one of the suppliers is hit by a tsunami.  While there is enough inventory for the next four weeks, the need is to find alternatives as soon as possible and also to realize the potential damage of every alternative taken.  Checking this kind of 'what-if' scenario is easy with such a simulator, revealing the real financial impact of every alternative.

Other big areas that could use large simulations to check various ideas are the airline and shipping businesses.  The key problem in operating transportation is not just the capacity of every vehicle, but also its exact location at a specific time.  Any delay or breakdown creates a domino effect on the other missions and resources.  Checking the economic desirability of opening a new line has to include the possible impact of such a domino effect.  Of course, the exploitation of the vehicles, assuming they are the constraint, should be a target for checking various scenarios through simulations.  Checking various options for dynamic pricing policies, known as yield management, could be enlightening as well.

While the benefits can be high indeed, one should be aware of the limitations. Simulations are based on assumptions, which opens the way to manipulation or plain failure. Let's distinguish between two different categories of causes of failure.

  1. Bugs and mistakes in the given parameters. These are failures within the simulation software or wrong inputs representing the key parameters requested by the simulation.
  2. Failure of the modeling to capture the true reality. It is impossible to simulate reality as is. There are too many parameters to capture. So, we need to simplify reality and focus only on the parameters that have, or might have in certain circumstances, a significant impact on the performance. For instance, it is crazy to model the detailed behavior of every single human resource. However, we might need to capture the behavior of large groups of people, such as market segments and groups of suppliers.

Modeling the stochastic behavior of different markets, specific resources and suppliers is another challenge. When the actual stochastic function is unknown there is a tendency to use common mathematical functions like the Normal, Beta or Poisson distributions, even when they don't match the specific reality.
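One pragmatic way around this, sketched below with hypothetical history, is to resample the data actually observed (a bootstrap) instead of forcing a textbook distribution onto it:

```python
import random

observed_daily_demand = [12, 7, 0, 25, 14, 9, 31, 11, 8, 16]  # hypothetical data

def simulated_week() -> int:
    """One simulated week: resample observed days with replacement."""
    return sum(random.choice(observed_daily_demand) for _ in range(7))

weeks = sorted(simulated_week() for _ in range(10_000))
print("median weekly demand:", weeks[len(weeks) // 2])
print("95th percentile:", weeks[int(len(weeks) * 0.95)])
# The resampled distribution keeps the skew and the outliers of reality
# that a fitted Normal curve would smooth away.
```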

So, simulations should be subject to careful checks. The first big test should be depicting the current state. Does the simulation really show the current behavior?  As there should be enough intuition and data to compare the simulated results with the current-state results, this is a critical milestone in the use of simulations for decision support. In most cases there will at first be deviations caused by bugs and flawed input.  Once the simulation seems robust enough, more careful tests should be done to ensure its ability to predict future performance under certain assumptions.

So, while simulations call for a lot of care, there is even more to be gained from them by better understanding the impact of uncertainty, and by that enhancing the performance of the organization.

The big slogan and the potential real value of Industry 4.0

By Eli Schragenheim and Jürgen Kanz

We are told that in order to keep up with the quick changes in the world and face ever fiercer competition, manufacturing organizations have to join the fourth industrial revolution, called Industry 4.0, which is a very big suite of different new technologies in the field of IT, namely the Internet of Things (IoT), artificial intelligence and robotics.

The slogan of Industry 4.0 claims it is highly desirable to join the revolution before the competitors do. Well, we are not sure whether the term 'revolution' truly fits the new digital technologies.  But this is truly the smallest issue.  The fast pace of the technology should definitely force every top management team to think clearly about what the impact of the newest technological developments on the organization and its business environment could be.  Thinking clearly is required not only for finding new ways to achieve more of the goal, but also to understand the potential new threats that such developments might bring.

There are two significant threats that new technology might create. First, pushing management to invest heavily in technology that is still half-baked and whose potential value, if any, is small.  Second, causing a loss of focus on what brings value and what does not. Trying too many ideas, investing money and management attention in too many issues, could end with a big loss, or very low value.  Just look at the wide area of claims to bring value:

Image 1: Improvement areas for Industry 4.0, adapted from McKinsey Digital 2015,
“Industry 4.0: How to navigate digitization of the manufacturing sector”

The application of new IT technologies and the connection of known technologies should lead to the following expected improvements per area:


Image 2: Expected improvements, adapted from McKinsey Digital 2015,
“Industry 4.0: How to navigate digitization of the manufacturing sector”

We can recognize a number of good time and cost reductions that will increase overall productivity, but what are the expectations of top managers? To gain more insight we can look at a survey by Roland Berger Strategy Consultants with input from 300 top managers of German industries as an example:

Image 3: Top-Manager expectations, adapted from “Die digitale Transformation der Industrie”, Roland Berger Strategy Consultants & BDI, https://bdi.eu/media/user_upload/Digitale_Transformation.pdf, last download 09/24/18

A big group of executives (43%) target only cost reduction with the help of Industry 4.0, while other managers want more sales from new products (32%) or more sales from existing products (10%). Achieving both more sales and cost reduction is the wish of 14% of the managers.

We can expect that approximately half of the managers will be satisfied with cost reduction due to improvements in the above-mentioned areas, but there is no element in the images above that directly supports gaining more sales of new or existing products.

We assume that, on the one hand, IT-related product innovations will push sales of new technological products, like wearables (GPS watches, health monitors, sleep trackers, etc.). On the other hand, there is the wish to cut costs in production, which, due to the fierce competition, would press prices down and increase sales quantity. Will this trend also increase net profit?  The short answer is: it depends; companies need to analyze the full impact on the bottom line very carefully.

The new digital technology can help reduce the "time to market": the time to run a new product development project from idea to market launch, including customer contribution. One question is: by how much? The answer depends a lot on the specific technology of the new products.  Another question is whether Industry 4.0 can reduce production lead-time, and what this could do to improve sales.

The mentioned improvements in Sales / Aftersales have an impact only on after-sales activities, but are they sufficient to create new sales?

It seems that the main vision of McKinsey and many other big players is limited to cutting operating expenses, which is fine for bringing certain value, but it is NOT a revolution. The tough reality of cutting costs is that it cannot be focused; it is spread over many cost drivers.  It requires a lot of management attention and usually brings limited net business value.  The question is: what if the management attention had been directed to giving higher value to more customers?

We understand that when some of the most relevant Industry 4.0 technology is implemented, and when the technological changes are combined with the right management processes, achievements like cutting lead times by 20-50% are not only possible, but should also dramatically improve the general responsiveness to customers and make the company truly reliable in meeting all its commitments.

But, is it sufficient that the technology is installed and used to achieve such results?

And is it enough to reduce the time to market, or to cut the production lead-time, to get better business results?

Significantly improved business results are achieved if and only if at least one of the following two conditions applies:

  1. Sales would grow either from selling more or from charging more and not losing too many sales because of the price increase.
  2. Costs are cut in a way that does not harm the delivery performance and the quality from the customer perspective.

The above should be the top objectives of any new management move, including the implementation of a new technology, like one of the Industry 4.0 elements.

On top of carefully checking how any of the Industry 4.0 components could achieve one, or both, of the above conditions, a parallel analysis has to be used to identify the negative branches: the variety of possible harms that might be caused by the new technology.

For instance, the use of any 3D printer is limited by the basic materials that the particular printer can use. If this limitation wasn't considered when the decision to use such a printer was made, it could easily turn the use of the "state-of-the-art" technology into a farce.

We suggest that every manufacturing organization consider using chosen parts of Industry 4.0 in order to achieve one or both of the above top objectives, bringing a higher level of business achievement.

An especially effective tool for analyzing any specific element of Industry 4.0 is the Six Questions on the Value of New Technology, developed by Dr. Eli Goldratt. The first four questions first appeared in Necessary but Not Sufficient, written by Goldratt with Schragenheim and Ptak.

Question 1: What is the power of the new technology?

This is a straightforward question about what the new technology can do, relative to older technologies, and also what it cannot do.

For instance, IoT enables the use of PLC (programmable logic controller) sensors on machines to send precise information to a web page about the state of a machine: whether it functions properly or there is a problem. Predictions about the next maintenance step, based on machine data, are useful as well, because the results can help avoid unexpected machine downtime and better exploit the constraint.

Question 2: What current limitation or barrier does the new technology eliminate or vastly reduce?

This is a non-trivial question, and it is asked from the perspective of the user. For a new technology to deliver value there has to be at least one significant current limitation that negatively impacts the user.  Overcoming this limitation is the source of value of the new technology.  It is self-evident that clearly verbalizing the limitation for the user is key to evaluating the potential value of the new technology.

The leading example of using PLC sensors to provide online information to a variety of relevant users reveals that the limitation is the current need to have an operator physically near the machine to get information that could lead to an immediate action. We do not consider the capabilities of the PLC itself as the new technology in this analysis, as this is not truly a "new technology" by now.  The new concept is to use the Internet to reach far-away people who can gain, or help others gain, from the online information on the current state of the machine and the specific batch being processed.

There are two different uses for such immediate information. The first is when there is a problem in the flow of products, which could be technical or due to bad-quality materials. The other type of information is for checking the likelihood of exploiting an opportunity, like changing over the production line to process a super-urgent request, or handling an unexpected delay.  The operator at the actual location has to get the fresh information and communicate it to certain people who appear on a predefined list.  The operator is also expected to update the IT system in an effort to consider the next actions. Overcoming the limitation means the flow of information no longer needs anybody at the physical location.  Depending on the technology, the reactive actions could be taken from afar.

Question 3: What are the current usage rules, patterns and behaviors that bypass the limitation?

This question highlights an area that is too often ignored by technology enthusiasts. Assuming the current limitation causes real damage, ways to reduce its negative impact are already in use.  For instance, before cellular phones there were public phones spread all over the big cities to allow people to contact others from wherever they were.  Devices like beepers or pagers were in use to let someone far away know somebody was looking for her.  It is critical to clearly verbalize the current means of dealing with the limitation, for two different objectives.  One is to better understand the net added value of the new solution provided by the new technology.  The other is to understand the current inertia that might impact the users when the new solution is provided.  This side is further explained and analyzed through the next question.

Today the industrial manufacturing landscape is roughly divided into two parts. We have had factories with a high automation level for many years. These factories are often process industries using fully automated production lines for chemistry, pharmaceutics, etc. The machines and processes are connected by an independent data network that includes analysis. The monitoring of the production line and related processes takes place in a dedicated control room, where the operator has to watch the information on a big screen; when spotting a problem, the operator finds the best solution or calls for help.

In addition we can also find many small and medium-sized enterprises (SMEs) running modern machine tools with powerful controls and integrated sensors. These machines already provide all the data needed for analysis, but in most cases the data is left unused. It is also not very common to store information regarding problems in the production flow in a database that can feed future analyses on improving the uptime of the production line. Operators can fix mainly small issues. They have to call the external supplier's service in case of bigger problems with the machines. Bypassing the problem until it is resolved is typically managed by the operator and/or by a cross-functional team. With today's technology, most of the information given to various decision makers is updated only up to the previous day.  So, urgent requests and unexpected delays might wait until the next day to be fully handled.

Question 4: What rules, patterns and behaviors need to be changed to get the benefits of the new technology?

Answering this question requires clearly detailing the optimal use of the new technology to achieve the maximum value. The behavior when the new technology is operational is, in many cases, different than without that technology.  The value of cellular technology is fully drawn only when the users carry their phones with them all the time.  There are many other ramifications of the change imposed by implementing the new technology, like being very careful not to lose our phone. New rules have to be developed to guide us in drawing the most value from the new technology.

Industry 4.0 pushes the idea that all available machine data should be used for monitoring, controlling and analysis. Modern machines can be connected directly to the IoT, while older machines need PLC sensors installed to provide data to the Internet. A continuous stream of collected data is stored in a "cloud" somewhere on a server farm of an external service provider.

The idea of having the PLC information within immediate reach from everywhere could have added value if and only if people who have not been exposed to the information before would not only get the information but also be able to use it.  In order to use immediate online information, one has to be aware that it is there.  As the true value of the new solution is the speed of getting truly updated information, there is a basic need to be able to set an alarm to make the relevant person aware that something important needs attention immediately.  This also means that there is a need for effective analytics to note when the information becomes critical, and to whom.  This requirement should be part of answering question 4.  The more general lesson is that exposing the user to a huge continuous stream of data from manufacturing, sales and the supply chain is a problem the new technology has to offer a solution for.
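A minimal sketch of that alerting requirement, with hypothetical sensor names, limits and recipients, might look like the following: scan the continuous stream of readings and notify only the relevant person, and only when a value leaves its control limits.

```python
from typing import Callable, Iterable

# Assumed control limits and recipients; in reality these come from the
# process specifications and the predefined list of relevant people.
CONTROL_LIMITS = {"oven_temp_c": (180.0, 192.0)}
RECIPIENTS = {"oven_temp_c": "line3-supervisor@example.com"}

def watch_stream(readings: Iterable[tuple[str, float]],
                 notify: Callable[[str, str], None]) -> None:
    """Raise an alarm only when a reading breaches its control limits."""
    for sensor, value in readings:
        low, high = CONTROL_LIMITS.get(sensor, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            notify(RECIPIENTS[sensor], f"{sensor} out of limits: {value:.1f}")

# Example usage with a fake stream and a print-based notifier:
stream = [("oven_temp_c", 185.2), ("oven_temp_c", 193.4)]
watch_stream(stream, notify=lambda who, msg: print(f"ALERT to {who}: {msg}"))
```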

When everything is connected with everything else, not just within the company, it means a direct connection with the external world via the IoT.  This move could create an opportunity for a new level of business, but the rules and wide ramifications of such a connection have to be very carefully examined.  For instance, OEMs could have full access to all kinds of data from every OEM supplier, which should enhance the win-win collaboration between the players.  But this requires all the players to intentionally strive for this kind of collaboration.  The technology is just the enabler for the intentions. Creating this kind of transparency is a necessary condition for effective win-win collaboration.

The connectivity of everything is truly beneficial only with the right focus in place, preventing the human managers from being overwhelmed and confused by the ocean of data. This insight of having to maintain the right focus, the most basic general insight of the Theory of Constraints (TOC), is absolutely relevant for evaluating the potential contribution of every Industry 4.0 element, as it is so easy to lose focus and get no value at all.

Question 5: What is the application of the new technology that will enable the above change without causing resistance?

Resistance is usually raised because a proposed change might cause a negative, usually unintended, consequence. The example of medical drugs, where every new drug for curing an illness also causes negative side effects, sometimes bigger than the cure, carries a wider message: this characteristic is much more general than medical drugs alone.

It is crucial not just to identify all the potential new negatives that the new technology would cause, but also to think hard about how to trim them. The transition from film cameras to digital ones raised the negative consequence of having too many pictures taken.  Over the years some solutions to organize the photos in a more manageable way have appeared.  Had the thinking on that problem started sooner, the added value would have been much higher.  This is a crucial part of the analysis:  to give the negatives much thought, in spite of the natural tendency to be happy with the new value.

It is advisable to analyze every IoT idea for its probable negatives. A generic negative of almost all electronic devices is that when they fail to function properly, the damage is usually greater than with previous technologies.  This means stricter quality analysis is absolutely required, plus keeping replacement electronic cards or devices in stock.

Question 6: How to build, capitalize and sustain the business?

This question is a reminder that the value of the new technology, plus all the decisions around it, is part of the global strategy of the company.

How does the above analysis fit with the top objective of the organization? Does the plan for extracting value from the new technology provide synergy with the other strategic efforts required for achieving the goal?

So, here in the sixth question, the global aspects of the proposition to implement a specific application of the new technology have to be analyzed. Actually, when several applications of new technologies are considered, question 6 should apply to all of them together.  Thus, when analyzing the various elements of Industry 4.0, the first step is choosing several for more detailed analysis; the last step is evaluating the global strategy and deciding which ones, if any, to implement and what other actions are required to draw the expected value as soon as possible.

The previous questions in the leading example should discover by how much linking the PLC data stream to the Internet would either increase sales or significantly reduce cost.

Suppose the company has an active constraint in a specific machine or a whole production line, and the constraint is frequently stopped due to various problems. In this case, having a quick-response mechanism, based on fast analysis of the PLC information and immediately reaching the right people who can instruct the operator how to fix the problem, is truly worthwhile.  The generated added value, both from keeping the customers unharmed and from superior exploitation of the capacity constraint, is high.

Add to this the decision to use 3D printers to overcome the management limitation in reviewing new product designs from the original drawings, as managers might not have the capability of viewing a 2D drawing and imagining the finished product. The cost of producing prototypes restricts the number of models by which management can judge the design, and the number of alterations is also limited.   Using 3D printers eliminates the limitation.  After answering the rest of the questions, the organization has to consider question 6 for both elements of Industry 4.0 and decide whether together the value is even greater.  If we consider the possibility that current prototypes of new products have to compete for the capacity of the constraint, while using the 3D printer bypasses the constraint, we can realize the synergetic added value of improved product design that could enhance sales, while the capacity constraint is better exploited.

The overall conclusion has to highlight the sensitivity of the strategic analysis when the issue of Industry 4.0 is seriously considered. The contribution of the six questions could be truly decisive.

 

What can Brick-and-Mortar Retail learn from Big Data?

By Amir and Eli Schragenheim

Physical stores face a wicked problem that gets bigger and bigger. An increasing number of customers buy through online shops, which offer lower prices and wider choice.  Online shops also face a huge business problem, coming from fierce global competition and the lack of a clear competitive edge other than price. This does not help the physical stores to re-invent themselves and offer a real alternative to customers, who are offered by the online shops a wider variety, home delivery and lower prices.

Part of the reason so many customers buy online is that e-commerce stores succeed in understanding better the specific tastes of customers and approaching them with lucrative suggestions. This understanding is achieved by investing huge efforts in gathering a lot of data on every user entering their site, and performing sophisticated analysis of that data.  These are ongoing efforts, so we can expect further improvements in manipulating customers into buying through the Internet.

How come physical stores do not take similar actions?

Physical stores have harder access to the relevant information. For instance, currently there is no good way to record customers who do not find the product they're looking for.  The behavior patterns of the customers are not recorded.  There might be video cameras in a store, but their objective is security, and all other behavioral aspects are not considered.  So, it seems there is not much the physical stores can do to study their customers better.

This is a HUGE mistake.  There is plenty of available data that can be processed into valuable information, and there are ways to access more data, which could be used to yield even more information.

Customers coming to big stores, certainly to supermarkets and drug stores, often buy more than one item. This is an opportunity to learn more about the possible relationships between different items, the personal tendencies toward brands, and the role of price in choosing from a variety of similar products.

The set of different items purchased together by a customer carries hidden information that testifies to the taste and economic level of the customer. To reveal the relevant information, certain analysis has to be carried out, with the aid of statistics and machine learning (ML), to come up with answers to key questions concerning the most important decisions every retail store has to make:

What new items to hold? What items should be eliminated?  What is the relative importance of keeping perfect availability of an item?  What items should be placed close together?  What items could be used for promotions?  What additional services, like an on-site bakery, should be added (or removed)?

Examining the total purchase of a customer reveals much more than just looking at the sales of every item. The mix of items purchased at one time reveals wider needs, tastes and economic behavior. When it is legally possible to identify the specific buyer, the previous purchases of the buyer can be analyzed in order to define in greater detail the market segment that customer belongs to.  Part of the value of maintaining customer loyalty clubs is the ability to link different purchases, at different times, to the same client.  Thus, a client profile can be deduced.

The most obvious outcome is mapping the customers according to market segments, noting the different characteristics of the segments. Examining the purchases could highlight aspects regarding the family of the customer: spouse and children, their approximate ages, their financial status and their preferences.  These characteristics can be revealed through analysis of the purchases and the frequency of purchasing.  Certain preferences, like smoking or being vegetarian, can be identified.  Together, the key characteristics are revealed in order to define several layers of the market segment.

Another important value that can be deduced from examining purchases is the dependency between different items: when item X is purchased, there is a good chance that item Y is purchased as well.  These dependencies are sometimes intuitively deduced by some managers, and they definitely impact decisions concerning maintaining the availability of both items, their placement within the store, and even the possibility of selling them together as a package.  Understanding the linkages between products also helps to check changes in purchasing habits over time.  For instance, when the economy goes down, seeing how it impacts different segments is much more valuable than just watching the impact on individual products.  We can expect a general shift to cheaper products, but knowing which brands are replaced by cheaper ones, and which segments make those changes more than others, should be valuable in forecasting those changes before the actual change in the economy takes place.
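A minimal sketch of this dependency analysis, using hypothetical baskets, is the standard market-basket "lift" measure: how much buying item X raises the chance of buying item Y.

```python
baskets = [  # hypothetical purchase data, one set of items per receipt
    {"pasta", "tomato_sauce", "wine"},
    {"pasta", "tomato_sauce"},
    {"bread", "butter"},
    {"pasta", "wine"},
    {"bread", "tomato_sauce"},
]

def lift(x: str, y: str) -> float:
    """Lift > 1 means buying x raises the chance of buying y as well."""
    n = len(baskets)
    p_x = sum(x in b for b in baskets) / n
    p_y = sum(y in b for b in baskets) / n
    p_xy = sum(x in b and y in b for b in baskets) / n
    return p_xy / (p_x * p_y)

print(f"lift(pasta -> tomato_sauce) = {lift('pasta', 'tomato_sauce'):.2f}")
# On real data the same measure, computed for every item pair, feeds
# decisions on placement, availability priorities and package deals.
```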

The structure of typical purchases by different market segments would certainly initiate marketing moves that capitalize on that understanding. Analytical knowledge, translated into operational policies, would impact the performance of the various branches of the retail chain, as both the specific needs of the branch and some of the generic insights are recognized.

When every purchase of a specific client can be linked with the previous purchases of that client, then the frequency of buying could lead to initiatives to influence the content of a typical purchase of a specific market segment.

Developing a machine-learning (ML) module to better categorize the different segments the store serves should enhance both the marketing and the logistics of every retail store. There are always dilemmas in holding slow movers, given the amount of logistical effort required to keep a slow mover available.  Being exposed to the right priorities, by understanding the full financial impact of the slow mover's sales, would lead to better decisions about what items to hold in stock.  The relative value of a slow mover includes its impact on the sales of other products. Understanding the relative importance of that particular item to a specific segment helps determine the slow mover's impact on the desirability of the store from the viewpoint of the market segment.

Through ML, retailers can get a better understanding of customer loyalty to specific brands and items. When it is already established that a certain segment prefers item X to item Y, then by intentionally creating unavailability of X for one day it is possible to discover whether most clients from that segment bought Y or refrained from buying a replacement.  It also answers the question of whether buying the replacement impacts brand loyalty.  Promotions also cause people to buy the less preferred items, but it is of major interest to know whether this impacts brand loyalty.

It is highly desirable to have access to the information on the availability of all items at the time a certain purchase was made. When item X happens to be unavailable, then it provides an opportunity to check whether the seemingly brand-loyal customers switch easily to the replacement.

Supermarkets and drug stores are typical retailers where a purchase usually includes several different items. It seems absolutely necessary that such retail chains invest efforts in ML to learn more about their customers' habits, and develop the process of coming up with superior decisions to capitalize on that knowledge.

The Frustration of a Middle-level Manager – A short story by Eli Schragenheim

My boss, Dr. Christopher Logan, asked me to come to his office at 2pm sharp and report in detail how come the delivery to MKM did not have all the 100 B12 units. I know how such meetings are conducted:  it looks friendly enough, but actually there is nothing friendly in such a meeting.  The attitude of "we are nice, understanding people" is supposed to hang in the air, but beneath the politeness you're on TRIAL – trying to prove it is not YOUR incompetence that caused the trouble.

Such a hostile inquiry takes place every two months. Many things go wrong every day, but only a few receive this kind of treatment.  It is usually because the client is very important, very big or new, and when such a client makes a complaint, the incident becomes critical and somebody has to be blamed and punished.

MKM is both big and new. For whatever reason, getting 97 good units, out of 100, exactly on the formal due date was not enough for them.  I immediately promised to deliver the missing three units within five business days.  Apparently this was not good enough, so a formal complaint was put on Dr. Logan's desk.  I have no idea how come the delivery of all 100 units on the promised day is so sensitive that 3% less is such a disaster.

I'm now preparing my defense, trying not to exaggerate too much the role of Freddy in the blunder.   I know that almost any one of my people might have made the same mistake he did, and that mistake only partially led to missing the delivery.  As MKM demanded 100 good units that would fully pass their test no later than June 1st, 2018, we decided to produce 120 units, as there is no viable way to identify such quality exceptions at an early stage of B12 production.  Freddy's mistake was assuming that a temperature of less than 1 degree above the standard is still within the control limits.  Usually this is right, but for MKM's specifications it is not.  That small deviation impacted, at most, five units, because the temperature was fixed very fast.  Five out of the extra buffer of twenty cannot be the only reason that only 97 units passed the MKM test.  We, by the way, sent 101 units that passed our own test, rejecting 19 for a variety of reasons.  How come four units that passed our test failed in their similar test is an open question. No one, I repeat, NO ONE, has any explanation for this fact.

This is the situation I face, and I just hope it won't turn out as bad as last year's blunder, which led to the layoff of two good people. The charge was not paying enough attention to a rare and unfortunate incident that caused the breakdown of expensive equipment.  In such cases, Dr. Logan assumes the judicial authority to find whom to blame.  I actually understand him; his superiors would blame him unless he succeeds in finding another scapegoat.  So, it is now my mission to avoid being the scapegoat.  I hope I'll also succeed in protecting Freddy from such an undeserved verdict.

Part of the pain we feel in Operations is that things could have been much better had we known more about the clients' true needs. We are told not to be in touch with the clients, so all I'm able to know is what is written in the documents they submit.  We only see the name of the client, the name of the responsible product manager, and a list of specifications without much detail.  The product managers also know very little about the clients' true needs.  I suggested to Larry, the product manager responsible for MKM, that we deliver the first 50-60 units of the order as early as May 16th, and all the rest two weeks later.  The reason was that we had to split the order into two batches due to some technical difficulties.  Larry called them and got a refusal, but no explanation.

Why?

Why do I have to function under strict instructions whose rationale I fail to see?

Why do I meet the executives, to whom Dr. Logan reports, only at special public ceremonies?

All I see above me is Dr. Logan. The rest are located far away and there is no active dialogue between me and them.

I stay in this company, first of all, because I have a wife and two small daughters. I also think I'm doing a very good job.  I cannot prove it, but I think that under someone else, with less experience and technical knowledge, the MKM failures would triple.  Most of the time the procedures we have, and the willingness of my people to react to any signal that something might be wrong, maintain what I consider to be very good overall quality of operations.

But the ridiculous performance measurements make it look as if our performance is just moderate. The cost of our operations is, according to the measurements and the funny benchmarking they use, somewhat higher than the "average" of similar facilities.  This is so wrong that it is an insult!  If you don't even contemplate listening to us in order to understand what we do and why we do it this way, how do you expect us to improve?

I feel the situation is "us against them" – the people who do the job against the people who play the role of God and judge our performance, even though in some public speeches they say "we have to do much better", creating the wrong impression that they think they should improve as well. I don't think Dr. Logan believes he has to improve.  He has me and my people to improve, and it is just me who has to learn the lesson and make sure the MKM case never happens again.

So, here is a potential action plan. I shall argue that the whole incident happened because Sales intentionally concealed part of the detailed specifications of the order, being concerned we'd reject the order because we don't have the capability to do it right.  Larry, the product manager, told us that the chief salesperson hinted that MKM has the most sophisticated equipment in the world.  I didn't see the implications at the time, but it is evident now that such equipment allows more precise measurements, so it could be that the true specifications were not given to us because our equipment is unable to meet such precise specifications.

Does MKM really need more precise specifications for their products?

The fact is that 97 units passed the test. It means that our equipment is able to achieve the required specifications. But how can we test the final quality when our testing equipment is not the same as MKM's new testing equipment?

Human relationships as a part of the holistic approach in managing organizations

Management is about achieving results for the organization. The obvious meaning is that integrating the various organizational functions, like Sales, Operations, Finance and R&D, to achieve the best overall performance is what the CEO has to accomplish.   This is the holistic approach: integrating the parts into a whole entity.

What is the role of HR in this need for integration?

In every part of the organization there are people who are truly required for achieving the global objectives. Every resource has a set of capabilities and a certain capacity that limits the amount of output in a period of time.  When it comes to human resources, both capabilities and capacity are much more difficult to define and measure than for other types of resources, like machines, space and money.  But the limitations of both human capabilities and capacities play a considerable part in the performance of the organization.

The characteristic of human resources is such that, to properly utilize their capabilities, the right motivation has to be present. For instance, a salesperson is meeting a new potential client.  Would she do everything she can to bring the client in?  Is this true even when she isn't entitled to a special bonus for it?  Similar situations are a purchasing agent negotiating price and terms with a supplier, and a foreman who gets a special request to expedite an order, which requires extra effort.  Eventually all the above examples depend on the willingness of employees to help the company prosper.

People might cause unintentional damage by failing to act in the right way. This could happen because of being untrained for, or incapable of, doing the job properly.  Another cause is flawed procedures and measurements that push people to do what harms performance.  TQM, Lean, Six Sigma and TOC act in their different ways to fix that.

There are very few cases, though they cause huge damage, where employees intentionally harm the performance of the organization.  This could take the form of refraining from doing what is required, like a strike, or even taking concrete actions that disrupt the performance of the organization.  It is definitely the responsibility of top management to prevent such highly damaging cases, which stir emotions, like rage, that prevent win-win solutions.

In the vast majority of cases, employees simply do their job as they are told by their superiors.  When top management does a good job of integrating all the parts into synergetic performance, the results are positive; otherwise the employees cause damage, exactly because they follow instructions.

Can employees add high value that is far beyond just doing their job?

High-level employees, like executives and highly professional employees, are expected to make huge efforts beyond their job description. The question is: what is expected from the rest of the employees?

It is definitely possible that relatively lower-level employees know how to help the company do better. In most cases the employees decide to keep quiet, believing the boss would not listen to or appreciate their ideas.  Many employees feel that helping the company beyond the formal description of the job is a waste of their intellect.  This is a declaration of indifference:

“This is just my job, not my life. I’m not going to waste my intellect and special efforts for the organization that does not employ me for that purpose.”

So, the message for top management is that the employees might become a problem, either because they are not capable or because they are not motivated to do everything they can for the sake of the organization.

Henry Camp is the CEO of Shippers Supply Inc. and the owner of four other companies. Henry conducted a TOCICO webinar highlighting the 10 steps required to achieve the active collaboration of the employees with the company.  A special emphasis was on being ready to assist in implementing a change in the way the company operates.

Henry Camp's webinar focuses on what management should do to ensure that this indifference never happens, also preventing the damage of intentional acts stemming from employee frustration.

I recommend the reader watch the recording of the webinar on the TOCICO site. A somewhat shorter alternative is watching his 30-minute video on YouTube:  https://www.youtube.com/watch?v=4B0Azc6MNn0

I’d like to raise the issue of a CEO who is either new to the organization, or has not paid much attention to the culture of human relationships in the organization, and now realizes that the time has come to diagnose the current state.

How much effort should management dedicate to diagnosing problems with the motivation of their subordinates and solving them?

The objective is to find out whether the current performance of the organization is seriously harmed by the existing level of distrust between employees and management. In the terminology of TOC the actual question is:

Are internal human relationships the core problem of the organization?

The easy, but not always best, way is to invite organizational behavior consultants to do the diagnosis. The result is often a lack of focus, as the tendency is to come up with a long list of what needs to be fixed. The true damage to Throughput is usually left undefined.

There are two key organizational flows that determine the rate of achieving the goal. The first is the current flow-of-value to the customers. The second is the flow of initiatives to improve the flow-of-value. The concern regarding the impact of behavior on the current flow-of-value is that it creates blockages, thereby harming the reputation of the organization. The main concern regarding the initiatives is that people do not try hard enough to come up with great innovative ideas.

The chronic problem of organizational culture is not with individual employees; such problems are relatively easy to handle. The problem is when most employees radiate indifference toward achieving more of the goal.

Dealing with power groups within the organization is a situation that can easily become disastrous. Every airline has to manage its relationship with the pilots with extra care, while all the other groups are watching and might react to any change in the status quo. In hospitals the surgeons have extra dominance, and universities are run by the full-time professors. The balance between the power group, top management and the other groups is quite sensitive. It is possible to achieve a win-win for all the groups, but it is not easy to maintain it for a long time.

Can we apply rational cause-and-effect logic to diagnose existing or emerging behavioral problems and then find an effective win-win?

There is a common claim that people behave irrationally, and thus analyzing their behavior with rational logic is not effective. The argument is that we, human beings, often act on impulses, stirred by emotions, leading to behavior that seems irrational because the actual results for the person are bad. For instance, criminals behave in a way that eventually lands them in jail. The question is whether the decision to commit a crime is irrational from the perspective of the person committing it. Criminals could choose to satisfy their immediate desires in spite of possible negative consequences, because they judge being in jail less negatively than most people do.

Is human behavior often unpredictable?

If the behavior is the result of known causes, like the desire for dominance over other people, then logical analysis should lead to expected behavior that is in line with reality. Most of the time we predict the behavior of the people we communicate with well enough. This is also true for managers predicting the responses of their people, and vice versa.

When negative behavior of employees can be predicted, it might appear on one side of the core conflict of the organization. For instance, when management distrusts their employees, they could suspect that the employees would not cooperate in introducing a change. Such a conflict looks like this:

[Figure: the conflict cloud between management and employees]

Suppose that indifference is causing many undesired effects that reduce the potential of the organization to achieve more of the goal. Does that mean the indifference is the core problem? Or is the indifference a symptom, caused by another effect that also causes several other undesired effects?

Most of the time indifference is caused by the reluctance of management to trust their employees, or by poor performance of the organization that harms the morale of the employees and their trust in the management. Both causes have further negative ramifications for the organization. Having to control everything has a huge negative impact on management attention, and through that on the ability of the organization to grow.

This basic trust and sense of purpose should be carefully maintained by the management of the organization. When there is a change in the mutual trust between management and employees, newly emerging undesired effects should signal that such a change is happening. A drop in delivery performance to customers could be such a signal. Failing to meet commitments, reduced quality and an increase in customer complaints should be carefully watched and monitored. When such a change in the mutual trust is validated, the next step is to understand what happened to this sensitive balance. Understanding the causes of human behavior is very much needed at this stage. When such signals are not observed, the focus of management should be elsewhere: on what truly constrains the performance of the organization.
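
As an illustration of how such a signal could be watched, here is a minimal sketch in Python. Everything in it (the function name, the three-month window, the five-point threshold and the numbers) is our own hypothetical example, not a prescribed TOC tool:

    # Minimal sketch: flag a deterioration in monthly on-time delivery.
    # All names, windows and thresholds are illustrative assumptions.

    def deteriorating(monthly_on_time_pct, window=3, drop_threshold=5.0):
        """Signal when the average of the last `window` months falls more
        than `drop_threshold` percentage points below the prior average."""
        if len(monthly_on_time_pct) < 2 * window:
            return False  # not enough history to compare
        recent = sum(monthly_on_time_pct[-window:]) / window
        baseline = sum(monthly_on_time_pct[-2 * window:-window]) / window
        return baseline - recent > drop_threshold

    # A visible drop in the last three months raises the flag.
    history = [96.0, 95.5, 96.2, 95.8, 90.1, 88.7, 87.9]
    print(deteriorating(history))  # True -> investigate what changed

The code only raises the flag; understanding whether eroding mutual trust, or something else entirely, is behind the drop remains a human analysis.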

The importance of Big Data

By Amir and Eli Schragenheim

Is Big Data important? Can every organization draw considerable value from it?

Amir and I assumed that the ultimate answer of most people in management would be: Yes, there is a big potential, but there is also a problem of drowning in the ocean of data (Goldratt in The Haystack Syndrome).

Well, it seems that too many people think there is not much value to be found in Big Data. So maybe we, who think there is very substantial potential value, need to back up this assertion.

Big Data in its narrow form is the ability of every organization to store huge quantities of data relatively cheaply in the cloud, plus the software tools for extracting specific data from various databases and formats and organizing it in a way that allows the human manager to focus on what is truly relevant.

A much wider approach to Big Data includes the huge amounts of data from external sources that are freely accessible through the Internet. Google, Facebook and LinkedIn provide the tools to do this, and there are also public databases that allow searching and using their data for a certain cost.

It seems obvious that some organizations, certainly the bigger ones, are drawing a lot of value from Big Data, like the three big data manipulators mentioned above. Those giant organizations offer focused ways to advertise to a well-defined audience. Having the means to approach very specific market segments can also be used to gain knowledge of the preferences of their customers.

The e-commerce sector, especially digital stores, uses its own huge data, taken from everyone who enters the website, recording every move the user makes, to draw conclusions about what the customer is interested in. The analysis of this accumulated data opens a way not only to offer more to that customer with a good chance of selling, but also to win that customer for future deals. Beyond guessing the specific taste of every single customer, a generic understanding of groups of customers, like the role of price in their choices, can be established.
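
To make the idea concrete, here is a minimal sketch in Python of scoring a customer's interest from recorded website moves. The event types, weights and names are hypothetical assumptions of ours, not a description of any specific store's system:

    from collections import defaultdict

    # Hypothetical weights: a purchase signals more interest than a view.
    EVENT_WEIGHTS = {"view": 1.0, "add_to_cart": 3.0, "purchase": 5.0}

    def interest_by_category(events):
        """events: (customer_id, category, event_type) tuples, as recorded
        from every move a user makes on the site."""
        scores = defaultdict(lambda: defaultdict(float))
        for customer, category, event_type in events:
            scores[customer][category] += EVENT_WEIGHTS.get(event_type, 0.0)
        return scores

    events = [
        ("c1", "cameras", "view"),
        ("c1", "cameras", "add_to_cart"),
        ("c1", "tripods", "view"),
        ("c2", "laptops", "purchase"),
    ]
    for customer, categories in interest_by_category(events).items():
        best = max(categories, key=categories.get)
        print(customer, "->", best)  # the top category suggests what to offer

Real digital stores use far richer models, but the principle is the same: recorded moves, weighted and aggregated, become a guess about what the customer wants next.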

Physical retail stores invest much less effort in capturing data that would reflect their clients’ preferences, beyond the trivial analysis of actual sales. Without direct access to client information, and, even worse, without knowing what data could help them gain more sales, they are helpless. The retail stores lose a lot from their inability to collect the data they need to become more effective.

So, companies that have easy access to pretty straightforward relevant data find answers to critical questions and gain a lot of value. Other organizations don’t.

When a new technology, like the ability to store and analyze huge amounts of data, presents itself to the market, it raises two seemingly similar, but actually different, questions.

  1. Given the existence of the technology, can we utilize it to bring benefits?
  2. Given our current obstacles, does the new technology help us overcome them? If so, what are the benefits going to be?

Many organizations don’t immediately see the benefits of a major new technology, meaning their answer to the first question is NO.

However, we believe more effort should be invested in analyzing which obstacles the new technology might overcome. Currently the organization accepts them as hard facts of reality; but when the new technology is able to vastly reduce the limitation imposed by an obstacle, new opportunities could be identified.

Goldratt’s second question, of the Six Questions for assessing the value of a new technology, states:

What current limitation or barrier does the new technology eliminate or vastly reduce?

The obvious limitation of storage capacity is not the relevant answer to the above question, because the value of storing huge amounts of data is not clear and could easily lead to wasted effort. Likewise, reducing the slow and cumbersome work of collecting huge amounts of data and organizing it in a friendly, visible way does not by itself always add value.

But we always wish to have more relevant information on the critical issues the organization is dealing with. We never have perfect information when a decision has to be taken. So, decision making is always done under high uncertainty, due to variation plus unknown facts. While this basic situation will continue in the future, the unknowns could be significantly reduced if the right relevant information is collected and given to the decision makers.

Thus, Amir and I suggested the following limitation/barrier that the new IT technology reduces:

Not being able to get reliable answers to questions that require data which was previously either unavailable or inaccessible

For instance, what are the features that many customers miss in our current products?

It is possible to ask the customers such questions, and even store all the answers, but many customers simply refuse to answer, and maybe they do not know what they miss; yet once they see it, they will know. Can we answer the question by analyzing data on what caused certain products, from different sources, to suddenly become highly popular?
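
We do not claim a specific algorithm here, but as a sketch of the direction, a sudden rise in popularity can be spotted by comparing recent sales of each product to its own earlier baseline. Everything below (names, periods, the threshold factor, the numbers) is invented for illustration:

    def popularity_spikes(weekly_sales, recent_weeks=2, factor=3.0):
        """weekly_sales: {product: [units sold per week, oldest first]}.
        Flags products whose recent average is `factor` times their baseline."""
        spiking = []
        for product, series in weekly_sales.items():
            if len(series) <= recent_weeks:
                continue  # not enough history for a baseline
            baseline = sum(series[:-recent_weeks]) / len(series[:-recent_weeks])
            recent = sum(series[-recent_weeks:]) / recent_weeks
            if baseline > 0 and recent / baseline >= factor:
                spiking.append(product)
        return spiking

    sales = {
        "product_a": [10, 12, 9, 11, 10, 45, 52],   # suddenly popular
        "product_b": [20, 22, 19, 21, 20, 23, 21],  # steady
    }
    print(popularity_spikes(sales))  # ['product_a']

Spotting which products suddenly became popular is the easy, mechanical part; understanding what feature or circumstance made them popular still requires cause-and-effect thinking and intuition.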

Failing to answer critical questions is a key limitation for every company, and a search for the truly relevant data should, many times, yield new information that, together with an effective analysis, yields substantial value.

To clarify the sensitive connection between data and information, let’s recall the definition Goldratt gave to ‘information’ in his 1990 book ‘The Haystack Syndrome’:

Information is an answer to a question asked

The definition highlights two insights. One is the power of asking questions: in most cases you ask about something that bothers you, so the answer to the question is also an answer to a need.

The other insight is that in order to answer a question certain data is required, and through the question that data becomes information.

In order to manage an organization successfully, questions have to be asked, and each one of them is directed at highlighting a required aspect of one of two categories of managerial needs:

  1. Identifying new opportunities and how to draw the value from them
  2. Identifying emerging threats and how they could be eliminated or controlled

The first category is about new initiatives for success. The second category is about protecting your back.  Both are critical to every organization.

Goldratt’s third question is:

What are the current usage rules, patterns and behaviors that bypass the limitation?

Without the means of gathering data from many sources, decision makers still have to make decisions. The practice has to be based on the following elements:

  • Using the routine data from the ERP or legacy systems of the organization
  • Using the intuition of the key people in the organization closest to the specific topic
  • Employing a generally ultra-conservative approach, due to the unknowns and the perceived risk

The most important element is the use of intuition, based on one’s past experience. So, it is certainly relevant data, but its quality is questionable. The lack of objectivity, the various personal biases and the slowness to embrace any change comprise the problematic side of intuition.

Intuition will still play a big role in the future. However, analysis based on data that was unavailable before true Big Data can check the validity of the initial intuition (especially its hidden assumptions), and can also be the source of new insights that inspire new intuition. This could settle a new relationship between hard analysis and intuition.

TOC people argue that on top of intuition there should be cause-and-effect analysis, which enables great managers to speculate correctly even when actual data is minimal. This is sometimes true, but since all cause-and-effect logic is based on observed effects, which are not always true facts, even the most robust logic cannot deal with too many unknowns without data to rely on.

So, how could we improve our ability to spot new opportunities and emerging threats with the aid of the new IT capabilities of accessing huge amounts of data?

The big trap in using the new IT capabilities is losing focus: investing a huge amount of effort in searching for data and analyzing it, and eventually coming up with almost nothing. This is a real threat to many organizations.

The direction of solution we offer is building a high-level strategic process, run by a special team operating as a headquarters function, that follows these steps (a toy sketch in code appears after the list):

  1. Decide on a prioritized list of worthy objectives that are not satisfactorily achieved.
  2. For each objective, identify the key obstacle(s) and what is required to overcome them. We assume many of the obstacles are due to unknowns.
  3. Based on the above, come up with a prioritized list of specific questions that require good answers, answers which are currently not available with a reasonably high confidence level.
  4. Search for the specific data required to answer the questions. Many times the search is for external data, which is then imported into central internal storage.
  5. Generate the global picture of how to achieve more of the top objective. The answers to the questions are merged with cause-and-effect logic plus intuition to create possible alternatives for action. The final analysis is submitted to the decision makers.
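
To make the structure of the process tangible, here is the promised toy sketch in Python of the worklist behind steps 1 to 4. The classes, field names and priority rule are our own illustrative assumptions; the real process is managerial, not software:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Question:
        """Step 3: a specific question that requires a good answer."""
        text: str
        priority: int                    # 1 = most critical
        data_sources: List[str] = field(default_factory=list)  # step 4
        answer: Optional[str] = None

    @dataclass
    class Objective:
        """Step 1: a worthy objective not satisfactorily achieved."""
        name: str
        priority: int
        obstacles: List[str] = field(default_factory=list)      # step 2
        questions: List[Question] = field(default_factory=list)

    def open_questions(objectives):
        """The team's worklist: unanswered questions, ordered by the
        objective's priority first and the question's priority second."""
        pending = [(obj.priority, q.priority, q)
                   for obj in objectives
                   for q in obj.questions
                   if q.answer is None]
        return [q for _, _, q in sorted(pending, key=lambda t: (t[0], t[1]))]

    growth = Objective("Grow sales in segment X", priority=1,
                       obstacles=["unknown customer preferences"])
    growth.questions.append(Question("Which features do customers miss?", 1,
                                     data_sources=["reviews", "support logs"]))
    for q in open_questions([growth]):
        print(q.priority, q.text, q.data_sources)

Step 5, merging the answers with cause-and-effect analysis and intuition into a global picture, is deliberately left out: no code replaces that judgment.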

The above process is similar to what intelligence bureaus do for countries. The priorities and the means are clearly different. Countries’ most critical questions are about threats, with much less emphasis on opportunities, and their means of collecting data are usually illegal, used under special permit from the government.

Customizing the process for true business intelligence isn’t trivial. The big mistake of imitation is ignoring the basic differences. However, ignoring the similarities and the opportunity to learn from a well-established process is another huge mistake. Despite the differences in ethics, priorities and means, the basic need and the analysis tools are similar enough, and the emergence of Big Data gives the potential value a great chance of materializing.

What makes these efforts worth pursuing is the simple fact that the underlying new insights do not clash with any deep paradigm of big companies.

We, Amir and I, will be glad to take part in such an endeavor. We have delivered a webinar that goes deeper into analyzing the value of Big Data; the recording can be viewed on the TOCICO site, https://www.tocico.org/page/replay?.

In another post we intend to deal with the potential value of simulations for gaining new insights and answering very troubling questions. Like Big Data, and actually any new technology, simulations could bring huge value, but they require special care to avoid severe pitfalls.