What Can Brick-and-Mortar Retail Learn from Big Data?

By Amir and Eli Schragenheim

Physical stores face a wicked problem that keeps growing. An increasing number of customers buy through online shops, which offer lower prices and wider choice. Online shops face a huge business problem of their own: fierce global competition and no clear competitive edge other than price. This makes it even harder for physical stores to re-invent themselves and offer a real alternative to customers, who are offered wider variety online, with everything shipped home and at lower prices.

Part of the reason so many customers buy online is that e-commerce stores understand the specific tastes of customers better and approach them with lucrative suggestions. This understanding is achieved by investing huge efforts in gathering data from every user entering the site and performing sophisticated analysis of that data. These are ongoing efforts, so we can expect further improvements in steering customers to buy through the Internet.

How come physical stores do not take similar actions?

Physical stores have harder access to the relevant information. For instance, there is currently no good way to record customers who do not find the product they are looking for. The behavior patterns of the customers are not recorded. There might be video cameras in a store, but their objective is security, and other behavioral aspects are not considered. So, it seems there is not much physical stores can do to study their customers better.

This is a HUGE mistake.  There is plenty of available data that can be processed into valuable information, and there are ways to access more data, which could be used to yield even more information.

Customers coming to big stores, certainly supermarkets and drug-stores, often buy more than one item. This is an opportunity to learn about the possible relationships between different items, personal brand preferences, and the role of price in choosing among similar products.

The total purchase of a customer carries hidden information that testifies to the taste and economic level of that customer. To reveal the relevant information, an analysis has to be carried out, with the aid of statistics and machine learning (ML), to answer the key questions behind the most important decisions every retail store has to make:

  • What new items to hold, and what items to eliminate?
  • What is the relative importance of keeping perfect availability of an item?
  • What items should be placed close together?
  • What items could be used for promotions?
  • What additional services, like an on-site bakery, should be added (or removed)?

Inquiring into the total purchase of a customer reveals much more than looking at the sales of every item separately. The mix of items purchased at one time reveals wider needs, taste and economic behavior. When it is legally possible to identify the specific buyer, the previous purchases of the buyer can be analyzed to define in greater detail the market segment that customer belongs to. Part of the value of maintaining customer loyalty clubs is the ability to link different purchases, at different times, to the same client. Thus, a client profile can be deduced.

The most obvious outcome is mapping the customers into market segments, noting the differing characteristics of each segment. Inquiring into the purchases could highlight aspects of the customer's family: spouse and children, their approximate ages, their financial status and their preferences. These characteristics can be revealed through analysis of the purchases and the purchasing frequency. Certain preferences, like smoking or being vegetarian, can be identified. Together the key characteristics define several layers of the market segment.

Another important value that can be deduced from inquiring into purchases is the dependency between different items: when item X is purchased, there is a good chance that item Y is purchased as well. These dependencies are sometimes deduced intuitively by managers, and they definitely impact decisions concerning maintaining the availability of both items, their placement within the store and even the possibility of selling them together as a package. Understanding the linkages between products also helps in tracking changes of purchasing habits over time. For instance, when the economy goes down, how does it impact different segments? This is much more valuable than just watching the impact on individual products. We can expect a general shift to cheaper products, but knowing which brands are replaced by cheaper ones, and which segments make those changes more than others, is valuable for forecasting such shifts before the actual change in the economy takes place.
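To make the idea concrete, here is a minimal sketch, in Python, of how such pair dependencies can be extracted from transaction data with standard association measures (support, confidence, lift). The baskets, item names and thresholds are invented for illustration, not data from any real store:

```python
from collections import Counter
from itertools import combinations

# Invented example baskets; in practice these come from point-of-sale records.
baskets = [
    {"pasta", "tomato sauce", "parmesan"},
    {"pasta", "tomato sauce"},
    {"bread", "butter"},
    {"pasta", "parmesan", "wine"},
    {"bread", "butter", "jam"},
]

n = len(baskets)
item_counts = Counter(item for b in baskets for item in b)
pair_counts = Counter(pair for b in baskets
                      for pair in combinations(sorted(b), 2))

for (x, y), both in pair_counts.items():
    support = both / n                        # share of baskets with both items
    confidence = both / item_counts[x]        # P(Y in basket | X in basket)
    lift = confidence / (item_counts[y] / n)  # >1: X pulls Y beyond chance
    if lift > 1 and both >= 2:
        print(f"{x} -> {y}: support={support:.2f}, "
              f"confidence={confidence:.2f}, lift={lift:.2f}")
```

Pairs with high lift are candidates for joint placement, packages or coordinated availability, exactly the decisions mentioned above.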

The structure of typical purchases by different market segments would certainly inspire marketing moves that capitalize on that understanding. Analytical knowledge, translated into operational policies, would impact the performance of the various branches of the retail chain, as both the specific needs of each branch and some generic insights are recognized.

When every purchase of a specific client can be linked with the previous purchases of that client, then the frequency of buying could lead to initiatives to influence the content of a typical purchase of a specific market segment.

Developing a machine-learning (ML) module to better categorize the different segments the store serves should enhance both the marketing and the logistics of every retail store. There are always dilemmas in holding slow movers, given the logistical effort required to keep a slow mover available. Being exposed to the right priorities, by understanding the full financial impact of slow-mover sales, would lead to better decisions about what items to hold in stock. The relative value of a slow mover includes its impact on the sales of other products. Understanding the relative importance of that particular item to a specific segment helps determine the slow mover's impact on the desirability of the store in the eyes of that market segment.
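As a hedged illustration of the kind of ML module meant here, the sketch below clusters customers into segments from a few per-customer features. The features, the numbers and the segment count are all invented assumptions; a real implementation would use far richer linked purchase histories:

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented per-customer features derived from linked purchase histories:
# [average basket value, visits per month, share of premium brands]
customers = np.array([
    [25.0, 2.1, 0.10],
    [30.0, 1.8, 0.15],
    [80.0, 4.5, 0.60],
    [95.0, 5.0, 0.70],
    [55.0, 8.2, 0.20],
    [60.0, 7.5, 0.25],
])

# Normalize each feature so no single scale dominates the distance metric.
X = (customers - customers.mean(axis=0)) / customers.std(axis=0)

segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for features, seg in zip(customers, segments):
    print(f"segment {seg}: basket={features[0]:.0f}, "
          f"visits/month={features[1]:.1f}, premium share={features[2]:.2f}")
```

Each resulting segment can then be examined for its dependence on particular items, including slow movers.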

Through ML, retailers can gain a better understanding of customer loyalty to specific brands and items. Once it is established that a certain segment prefers item X to item Y, then by intentionally creating unavailability of X for one day it is possible to discover whether most clients from that segment bought Y or refrained from buying a replacement. It also answers the question of whether buying the replacement impacts brand loyalty. Promotions also cause people to buy less preferred items, and it is of major interest to know whether that impacts brand loyalty.

It is highly desirable to have access to information on the availability of all items at the time a certain purchase was made. When item X happens to be unavailable, it provides an opportunity to check whether the seemingly brand-loyal customers switch easily to the replacement.
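A minimal sketch of the stockout test described above, with invented daily sales numbers, shows how substitution and lost sales can be estimated once availability is recorded alongside sales:

```python
# Compare sales of the substitute Y, and of the category, on days item X was out.
daily_sales = [
    # (X available?, units of X, units of Y)
    (True, 40, 12), (True, 38, 15), (False, 0, 31),
    (True, 42, 13), (False, 0, 26), (True, 41, 14),
]

in_stock = [(x, y) for avail, x, y in daily_sales if avail]
stockout = [(x, y) for avail, x, y in daily_sales if not avail]

avg_x = sum(x for x, _ in in_stock) / len(in_stock)
avg_y_normal = sum(y for _, y in in_stock) / len(in_stock)
avg_y_stockout = sum(y for _, y in stockout) / len(stockout)

substituted = avg_y_stockout - avg_y_normal  # extra Y sold when X is out
lost = avg_x - substituted                   # demand that simply walked away
print(f"substitution per stockout day: {substituted:.1f} units")
print(f"estimated lost sales per stockout day: {lost:.1f} units")
```

A high lost-sales figure signals fragile loyalty to the store itself, not just to the brand.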

Supermarkets and drug-stores are typical retailers where a purchase usually includes several different items. It seems absolutely necessary that such retail chains invest in ML to learn more about their customers' habits and develop the process of making superior decisions that capitalize on that knowledge.


The Frustration of a Middle-level Manager – A short story by Eli Schragenheim

My boss, Dr. Christopher Logan, asked me to come to his office at 2pm sharp and report in detail how come the delivery to MKM did not include all 100 B12 units. I know how such meetings are conducted: it looks friendly enough, but actually there is nothing friendly about such a meeting. An attitude of "we are nice, understanding people" is supposed to hang in the air, but beneath the politeness you are on TRIAL: try to prove it is not YOUR incompetence that caused the trouble.

Such a hostile inquiry takes place every two months. Many things go wrong every day, but only a few receive this kind of treatment. It is usually because the client is very important, very big or new, and when such a client makes a complaint the incident becomes critical and somebody has to be blamed and punished.

MKM is both big and new. For whatever reason, getting 97 good units out of 100, exactly on the formal due-date, was not enough for them. I immediately promised to deliver the missing three units within five business days. Apparently this was not good enough, so a formal complaint landed on Dr. Logan's desk. I have no idea why the delivery of all 100 units on the promised day is so sensitive that 3% less is such a disaster.

I'm now preparing my defense, trying not to exaggerate too much the role of Freddy in the blunder. I know that almost any one of my people might have made the same mistake, and that mistake only partially led to missing the delivery. As MKM demanded 100 good units that would fully pass their test no later than June 1st, 2018, we decided to produce 120 units, since there is no viable way to identify such quality exceptions at an early stage of B12 production. Freddy's mistake was assuming that a temperature of less than 1 degree above the standard is still within the control limits. Usually this is right, but for the MKM specifications it is not. That small deviation impacted, at most, five units, because the temperature was fixed very fast. Five out of the extra buffer of twenty cannot be the only reason that only 97 units passed the MKM test. We, by the way, sent 101 units that passed our own test, rejecting 19 for a variety of reasons. How come four units that passed our test failed their similar test is an open question. No one, I repeat, NO ONE, has any explanation for this fact.

This is the situation I face, and I just hope it won't turn out as bad as last year's blunder, which led to the layoff of two good people. The charge was not paying enough attention to a rare and unfortunate incident that caused the breakdown of expensive equipment. In such cases Dr. Logan assumes the judicial authority to find whom to blame. I actually understand him; his superiors would blame him unless he succeeds in finding another scapegoat. So, it is now my mission to avoid being the scapegoat. I hope I'll also succeed in sparing Freddy such an undeserved verdict.

Part of the pain we feel in Operations is that things could have been much better if we had known more about the clients' true needs. We are told not to be in touch with the clients, so all I get to know is what is written in the documents the clients submit. We only see the name of the client, the name of the responsible product manager and a list of specifications without much detail. The product managers also know very little about the clients' true needs. I suggested to Larry, the product manager responsible for MKM, that we deliver the first 50-60 units of the order by May 16th and all the rest two weeks later. The reason was that we had to split the order into two batches due to some technical difficulties. Larry called them and got a refusal, with no explanation.

Why?

Why do I have to function under strict instructions whose rationale I fail to see?

Why do I meet the executives, to whom Dr. Logan reports, only at special public ceremonies?

All I see above me is Dr. Logan. The rest are located far away and there is no active dialogue between me and them.

I stay in this company, first of all, because I have a wife and two small daughters. I also think I'm doing a very good job. I cannot prove it, but I think that under someone else, with less experience and technical knowledge, the MKM failures would triple. Most of the time the procedures we have, and the willingness of my people to react to any signal that something might be wrong, maintain what I consider a very good overall quality of operations.

But the ridiculous performance measurements make it look as if our performance is just moderate. The cost of our operations is, according to the measurements and the funny benchmarking they use, somewhat higher than the "average" of similar facilities. This is so wrong that it is an insult! If you don't even contemplate listening to us in order to understand what we do and why we do it this way, how do you expect us to improve?

I feel the situation is "us against them": the people who do the job against the people who play God and judge our performance, even though in some public speeches they say "we have to do much better", creating the false impression that they think they should improve as well. I don't think Dr. Logan believes he has to improve. He has me and my people to do the improving, and it is just me who has to learn the lesson and make sure the MKM case never happens again.

So, here is a potential action plan. I shall argue that the whole incident happened because Sales intentionally concealed part of the detailed specifications of the order, being concerned we'd reject it for lack of the capability to do it right. Larry, the product manager, told us that the chief salesperson hinted that MKM has the most sophisticated equipment in the world. I didn't see the implications at the time, but it is evident now that such equipment allows more precise measurements, so it could be that the true specifications were not given to us because our equipment is unable to meet them.

Does MKM really need more precise specifications for their products?

The fact is that 97 units passed the test. It means our equipment is able to achieve the required specifications. But how can we test the final quality when our testing equipment is not the same as MKM's new testing equipment?

Human relationships as a part of the holistic approach in managing organizations

Management is about achieving results for the organization. The obvious meaning is that the CEO has to integrate the various organizational functions, like Sales, Operations, Finance and R&D, to achieve the best overall performance. This is the holistic approach: integrating the parts into a whole entity.

What is the role of HR in this need for integration?

In every part of the organization there are people who are truly required to achieve the global objectives. Every resource has a set of capabilities and a certain capacity that limits its output in a period of time. When it comes to human resources, both capabilities and capacity are much more difficult to define and measure than for other types of resources, like machines, space and money. But the limitations of both human capabilities and capacity play a considerable part in the performance of the organization.

A characteristic of human resources is that to properly utilize their capabilities the right motivation has to be in place. For instance, a salesperson is meeting a new potential client. Would she do everything she can to bring the client in? Is this true even when she isn't entitled to a special bonus for it? Similar situations are a purchasing agent negotiating price and terms with a supplier, or a foreman who gets a special request to expedite an order, which requires extra effort. Eventually all the above examples depend on the willingness of employees to help the company prosper.

People might cause unintentional damage by failing to act in the right way. This could happen because they are untrained for, or incapable of, doing the job properly. Another cause is flawed procedures and measurements that push people to do what harms the performance. TQM, Lean, Six Sigma and TOC act, in their different ways, to fix that.

There are very few cases, though they cause huge damage, where employees intentionally harm the performance of the organization. This could take the form of refraining from doing what is required, like a strike, or even taking concrete actions that disrupt the performance of the organization. It is definitely the responsibility of top management to prevent such highly damaging cases, which raise emotions, like rage, that prevent win-win solutions.

In the vast majority of cases employees simply do their job as told by their superiors. When top management is doing a good job of integrating all the parts into synergetic performance, then the results are positive; otherwise the employees cause damage, exactly because they follow instructions.

Can employees add high value that is far beyond just doing their job?

High-level employees, like executives and highly professional staff, are expected to make huge efforts beyond their formal job. The question is: what is expected from the rest of the employees?

It is definitely possible that relatively lower-level employees know how to help the company do better. In most cases the employees decide to keep quiet, believing the boss would not listen to or appreciate their ideas. Many employees feel that helping the company beyond the formal description of the job is a waste of their intellect. This is a declaration of indifference:

“This is just my job, not my life. I’m not going to waste my intellect and special efforts for the organization that does not employ me for that purpose.”

So, the message for top management is that the employees might become a problem, either because they are not capable or because they are not motivated to do everything they can for the sake of the organization.

Henry Camp is the CEO of Shippers Supply Inc. and the owner of four other companies. Henry conducted a TOCICO webinar highlighting the 10 steps required to achieve the active collaboration of the employees with the company. A special emphasis was on their being ready to assist in implementing a change in the way the company operates.

Henry Camp's webinar focuses on what management should do to ensure that this indifference never takes hold, thereby also preventing the damage of intentional acts of frustration by employees.

I recommend the reader watch the recording of the webinar on the TOCICO site. A somewhat shorter alternative is his 30-minute video on YouTube: https://www.youtube.com/watch?v=4B0Azc6MNn0

I'd like to raise the issue of a CEO who is either new to the organization, or has not paid much attention to the human-relationships culture in the organization, and now realizes that the time has come to diagnose the current state.

How much effort should management dedicate to diagnosing problems with the motivation of their subordinates and solving them?

The objective is to find out whether the current performance of the organization is seriously harmed by existing level of distrust between employees and management. In the terminology of TOC the actual question is:

Is the internal human relationship the core problem of the organization?

The easy, but not always the best, way is inviting organizational-behavior consultants to do the diagnosis. The result is often a lack of focus, as the tendency is to come up with a long list of what needs to be fixed. The true damage to Throughput is usually not defined.

There are two key organizational flows that determine the rate of achieving the goal. The first is the current flow-of-value to the customers. The second is the flow of initiatives to improve the flow-of-value. The concern regarding the impact of behavior on the current flow-of-value is that it creates blockages and by that harms the reputation of the organization. The main concern for the initiatives is not trying hard enough to come up with great innovative ideas.

The chronic problem of organizational culture is not with individual employees. Such problems are relatively easy to handle. The real problem is when most employees radiate indifference to achieving more of the goal.

Dealing with power groups within the organization is a situation that can easily become disastrous. Every airline has to manage its relationships with the pilots with extra care, while all the other groups are watching and might react to any change in the status quo. In hospitals the surgeons have extra dominance, and universities are run by the full-time professors. The balance between the power group, top management and the other groups is quite sensitive. It is possible to get a win-win for all the groups, but it is not easy to maintain it for a long time.

Can we apply rational cause-and-effect to diagnose existing or emerging behavioral problems and then find the effective win-win?

There is a common claim that people behave irrationally, and thus analyzing behavior with rational logic is not effective. The argument is that we, human beings, often act on impulses, stirred by emotions, leading to behavior that seems irrational because the actual results for the person are bad. For instance, criminals behave in a way that eventually lands them in jail. The question is whether the decision to commit a crime is irrational from the perspective of the person committing it. Criminals may choose to satisfy their immediate desires in spite of possible negative consequences, as they judge being in jail less negatively than most people do.

Is human behavior often unpredictable?

If the behavior is the result of known causes, like the desire for dominance over other people, then logical analysis should lead to expected behavior that is in line with reality. Most of the time we predict well enough the behavior of the people we communicate with. This is also true for managers predicting the response of their people, and vice versa.

When negative behavior of employees can be predicted, it might appear on one side of the core conflict of the organization. For instance, when management distrusts its employees, it may suspect that the employees would not cooperate in introducing a change. Such a conflict looks like this:

[Conflict cloud diagram: management versus employees]

Suppose that indifference is causing many undesired effects that reduce the potential of the organization to achieve more of the goal. Does it mean the indifference is the core problem? Or is the indifference a symptom, caused by another effect that causes several other undesired effects as well?

Most of the time indifference is caused by the reluctance of management to trust their employees, or by poor performance of the organization that harms the morale and the employees' trust in management. Both causes have further negative ramifications for the organization. Having to control everything has a huge negative impact on management attention, and through that on the ability of the organization to grow.

This basic trust and sense of purpose should be carefully maintained by the management of the organization. When there is a change in the mutual trust between management and employees, newly emerging undesired effects should signal that such a change is happening. A drop in delivery performance to customers could be such a signal. Failing to meet commitments, reduced quality and an increase in customer complaints should be carefully watched and monitored. When such a change in the mutual trust is validated, the next step is to understand what happened to this sensitive balance. Understanding the causes of human behavior is very much needed at this stage. When such signals are not observed, the focus of management should be elsewhere: on what truly constrains the performance of the organization.

The importance of Big Data

By Amir and Eli Schragenheim

Is Big Data important? Can every organization draw considerable value from it?

Amir and I assumed that the ultimate answer of most people in management would be: yes, there is a big potential, but there is also the problem of drowning in the ocean of data (Goldratt in The Haystack Syndrome).

Well, it seems too many people think there is not much value to be found in Big Data. So maybe we, who think there is very substantial potential value, need to back up this assertion.

Big Data in its narrow form is the ability of every organization to store huge quantities of data relatively cheaply in the cloud, together with software tools for extracting specific data from various databases and formats and organizing it in a way that allows the human manager to focus on what is truly relevant.

A much wider approach to Big Data includes the huge amounts of data from external sources that are freely accessible through the Internet. Google, Facebook and LinkedIn provide tools to do this, and there are also public databases that allow searching and using their data for a certain cost.

It seems obvious that some organizations, certainly the bigger ones, draw a lot of value from Big Data, like the three big data manipulators mentioned above. Those giant organizations offer focused ways to advertise to well-defined audiences. Having the means to approach very specific market segments can be used to gain knowledge of the preferences of their customers.

The e-commerce sector, especially digital stores, uses its own huge data store, recording every move of everyone who enters the website, to draw conclusions about what each customer is interested in. The analysis of this accumulated data opens a way not only to offer more to that customer, with a good chance of selling, but also to win that customer for future deals. Beyond guessing the specific taste of every single customer, a generic understanding of groups of customers, like the role of price in their choices, can be established.

Physical retail stores invest much less effort in capturing data that would reflect clients' preferences, beyond the trivial analysis of actual sales. Without direct access to client information, and even worse, without knowing what data could help them gain more sales, they are helpless. The retail stores lose a lot from this failure to collect the data they need to become more effective.

So, companies that have easy access to pretty straightforward relevant data find answers to critical questions and gain a lot of value. Other organizations don't.

When a new technology, like the ability to store and analyze huge amounts of data, presents itself to the market it raises two seemingly similar, but actually different, questions.

  1. Given the existence of the technology can we utilize it to bring benefits?
  2. Given our current obstacles – does the new technology lead us to overcome them? If so, what are the benefits going to be?

Many organizations don’t immediately see the benefits of a major new technology, meaning their answer to the first question is NO.

However, we believe more effort should be invested in analyzing which obstacles the new technology might overcome. Currently the organization accepts these obstacles as hard facts of reality, but when the new technology is able to vastly reduce the limitation imposed by an obstacle, new opportunities can be identified.

Goldratt's 2nd question, of the Six Questions for assessing the value of a new technology, states:

What current limitation or barrier does the new technology eliminate or vastly reduce?

The obvious limitation of storage is not the relevant answer to the above question, because the value of storing huge amounts of data is not clear and could easily lead to wasted effort. Likewise, reducing the slow and cumbersome process of collecting huge amounts of data and organizing them in a friendly, visible way does not by itself add value.

But we always wish to have more relevant information on the critical issues the organization is dealing with. We never have perfect information when a decision has to be taken. So decision making is always under high uncertainty, due to variation plus unknown facts. While this basic life situation will continue in the future, the unknowns could be significantly reduced if the right relevant information is collected and given to the decision makers.

Thus, Amir and I suggested the following limitation/barrier that the new IT technology reduces:

Not being able to get reliable answers to questions that require data which was previously either unavailable or inaccessible.

For instance, what are the features that many customers miss in our current products?

It is possible to ask the customers such questions, and even store all the answers, but many customers simply refuse to answer, and many do not know what they miss until they see it. Can we answer the question by analyzing data on what caused certain products, from different sources, to suddenly become highly popular?

Failing to answer critical questions is a key limitation for every company, and a search for the truly relevant data should often yield new information that, together with an effective analysis, produces substantial value.

To clarify the sensitive connection between data and information, let's recall the definition Goldratt gave to 'information' in his 1990 book 'The Haystack Syndrome':

Information is an answer to a question asked

The definition highlights two insights. One is the power of asking questions: in most cases you ask about something that bothers you, so the answer to the question is also an answer to a need.

The other insight is that in order to answer a question certain data is required, and through the question that data becomes information.

To manage an organization successfully, questions have to be asked, and each one of them is directed to highlight a required aspect of one of two categories of managerial needs:

  1. Identifying new opportunities and how to draw the value from them
  2. Identifying emerging threats and how they could be eliminated or controlled

The first category is about new initiatives for success. The second category is about protecting your back.  Both are critical to every organization.

Goldratt’s third question is:

What are the current usage rules, patterns and behaviors that bypass the limitation?

Without the means to gather data from many sources, decision makers still have to make decisions; in practice they have to rely on the following elements:

  • Using the routine data from the ERP or legacy system of the organization
  • Using the intuition of the key people in the organization closest to the specific topic
  • Employing a generally ultra-conservative approach, due to the unknowns and the perceived risk

The most important element is the use of intuition, based on one's past experience. So it is certainly relevant data, but its quality is questionable. The lack of objectivity, the various personal biases and the slowness to embrace any change comprise the problematic side of intuition.

Intuition will still play a big role in the future. However, analysis based on data that was unavailable before true Big Data can check the validity of the initial intuition (especially its hidden assumptions), and can also be a source of new insights that inspire new intuition. This could settle a new relationship between hard analysis and intuition.

TOC people argue that on top of intuition there should be cause-and-effect analysis, which enables great managers to speculate correctly even when actual data is minimal. This is sometimes true, but as all cause-and-effect logic is based on observed effects, which are not always true facts, even the most robust logic cannot deal with too many unknowns without data to rely on.

So, how could we improve our ability to spot new opportunities and emerging threats with the aid of the new IT capabilities of accessing huge amount of data?

The big trap in using the new IT capabilities is losing focus: investing a huge amount of effort in searching for data and analyzing it, and eventually coming up with almost nothing. This is a real threat to many organizations.

The direction of the solution we offer is building a high-level strategic process, run by a special team operating as a headquarters function, that follows these steps:

  1. Decide on a prioritized list of worthy objectives that are not satisfactorily achieved.
  2. For each of the objectives, identify the key obstacle(s) and what is required to overcome them. We assume many of the obstacles are due to unknowns.
  3. Based on the above, come up with a prioritized list of specific questions that require good answers, which currently are not available with a reasonably high confidence level.
  4. Search for the specific data required for answering the questions. Often the search is for external data, which should then be imported into a central internal store.
  5. Generate the global picture of how to achieve more of the top objective. The answers to the questions are merged with cause-and-effect logic plus intuition to create possible alternatives for action. The final analysis is submitted to the decision makers.

The above process is similar to what intelligence agencies do for countries. The priorities and the means are clearly different. Countries' most critical questions are about threats, with much less emphasis on opportunities, and their means of collecting data are usually ones that would be illegal without a special permit from the government.

Customizing the process for true business intelligence isn't trivial. The big mistake of imitation is ignoring the basic differences. However, ignoring the similarities, and the opportunity to learn from a well-established process, is another huge mistake. Given the differences in ethics, priorities and means, the basic need and the analysis tools are similar enough, and the emergence of Big Data gives the potential value a great chance of being materialized.

What makes these efforts worth going after is the simple fact that the underlying new insights do not clash with any deep paradigm of big companies.

We, Amir and I, will be glad to take part in such an endeavor. We have delivered a webinar on the topic that goes deeper into analyzing the value of Big Data. The recording can be viewed on the TOCICO site, https://www.tocico.org/page/replay?.

In another post we intend to deal with the potential value of simulations for gaining new insights and answering very troubling questions. Like Big Data, and actually any new technology, simulations could bring huge value, but they require special care to avoid severe pitfalls.

The Challenge of Facing Complexity and Uncertainty

Mickey Granot has published a very interesting article entitled "The 3 mistakes that prevent exploiting your business potential"; see https://www.linkedin.com/pulse/3-mistakes-prevent-exploiting-your-business-potential-mickey-granot/?published=t. The mistakes Mickey has identified are:

  1. Spreading management attention too thin.
  2. Misunderstanding the customer.
  3. Misusing measures.

I agree that each of the three mistakes has a major negative impact on exploiting the current capabilities and capacity of the vast majority of businesses. I think there is a core problem causing management to repeat the above mistakes all the time:

The constant fear of negative consequences from changes that look promising

The fear is invoked by the inherent complexity coupled with uncertainty. There are simply too many unknown facts about every proposed idea that could, maybe, generate more throughput (T) without significant additional operating expenses (OE). The difficulty of handling complexity coupled with uncertainty is the key obstacle for every manager. The fear is partially on behalf of the organization and partially due to the potential personal negative consequences of a "failure".

Example: offering a variety of packages of regular products with a price tag that is 10% less than the regular price. The idea is, first of all, to combine products that aim at the same end-consumer. Another parameter is combining fast movers with medium movers, thereby expanding the market of the medium movers. Another aspect is the ability to use excess capacity of most resources, even when the organization has to add overtime on the weakest link. The idea is that the resulting delta-T would be much larger than delta-OE. For instance, publishers can offer packages of several books by a known author. It is known in this market that while the newest book of a famous writer sells very well, the previous books now sell much less and might not even be available on the shelf. Offering a package of the newest book coupled with the first book by that author could be relevant to fans of that writer who missed the older book. A rough sketch of the delta-T versus delta-OE arithmetic appears below.
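Here is a back-of-the-envelope check of the package idea. All the prices, costs and volumes are invented assumptions for illustration, including the guess about which sales the package cannibalizes:

```python
# T per unit = price - truly-variable cost (printing, royalties, shipping).
new_book = {"price": 20.0, "tvc": 6.0}
old_book = {"price": 15.0, "tvc": 4.0}

package_price = 0.9 * (new_book["price"] + old_book["price"])  # 10% off
package_tvc = new_book["tvc"] + old_book["tvc"]
t_per_package = package_price - package_tvc

# Assume each package sale replaces one full-price sale of the new book only,
# since the older title was hardly selling anyway.
cannibalized_t = new_book["price"] - new_book["tvc"]
delta_t_per_package = t_per_package - cannibalized_t

packages_sold = 5000   # assumed extra demand triggered by the offer
delta_oe = 8000.0      # assumed overtime on the weakest link
delta_t = packages_sold * delta_t_per_package
print(f"delta-T = {delta_t:,.0f}, delta-OE = {delta_oe:,.0f}, "
      f"net = {delta_t - delta_oe:,.0f}")
```

The arithmetic is trivial; the fear comes from the assumptions behind the numbers, which is exactly the point of this post.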

How would managers approach such an idea? It is not a priori clear how much more sales will be generated this way and what the impact on the bottom line will be, taking into account the reduced price of the package, meaning significantly reduced throughput per copy.

So, a decision to test such an idea very carefully and over a long time seems reasonable. In practice it means introducing a very small number of packages and monitoring their sales. The result is that the impact on the bottom line is usually not so clear. So management, while giving the idea very limited attention, needs to try several other new ideas at the same time. The unavoidable result is spreading management attention very thin. This is one effect caused by the basic fear of uncertainty.

The causalities behind the second mistake are trickier to fully understand. How come we frequently fail to recognize the value as perceived by the customer? When the customer is an organization, we can assume the generated value is based on the practical needs of that organization. Understanding the business of the customer should guide the supplier to identify the true needs and thereby gain major insights into how the products/services could be more valuable. The problem is that such an understanding is not common at all; most marketing people have very little knowledge of the business of their customers, because of two key obstacles.

  1. Analyzing a different business from afar seems too complex and hence uncertain.
  2. The current tool for understanding how the customer values the products/services is to analyze the complaints raised by the customer. This proves to be a very partial and problematic tool, which gives prominence to secondary elements and ignores the more critical ones, sometimes just because the customer does not expect the supplier to be able to deal with the really missing, or flawed, element. Yet, having some practice seems good enough to many.

When it comes to the end-consumer, understanding the value of the product is even tougher, because the consumer often sees value that is not practical. For instance, taste preferences, or the aesthetics of the product design, cannot be logically defined by objective attributes. I wrote in the past about the three categories of value; see https://elischragenheim.com/2015/08/03/the-categories-of-value/.

Coming back to the example, the creation of the right packages has to be based on a good understanding of the perceived value to the customer; even the question of whether a 10% price reduction is a good enough cause for buying a whole package depends on the customer's overall perception of value.

The fear of negative consequences causes organizations to be very careful, especially with intuition-based assumptions about the external world, like customers and also vendors. Understanding the end-consumer is difficult because analyzing hard data is not sufficient. Some logical analysis is certainly required. But even then, several not fully proven assumptions have to be in place in order to understand the end-consumer and be able to predict, reasonably, the reaction to certain moves. The fear of failing to predict the behavior of customers limits the efforts to create a 'theory' of the true needs of specific market segments; this prevents the actual test of the 'theory', and thereby many powerful opportunities, possibly much worthier than the current ideas, are missed.

The use of performance measurements to measure people is a clear announcement of mistrust, created by the fear of failure. Measurements are definitely required for diagnosing emerging problems and as necessary inputs to decision making. A wickedly flawed part is to assume that the measurements reflect the capabilities and motivation of the people in charge. This lack of confidence in people leads to many local performance measurements, and we know how distorting those are. See also my previous post, https://elischragenheim.com/2018/03/30/the-problem-with-performance-measurements-and-how-to-deal-with-them/.

It is my view that, eventually, fearing complexity and uncertainty is the ultimate core problem of the vast majority of organizations. Only very small organizations, where everyone knows everybody else well, are able to overcome the obstacle of fear of potential negative outcomes of each specific decision or action.

While TOC provides us with great tools to manage common and expected uncertainty in the key TOC applications for Production, Project Management and Distribution, the Pillars of TOC relate to handling uncertainty only indirectly. Humberto Baptista has already offered to include a fifth pillar covering the need to live with uncertainty and to handle it effectively, actually using it in order to truly flourish. Humberto's verbalization is: "Optimizing in the noise increases the noise." This insight, which is also part of Dr. Deming's methodology for quality, should lead to the realization that in order to improve one has to beat the natural variability.

We should come up with a detailed approach to "managing expectations" that includes full recognition of uncertainty, and by that reduces the fear and lets people, managers and executives included, exploit their own capabilities.

The problem with Performance Measurements and how to deal with them

Dr. Goldratt's famous saying, "Tell me how you measure me and I'll tell you how I'll behave", shows one dark side of any measurement: it impacts the behavior of the people involved, actually the behavior of the whole system. The hope of management is that the impact will be positive: people will do their best to achieve the best results. Unfortunately, in most cases the opposite happens. To illustrate another problematic side we don't always pay attention to: when the prime measurement is making money, some managers might break the law and other moral rules in their quest for making more money. The 2008 crisis is just an example of the impact of money as a prime measurement.

The Theory of Constraints (TOC) went deep into the clash between the local performance measurements and the global ones, showing how the local disrupt the global. This is certainly one of the most concerning issues, and a lot has been written and presented on it. However, performance measurements pose several additional negative branches (potential negative consequences).

The key objective of performance measurements is showing a full picture of the current performance in order to guide the actions required for improved performance.

A devastating side of all performance measurements is their personal interpretation. Management, and actually many lower-level people in the organization, are judged by these measurements as having succeeded or failed in their job. This linkage causes the devastating effects that lie behind the famous comment by Goldratt. I like to state an effect that looks obvious to me:

Performance measurements, at best, represent the current state; they do NOT answer the question “how come?”

In order to conclude what to do next, performance measurements, expressing the current state, are absolutely necessary, but they are definitely not sufficient. An analysis has to be carried out to explain the results. I'm aware that "explanations for poor results" have a bad reputation, but that is part of the big problem. The poor results have to be openly recognized in order to identify the core cause. An explanation like "the people involved were dumb" should lead to the immediate question of how come incompetent people were given that particular job!

What makes performance measurements even more problematic is the tendency to set a target for each of them. There are two basic negative characteristics of determining targets:

  1. Parkinson's Law claims that "work expands so as to fill the time available for its completion". The same law applies to any quantitative target. The simple rationale is: outperforming the target is bad for the future of the individual, because next time the target will be set higher. So the best case is to reach the target, no more and no less. Almost all means are allowed, including lowering the target to be reasonable, and then definitely not trying to achieve more. I have seen several cases where the organization claimed that 90% of the tasks finish exactly on time. This statistically impossible result is evidence that Parkinson's Law works; a small simulation after this list illustrates the point.
  2. Determining the target is a problematic issue in itself. Sometimes targets are determined by hopes and prayers. Sometimes there is a certain rationale for the top target, but then all the lower levels are given targets that are, more or less, arbitrary, with the sole requirement that they support the higher level. These lower-level targets are the ones middle-level managers try their best to restrain.
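The simulation promised above: a small Monte Carlo, under the assumption of lognormal task durations (the deadline, spread and tolerance window are all illustrative assumptions), showing why a 90% "exactly on time" rate cannot come from natural variability alone:

```python
import random

random.seed(1)
N = 100_000
deadline = 10.0  # days; the median duration is tuned to match the deadline

on_time_exactly = 0
for _ in range(N):
    # Lognormal durations with median ~10 days (mu = ln(10)).
    duration = random.lognormvariate(mu=2.302585, sigma=0.25)
    if 0.98 * deadline <= duration <= deadline:  # "exactly on time" window
        on_time_exactly += 1

print(f"fraction finishing exactly on time: {on_time_exactly / N:.1%}")
# Prints roughly 3%; an observed 90% therefore implies the work is being
# paced to the target, as Parkinson's Law predicts.
```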

The idea behind setting targets is to determine the "success" or "failure" of the people involved. On one hand the idea is to push people to excel. On the other hand it creates fear, mistrust and manipulation. This is a typical generic conflict. The basic assumption, that without clear quantitative targets people would not do everything they can to accomplish their missions at the highest level, is, to my mind, flawed. The tricky point is that it is a self-fulfilling prophecy. When people are used to targets, removing the targets leaves them wondering what they should do, which drives them to do a little, but definitely not too much. Only a very clear message from management would make a change, and it would take time to be believed.

Another problematic side of performance measurements is their dependency on time periods. Suppose that this year the organization has to produce considerable stock because of an expected peak in demand at the start of next year. The annual T is relatively low, while the OE, maybe containing overtime, is relatively high. Next year T will get much higher. Question: would management be aware of the causality? If not, would Operations support producing stock for next year, when the TOC accounting practices do not reward any increase in inventory?
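A tiny worked illustration of this period-dependency, with invented numbers, makes the distortion visible:

```python
# Year 1: build stock for a demand peak at the start of year 2.
year1 = {"T": 1_000_000, "OE": 900_000}   # overtime inflates OE, T is flat
year2 = {"T": 1_600_000, "OE": 850_000}   # the pre-built stock turns into T

for name, y in (("year 1", year1), ("year 2", year2)):
    print(f"{name}: T - OE = {y['T'] - y['OE']:>9,}")

total = year1["T"] + year2["T"] - year1["OE"] - year2["OE"]
print(f"both years together: {total:,}")
# Judged year by year, year 1 looks poor; judged across the whole cycle,
# the decision to build stock was clearly right.
```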

How can we deal with the negative ramifications of performance measurements?

I think this is the critical question for any organization striving to become ever-flourishing. To call the solution "Leadership" is to underestimate the obstacles, relying on a vague term as if it were a solution. I think the answer is implementing a structure for management decision making where predictions are based on ranges, rather than a blind commitment to a single number, where the potential risks are openly discussed, and where the management team eventually reaches consensus, until the next management meeting where the actual signals are observed and the discussion might be reopened. This should be a procedure that does not depend too much on the charisma of a specific leader.

In short, a procedure that truly respects uncertainty, recognizing mistakes without automatic blaming and trying to correct them, is a solution that could work.

Throughput (T), Operating Expenses (OE) and the critical connection to Capacity – the key to decision making

T represents the added value generated by the organization. Operating expenses (OE) represent the financial cost of providing the capacity, for all the required resources with the appropriate capabilities, that is needed for generating the value to customers which yields the T.

Confused? Read it again; this comprises most of the data truly required for managerial decisions. The division between T, which is focused on sales data, and OE, focused on the internal resources, is of immense simplifying value for all managerial decisions.

Here is a rough diagram:

[Diagram: OE provides the capacity of the resources, which generate value to customers, yielding T]

I apologize for the poor graphics; I’m not very good with the use of graphical tools.

OE is just the cost of providing capacity. The goal is to have Throughput (T) much bigger than OE, and then to find the way to grow T faster than OE. That should be the sole objective of every single decision taken by any manager in the organization. There might be difficulties in doing the analysis, but the objective is the same. T for business organizations is defined as Revenues minus the Truly-Variable-Costs (TVC). The truly variable costs are those that occur with every single sale. So T is the added value as measured by the customers, who are willing to pay the price. But the value for customers also includes what others, who are not part of the organization, have contributed.

Thus, T is the true performance measurement of what the organization succeeded to achieve. OE is what the organization has to pay in order to achieve the T.

Well, I should have also included 'I', standing for 'Investment', as the part of the capital invested to make achieving the T possible. But I think there is no conceptual difference between 'I' and 'OE'. The difference is the time frame: 'I' refers to expenses that stretch beyond one year. There are mechanisms to convert multi-year expenses into an equivalent stream of annual expenses, and these become part of the OE. So a $10M machine, which is supposed to work for 10 years, represents an annual expense of about $1.1M, or whatever conversion rate you think is appropriate.
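The conversion mentioned above is standard annuity arithmetic (the capital-recovery factor); here is a minimal sketch, where the 2% interest rate is my own assumption, chosen so the result lands near the $1.1M per year the text mentions:

```python
def annual_equivalent(investment: float, years: int, rate: float) -> float:
    """Convert a one-time investment into an equivalent stream of annual
    expenses, using the standard capital-recovery factor."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return investment * crf

# The $10M machine over 10 years from the text.
print(f"${annual_equivalent(10_000_000, 10, 0.02):,.0f} per year")
```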

A comment on a minor complication: originally Goldratt defined 'I' as 'Inventory'; he moved to the more generic term later. What is slightly missing in the rough chart above is that the materials being purchased are in a temporary state of Inventory (part of Investment) until they either become part of T or, when scrapped, part of OE. I don't think it really complicates the simple picture.

The key point is to understand that OE is the critical enabler for generating T. OE being made of many individual items creates a technical problem in predicting how much OE would support future T. For instance, initiatives to double the current level of T might require an additional delta-OE that could be more, or much less, than the current level of OE.

The majority of management decisions are about growing, or just maintaining, the current level of T. After all, Sales is about achieving T, and the efforts of Operations are aimed at delivery. But there is constant pressure to reduce OE, mainly because OE represents an ongoing threat to the organization: you have to pay the OE whether or not you made enough T. The tricky point of saving OE is that in most cases the negative impact on T is ignored. The emphasis on T makes you aware that you need to be very careful not to reduce T.

So we have to understand the dependencies between T and OE, and they look very complicated, because OE is about the capacity of so many different, seemingly independent, items.

TOC, through Throughput Accounting plus an understanding of the full impact of the five focusing steps, the role of buffers in planning and buffer management in execution, gives a much simpler answer to the connection between OE and T.

Critical insight #1: It is enough that one resource is overloaded, receiving more load than its available capacity, to seriously harm the expected T, unless significant additional OE is added.

Critical insight #2: There is a real need to maintain protective capacity, a certain amount of excess capacity, in order to provide enough flexibility to overcome market fluctuations and other types of uncertainty. There is no safe formula to calculate the required protective capacity precisely, so a conservative assessment is required, followed by the appropriate feedback to ascertain that it is enough.

Critical insight #3: Every internal resource has a finite capacity covered by a portion of OE, but many times there are temporary ways to increase capacity for a cost, usually much more expensive per unit of capacity than the regular available capacity. Such means could be part of the protective capacity, but their real value is in allowing the organization to take opportunities that clearly require more capacity than the current OE covers. That means delta-OE has to be considered and compared to the expected delta-T.

Any decision that deals with ways to increase T has to analyze the possibility that one or more of the critical resources would be overloaded, and if so, find a way to either reduce other sales or increase the capacity of the specific resource(s).
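A minimal sketch of that capacity check, with all resource names, loads and costs invented for illustration, shows how the three insights combine into a delta-T versus delta-OE comparison:

```python
# Each resource: available regular hours and cost of extra (overtime) hours.
resources = {
    "assembly": {"available_h": 400, "overtime_cost_per_h": 120.0},
    "packing":  {"available_h": 300, "overtime_cost_per_h": 80.0},
}

# Hours the new opportunity adds on top of the current load.
current_load = {"assembly": 350, "packing": 240}
added_load   = {"assembly": 80,  "packing": 40}

delta_t = 25_000.0   # assumed throughput the opportunity would add
delta_oe = 0.0
for name, r in resources.items():
    overload_h = max(0, current_load[name] + added_load[name] - r["available_h"])
    delta_oe += overload_h * r["overtime_cost_per_h"]
    if overload_h:
        print(f"{name} overloaded by {overload_h} h -> overtime needed")

print(f"delta-T = {delta_t:,.0f}, delta-OE = {delta_oe:,.0f}, "
      f"net = {delta_t - delta_oe:,.0f}")
# Rerun with pessimistic and optimistic demand figures to respect the
# uncertainty, as the following paragraphs argue.
```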

The cost of capacity changes in a stepwise way, which makes the behavior of OE clearly non-linear. One might see this as a complication, and it really makes the whole notion of "per-unit" measurements unusable in reality. But when the full impact of uncertainty is recognized, simulating 'what-if' scenarios can reveal when the connection between T and OE is clear enough to support a decision, and when there is doubt.

Another realization is that ideas for increasing T are usually significant, and their expected impact, both on Sales/Throughput and on the required capacity, is far from deterministic. So both the conservative, realistic possibilities and the more optimistic ones have to be carefully checked.

Another insight: when judging the impact of an idea on sales, it seems that if the conservative assessment of the impact is already good, there is no need to check the option that the impact would be far greater. This is a mistake! When the market reacts very favorably, more problems in capacity, causing delays in delivery, have to be taken into careful consideration. So there are clear possible negative impacts of succeeding too well. It can be called "the curse of blessing", an interesting insight I heard from Shimon Pass. It is a devastating trap if you are not aware of it.

Is the above “simple”?

I think it is as simple as we can get when we strive to be right most of the time.

People who would like to know more about what I have briefly outlined above can ask me for a presentation and demo of Throughput Economics, a detailed methodology for evaluating decisions aimed at growing T much more than OE.