While at Motorola we stretched the bounds of Six Sigma. We took the techniques into sales and marketing; at one point I even worked with our lobbyists in Washington DC. I recall talking to a marketing executive who was working on his Black Belt. He explained how his team had set up a designed experiment across four retail locations with similar sales and similar demographics. They tested different rebates, discounts, and promotional activities and measured the response in sales. I asked, “Did the experiment work? Did you find significant effects?” He smiled wryly and said, “Yes… but we didn’t actually learn anything new.”
This is my 9th article discussing data and the need for more fact-based decision making. Too often I see leaders and decision makers relying merely on previous experience or shared organizational wisdom to make decisions. This is because collecting data is hard work. I always remind my Green Belts that measurement is not free; there must be a return on the investment. While I believe we undervalue taking the extra time to collect data, I also recognize that data will not tell us everything, and often it confirms what we already know. My friend in marketing at Motorola already knew the impact of the different marketing strategies he employed. The experiment quantified these relationships, but from his perspective, the practical value of that quantification was too low. The model provided no more clarity than a leader with solid experience already had.
Data is powerful, but it has limitations. We should always start with observation and fact collection, but data will never remove the need for judgment based on experience. Here are my guiding rules:
- Start by collecting something.
- Think about the cost of being wrong.
- Make a decision to collect more data, experiment, or implement the change.
- Leadership is making decisions in the face of uncertainty.
Start by collecting something. I recommend reading the book “How to Measure Anything” by Doug Hubbard. One of the premises of his book is that even a handful of data points can be extremely useful. Much of what we experience in business is highly repetitive, even if the repetition only occurs a few times a year. One or two examples can be enough information to better understand the phenomenon of interest. Basic observation does not cost much, nor does interviewing the people involved in the process. Go to the Gemba and collect some basic information. This may also help you understand how much it will cost to perform a deeper dive.
The amount of work you do in analysis must always be balanced against the cost of being wrong or the cost of running an experiment. While working with a large quick-serve company, we considered procedural sequencing changes that would impact 15,000 locations. We spent a month performing analysis and two days testing the operation at a test site. Implementing the change would be expensive, requiring layout changes and training at every location, so the decision demanded plenty of data and analysis. The analysis was well worth the value it returned. Alternatively, a manufacturing client was interested in reducing changeover times on one of their machines. After a few hours of observing the process, we judged the production impact of the change to be low, so we implemented it and checked the results afterwards. Delaying the change to collect more information would have been unnecessary and wasteful.
Make a decision to collect more data, experiment, or implement the change. My rule of thumb is to look for a factor of 10. If the change costs $1MM, then leadership should not be afraid to spend $100,000 on analysis. If the change costs $1,000, then even a small amount of analysis may exceed any potential loss or gain from the change. I once proposed a $100,000 simulation to a company that was implementing a new $10MM production line. They turned me down, citing the large cost. With ten new employees needed to run the line, a reduction of one person could easily have paid for the project. Additionally, one small misstep could add an extra person, reducing the profitability of the production line for years. It was short-sighted of the company’s management. Some managers only see the accounting costs and miss the economic value that thoughtful analysis can produce. Savings from analysis are less tangible, but in the long run they become real costs and benefits. Short-sighted leaders often miss this to their own detriment (not that I have any hard feelings!).
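For readers who like to see the heuristic spelled out, here is a minimal sketch in Python. The function name and the one-tenth cap are my own illustration of the rule of thumb, not a formal method:

```python
def max_analysis_budget(change_cost: float) -> float:
    """Rule-of-thumb cap on analysis spend: roughly one tenth
    of the cost (or potential value) of the proposed change."""
    return change_cost / 10

# A $1MM change can justify up to $100,000 of analysis;
# a $1,000 change justifies almost none.
print(max_analysis_budget(1_000_000))  # 100000.0
print(max_analysis_budget(1_000))      # 100.0
```

The point is not the arithmetic, which is trivial, but the discipline: size the analysis effort against the stakes before you start.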
And sometimes it is appropriate to just implement a change. Too much analysis is a problem too; I have found that people use the request for more data or an ROI analysis as a tactic to avoid simply trying out a new procedure, tool, or technique. Sometimes it is better to just make the decision, perform a small experiment, and check the results afterwards.
This leads me to my final point. Leadership is often about making decisions in the face of uncertainty. At one point in my journey as a Black Belt I thought it might be possible to turn every decision into an algorithm that provides consistent and correct answers. I later learned that, as my marketing friend pointed out, just because we can create a model doesn’t mean the model has taught us anything. Practical application has far more variables than we can stuff inside a mathematical model. Every statistical model contains an error term, and every model output comes with a confidence interval and its associated probability. In simple terms, data will not tell us everything. If we focus on the learning, then we can assess the value of performing the analysis. Before doing the analysis or making the change, ask: what do we hope to learn?
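As an aside, the uncertainty is visible in the notation itself. Even the simplest linear regression is written

$$y = \beta_0 + \beta_1 x + \varepsilon,$$

where the error term $\varepsilon$ stands for the variation the model cannot explain, no matter how carefully it is fit.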
The more learning potential, the more value there is in the analysis and the more time we can put into it. The less learning potential, the less time we should spend. What we hope to learn is the most effective perspective to take when determining what should be measured.