KPI Library - misLeading Indicators: how to reliably measure your business
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness.rss
Recent topics from "misLeading Indicators: how to reliably measure your business" at KPI Library
en
© 2021 ServiceNow. All rights reserved.
Mon, 17 May 2021 12:49:07 +0000

phil@greenbridge.com (Phil Green)
Eat oats!
<p>Official inflation numbers are very low and do not correspond with the reality people face when they shop. The US Bureau of Labor Statistics reports inflation at 1.1%. This <a href="http://www.financialsense.com/contributors/jakeweber/chartoftheweekinflationintherealworld">post</a> shows year-over-year increases for basic staples such as wheat (74%), beef (18%) and coffee (27%). I have occasionally been recording food prices to do my own inflation calculations, using prices at <a href="http://www.grocerygateway.com/Default.aspx">Grocery Gateway</a>. The chart below shows the annualized change in basic food prices based on total changes from January 2008 to today. A one-litre carton of 2% milk went from $1.18 on January 16, 2010 to the listed price of $2.69 today. The only food item on my list that did not similarly increase was a one-kilogram bag of oat cereal.</p>
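The chart's annualized figures can be reproduced with a simple geometric-growth calculation. A minimal sketch (the function name and the round-number example are mine, not taken from the chart):

```python
def annualized_change(old_price, new_price, years):
    """Geometric annualized rate of change between two prices."""
    return (new_price / old_price) ** (1.0 / years) - 1.0

# A 21% total rise over two years is 10% per year compounded,
# not 10.5%: the geometric mean, not the arithmetic one.
rate = annualized_change(100.0, 121.0, 2.0)
print(f"{rate:.1%}")  # → 10.0%
```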
<p> <a href="http://misleadingindicators.com/wpcontent/uploads/2010/10/food_Items1.jpg"><img title="food_Items" src="http://misleadingindicators.com/wpcontent/uploads/2010/10/food_Items1.jpg" alt="Annualized change in basic food prices" width="421" height="333"></a></p>
<p>For more posts on this see <a title="FP inflation" href="http://network.nationalpost.com/NP/blogs/fpcomment/archive/2010/04/28/hidinginflation.aspx">here</a> and <a title="blog inflation" href="http://misleadingindicators.com/?p=81">here</a>.</p>
13 Jun 2013
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/eatoats
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/eatoats

phil@greenbridge.com (Phil Green)
Reinhart and Rogoff story shows how important it is to check your indicators
<p>This month two academic researchers <a href="http://www.nextnewdeal.net/rortybomb/researchersfinallyreplicatedreinhartrogoffandthereareseriousproblems">showed</a> that there were errors in the calculations behind the <a href="http://www.nber.org/papers/w15639.pdf">claim</a> by economists Carmen Reinhart and Kenneth Rogoff that the relationship between government debt and real GDP growth is weak for debt/GDP ratios below a threshold of 90 percent of GDP. Above 90 percent, median growth rates fall by one percent, and average growth falls considerably more.</p>
<p>Reinhart and Rogoff’s finding has been cited <a href="http://www.reinhartandrogoff.com/relatedresearch/growthinatimeofdebtfeaturedin">widely</a> during the economic crisis of the last few years. Turns out there were some simple spreadsheet errors.</p>
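To illustrate how small a spreadsheet slip can be while still flipping a conclusion, here is a hedged sketch: the growth figures below are invented, and the slip shown (an averaging range that silently drops the first rows) only stands in for the actual formula error found in the Reinhart-Rogoff spreadsheet.

```python
# Invented per-country growth figures, as they might sit in a
# spreadsheet column. The real Reinhart-Rogoff data were different.
growth_by_country = [-2.0, 0.5, 1.0, 1.5, 2.5, 3.5, 4.0]

# Correct average over all rows.
full_mean = sum(growth_by_country) / len(growth_by_country)

# An averaging range that accidentally skips the first two rows,
# analogous to a mis-dragged formula range.
truncated = growth_by_country[2:]
truncated_mean = sum(truncated) / len(truncated)

print(round(full_mean, 2), round(truncated_mean, 2))  # → 1.57 2.5
```

Two dropped rows turn a mean of about 1.6% into 2.5%, with no warning from the spreadsheet.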
<p>No matter how many degrees you have or what prestigious institution you work for, errors in indicators can still happen, and they are common. Rogoff works at Harvard University and Reinhart at the University of Maryland. Their paper was published by the National Bureau of Economic Research.</p>
<p>Something similar happened when scientists were trying to measure ozone concentrations over Antarctica in the 1980s. NASA scientists had programmed their satellite to flag measurements under 180 Dobson units (a measure of ozone) as possible errors, and they originally excluded them. Meanwhile, scientists at the Amundsen-Scott ground station at the South Pole were reporting measurements of 300 Dobson units. The NASA scientists figured their satellite was reading incorrectly. In fact it was the ground station that was making the measurement error. These problems combined to delay the discovery of the ozone hole by about eight years.</p>
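The flagging logic at the heart of the NASA episode can be sketched in a few lines. The readings below are invented round numbers, not real satellite data; the point is that a fixed validity threshold discards exactly the anomaly that matters:

```python
FLAG_THRESHOLD = 180  # Dobson units; values below this are assumed bad

# Mix of normal springtime readings and genuine ozone-hole values.
readings = [310, 295, 150, 140, 305, 120]

accepted = [r for r in readings if r >= FLAG_THRESHOLD]
flagged = [r for r in readings if r < FLAG_THRESHOLD]

# The low readings, the actual ozone hole, never reach the analysts.
print(accepted)  # → [310, 295, 305]
print(flagged)   # → [150, 140, 120]
```

A validity flag encodes an assumption about what measurements are possible; when reality steps outside that assumption, the flag hides the evidence.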
15 May 2013
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/reinhartandrogoffstoryshowshowimportantitistocheckyourindicators
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/reinhartandrogoffstoryshowshowimportantitistocheckyourindicators

phil@greenbridge.com (Phil Green)
How statistical process control charts mislead
<p>The statistical process control chart is designed to help differentiate between what are usually called “common” causes of variation and “special” causes. But this is very misleading.</p>
<p>Typically, six-sigma consultants and various authors advise you to ignore data between control limits because it is “random.” They say to make process adjustments only when the process falls outside the control limits. In most cases, the premise behind this advice is false.</p>
<p>The justification given goes as follows:</p>
<blockquote><p>Consider all possible measurements that we could get from this process over the long term (whatever that is). The control limits show how far we would expect 99% of the measurements to wander from the mean of the process if the process does not change. Ignore what you think you have learned over the years about what causes the process to wiggle within the control limits because it is just random variation (whatever that is).</p></blockquote>
<p>This explanation is irrelevant. It gives the same weight to future measurements—that have not yet occurred—as to the current measurement. It requires you to throw away information you have learned over a long time. And it is not what you need to know. What you need to know is:</p>
<blockquote><p>The probability that the process has changed, given the measurement you just took, and given the background information about the process.</p></blockquote>
<p>The statistical process control and six-sigma dogma tells you instead:</p>
<blockquote><p>The probability that future measurements will fall between the control limits if the process does not change and if you ignore your background information about the process.</p></blockquote>
<p>These are not at all the same probabilities. They can be very different, depending on circumstances. Proponents of statistical process control treat them as if they were the same. Therein lies the misleading indicator.</p>
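The gap between the two probabilities is easy to see with Bayes' theorem. A minimal numeric sketch, with a prior and likelihoods invented purely for illustration:

```python
# P(process changed | measurement), contrasted with the tail
# probability a control chart quotes. All numbers are invented.
p_change = 0.05            # prior: how often this process actually shifts
p_obs_if_stable = 0.01     # chance of a near-limit point if nothing changed
p_obs_if_changed = 0.30    # chance of such a point if the process shifted

# Bayes' theorem: posterior probability the process has changed,
# given the point you just measured and your background knowledge.
posterior = (p_obs_if_changed * p_change) / (
    p_obs_if_changed * p_change + p_obs_if_stable * (1 - p_change)
)

print(round(posterior, 3))  # → 0.612
```

The chart quotes a 1% tail probability; the question that matters yields roughly 61% under these assumptions. The prior and the background knowledge, which the control-chart recipe tells you to ignore, drive the answer.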
<p>More to come….</p>
29 Apr 2013
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/howstatisticalprocesscontrolchartsmislead
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/howstatisticalprocesscontrolchartsmislead

phil@greenbridge.com (Phil Green)
Top ten reasons for reading misLeading Indicators: How to Reliably Measure your Business
<p>There are many books that tell you what to measure to succeed in a business strategy or to improve business performance. Ideas on what to measure may indeed be very useful. But these ideas won’t help you, and may even be harmful, if you do not know whether what you are measuring is misleading you.</p><p><a href="http://misleadingindicators.com/wpcontent/uploads/2013/03/coverA3464C2.jpg"><img title="A3464C_Green.indd" src="http://misleadingindicators.com/wpcontent/uploads/2013/03/coverA3464C2198x300.jpg" alt="" width="198" height="300"></a></p><p><strong><em>misLeading Indicators</em></strong> reveals the hidden and potentially misleading nature of indicators that can make or break a business.</p><p>Here are our top ten reasons you should read <em>misLeading Indicators</em>:</p><ol>
<li>It will provide you with four clear principles for determining which indicators and measurements can be trusted, and which mislead; it illustrates these principles with many indicators from across a wide spectrum of businesses and functions.</li>
<li>It shows how common measurement clichés (e.g. “you can’t manage what you can’t measure”) and metaphors (e.g. automobile and airplane dashboards) can lead you to make misleading interpretations of indicators.</li>
<li>It will show you how to determine whether indicators based on counts are reliable (e.g. inventory, opinion polling).</li>
<li>It will show you how to determine whether indicators based on instrument measurements are accurate and precise (e.g. process measurements, temperature).</li>
<li>It will show you how to determine whether rankings and ratings are reliable (e.g. customer satisfaction ratings, audit scores).</li>
<li>It will show you how people, knowingly or unknowingly, manipulate and modify indicators to make them misleading.</li>
<li>It will show you techniques to develop indicators that focus employee efforts, and will show how some indicators, by oversimplifying and glitzing up information displays, misdirect their efforts.</li>
<li>It will show you how common indicators of time series, such as Statistical Process Control charts, are misleadingly explained, justified and interpreted.</li>
<li>It will show you how averages can distort indicators and mislead about the underlying data behind them.</li>
<li>It will show you why probability, and thus risk, cannot be measured, and why attempts to create measures of risk so often lead to spectacular failures such as mine explosions and business failures.</li>
</ol><p>You can buy misLeading Indicators at <a href="http://www.amazon.com/misLeadingIndicatorsReliablyMeasureBusiness/dp/0313395950/ref=sr_1_1?ie=UTF8&qid=1364590251&sr=81&keywords=misleading+indicators+how+to+reliably+measure+your+business">Amazon</a>, <a href="http://www.barnesandnoble.com/w/misleadingindicatorsphilipgreen/1110997969?ean=9780313395956">Barnes and Noble</a>, or direct from the <a href="http://www.abcclio.com/product.aspx?isbn=9780313395956">publisher</a> in paper or electronic formats.</p>
<p> </p>
<p>Philip Green and George Gabor are coauthors of <em>misLeading Indicators: How to Reliably Measure Your Business</em>, published by Praeger. <a href="http://misleadingindicators.com/">www.misleadingindicators.com</a></p>
<p>© 2013 Greenbridge Management Inc.</p>
2 Apr 2013
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/toptenreasonsforreadingmisleadingindicatorshowtoreliablymeasureyourbusiness
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/toptenreasonsforreadingmisleadingindicatorshowtoreliablymeasureyourbusiness

phil@greenbridge.com (Phil Green)
How Western Electric rules mislead in statistical process control
<p>The statistical model behind control charts for in-control processes is based on the assumption of a Gaussian process with no autocorrelation (i.e. independent observations), with a constant mean and constant variance: in other words, a white noise process. The various <a href="http://en.wikipedia.org/wiki/Western_Electric_rules">Western Electric rules</a> try to find patterns that are not white noise, and thus show that the process is out of control.</p>
<p>It is quite easy to do a simple experiment that illustrates the flaw in the Western Electric rules. Using statistical software, generate several columns of Normal “random” numbers and then apply the Western Electric rules. You will see that most of the columns fail the tests, even though they are by definition “white noise.” For example, I generated 10 columns of n=100 with a Normal distribution, and used Minitab to plot I-MR charts and apply all the tests. All 10 columns failed at least one test.</p>
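The experiment is easy to replay without Minitab. A sketch in Python under two simplifications: only two of the Western Electric rules are coded, and the limits use the known mean 0 and sigma 1 rather than estimates from moving ranges as a real I-MR chart would:

```python
import random

def rule_beyond_3sigma(xs):
    """Rule 1: any single point beyond three sigma."""
    return any(abs(x) > 3 for x in xs)

def rule_8_same_side(xs):
    """Rule 4: eight consecutive points on the same side of the centreline."""
    run = 0
    last_side = 0
    for x in xs:
        side = 1 if x > 0 else -1
        run = run + 1 if side == last_side else 1
        last_side = side
        if run >= 8:
            return True
    return False

# Ten "in-control" columns of n=100 standard-normal values.
random.seed(1)
columns = [[random.gauss(0, 1) for _ in range(100)] for _ in range(10)]

failures = sum(
    rule_beyond_3sigma(col) or rule_8_same_side(col) for col in columns
)
print(failures, "of 10 in-control columns fail at least one rule")
```

Adding the remaining rules (2-of-3 beyond two sigma, 4-of-5 beyond one sigma, and so on) only raises the failure count, since each extra test is another chance for white noise to trip an alarm.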
<a href="http://misleadingindicators.com/wpcontent/uploads/2013/02/I_Chart_N01.png"><img title="I_Chart_N(0,1)" src="http://misleadingindicators.com/wpcontent/uploads/2013/02/I_Chart_N01300x202.png" alt="" width="300" height="202"></a><p>Individuals SPC chart on simulated Gaussian data. It failed four of the Western Electric rules.</p>
<p>The Western Electric rules in this experiment conclude that all 10 columns are out-of-control, or “non-random” (whatever that is), even though I generated the data with a so-called random number generator. The rules state (<a href="http://www.quinncurtis.com/SPCNamedRuleSets.htm">see here for example</a>) that the probability of an out-of-control signal from a process we know to be in control is very small. For example, the probability that eight points in a row will be on the same side of the centreline is (supposed to be) 1/256. What went wrong?</p>
<p>Several things went wrong. The probability the rules are based on is the probability that the <em>next</em> seven (or whatever) <em>future </em>points fall into some particular pattern (for example, they are all above the mean). These are points that have not even happened yet. This is rarely acknowledged when people explain the rules.</p>
<p>This (false) probability is not useful to someone controlling a process. What is useful is the probability that the process is out of control, given the measurements you already have (and your knowledge about the process and how it works).</p>
<p>Statistical Process Control and six-sigma promoters turn around and pretend that these probabilities are the same. But the probability that you will find such-and-such a pattern on a control chart, using measurements that you already have, is most definitely not equal to the probability that the pattern will occur in the next few measurements. To calculate the probability that particular points that have already been measured indicate an “out-of-control” process, one would have to use a very different procedure than the one used by the Western Electric rules. Our simulation example illustrates this.</p>
<p>The second thing that went wrong is that there are multiple tests on the same data. This changes the probability that there will be a “false alarm,” in other words, a signal that the process is out of control when it is not.</p>
<p>The third thing that went wrong, or that often goes wrong, is the way people interpret the probability. Take the simple rule that says a process is out of control if a measurement goes outside a three standard deviation control limit. The chance of that happening in the next measurement, for a white noise process, is about 0.27%. On average, if we had 10,000 data points, we would expect about 27 to be outside these limits. In another experiment, I generated 100 columns of n=100 Normal (i.e. Gaussian) data, or 10,000 data points and ran that test. There were 27 instances where a point was outside the limit, as expected. These 27 instances came from 24 different columns. It would be tempting to infer that this means that 24 out of the 100 columns were “out of control.” But that is not what it means at all.</p>
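The multiple-testing effect behind those 24 columns can be computed directly. Assuming independent points and a per-point exceedance probability of 0.27%, the chance that a 100-point column breaches the three-sigma limits at least once is:

```python
# Chance that an in-control column of 100 independent points
# triggers the three-sigma rule at least once.
p_point = 0.0027                      # per-point exceedance probability
p_column = 1 - (1 - p_point) ** 100   # at least one exceedance in the column

print(round(p_column, 3))             # → 0.237
print(round(100 * p_column))          # expected flagged columns of 100, → 24
```

Roughly 24 of 100 in-control columns are expected to show at least one point outside the limits, which matches the simulation, yet none of those columns is out of control.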
<p>Philip Green and George Gabor are coauthors of <em>misLeading Indicators: How to Reliably Measure Your Business</em>, published by Praeger. <a href="http://misleadingindicators.com/">www.misleadingindicators.com</a></p>
<p>© 2013 Greenbridge Management Inc.</p>
4 Mar 2013
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/howwesternelectricrulesmisleadinstatisticalprocesscontrol
https://kpilibrary.com/experts/misleadingindicatorshowtoreliablymeasureyourbusiness/topics/howwesternelectricrulesmisleadinstatisticalprocesscontrol