Evaluation and Control Program Warning Flag 3 – Using Data as Information
Too often, individuals align themselves with a particular statistic or data point as though it infallibly supported their position. In these instances, raw data is assigned meaning without context from the surrounding environment and possibly in spite of flaws and biases in its collection. While assigning meaning to a particular data point may serve one's immediate purpose, it often leads to erroneous conclusions and may result in undesirable outcomes.[wcm_restrict plans=”41305, 25542, 25653″]
For data to be of value in decision-making or in driving behaviors, it must be properly interpreted in the context of other supporting and contradicting data sets and of the quantitative and qualitative factors in the surrounding environment. Only when data is combined with these associated factors does it truly become information that can be appropriately used.
“Say you were standing with one foot in the oven and one foot in an ice bucket. According to the percentage people, you should be perfectly comfortable.”
Bobby Bragan
Example errors in assigning meaning to a single or few raw data points include:
- Drawing conclusions from survey results without considering both the question asked and responder demographics
- Identifying trends based on the data from one performance indicator without the context of influencing operational factors and counter-balancing indicators (see StrategyDriven articles, Organizational Performance Measures Best Practice – Diverse Indicators and Diverse Metric Groupings)
- Relying on averages particularly when derived from or highly influenced by extreme data points
- Making time-based performance evaluations without considering when data is counted, particularly near transition events such as shift turnovers, days, months, quarters, years, etcetera
- Committing logic fallacies when interpreting a single or limited number of data points (see StrategyDriven article, Decision-Making Warning Flag – Logic Fallacies Introduction)
- Presenting data as possessing unsubstantiated accuracy, often the result of manipulations using spreadsheets and scientific calculators (see StrategyDriven article, Evaluation and Control Program Warning Flag – The Illusion of Accuracy)
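To illustrate the averages pitfall above, the following sketch (standard library only; the salary figures are hypothetical, invented for this example) shows how a single extreme data point can pull the mean toward a value no one in the group actually experiences, while the median remains representative:

```python
import statistics

# Hypothetical annual salaries in thousands: nine staff plus one executive outlier
salaries = [52, 54, 55, 56, 58, 60, 61, 63, 65, 400]

mean_salary = statistics.mean(salaries)      # pulled upward by the 400 outlier
median_salary = statistics.median(salaries)  # robust to the extreme value

print(f"mean:   {mean_salary:.1f}")   # 92.4 -- a 'typical' salary no one earns
print(f"median: {median_salary:.1f}") # 59.0 -- close to what most staff earn
```

A leader presented only with the mean would conclude typical pay is roughly 50 percent higher than it actually is; reporting the median alongside the mean, or flagging the outlier explicitly, surfaces the distortion.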
Relying on an individual data point or statistic frequently leads to erroneous conclusions and poor decisions. Business leaders and professionals must exhibit a skeptical, questioning attitude when presented with conclusions and recommendations so as not to fall into this trap. While not all-inclusive, the four lists below (Process-Based Warning Flags; Process Execution Warning Flags – Behaviors; Potential, Observable Results; and Potential Causes) are designed to help organization leaders and individual contributors recognize whether they adequately process and challenge received data to ensure it has been properly contextualized. Only after a problem is recognized and its causes identified can the needed action be taken to move the organization toward improved performance.
Process-Based Warning Flags
- Data analysis processes do not provide guidance for data synthesis (see StrategyDriven article, Evaluation and Control Program Best Practice – Data Synthesis)
- Data evaluation processes do not require the use of multiple inputs (see StrategyDriven article, Organizational Performance Measures Best Practice – Diverse Indicators)
- Data analysis processes do not engage local staff for contextualization (see StrategyDriven article, Business Performance Assessment Program Best Practice – Seek Local Participation for Context)
- Data evaluation processes do not engage multidiscipline teams for data interpretation (see StrategyDriven articles, Decision-Making Best Practice – Multidiscipline Teams and Business Performance Assessment Program Best Practice – Multidiscipline Teams)
Process Execution Warning Flags – Behaviors
- Executives, managers, and/or individual contributors accept data and conclusions presented to them without question
- Organization members exhibit a lack of a questioning attitude when observing circumstances that differ from presented data, statistics, and/or conclusions
- Employees at all levels of the organization stop searching for other, particularly contrary, information once a data point or statistic is found that confirms their desired conclusion
- Executives, managers, and/or individual contributors frequently omit or cannot cite references for data or statistics supporting their conclusions
Potential, Observable Results
- Frequent events causing diminished productivity, higher costs, and elevated attrition for which precursor indicators existed but, in retrospect, had little visibility
- Organization becomes ‘blindsided’ by circumstances assumed not to exist or not considered during the decision-making process
- High or increasing decision failure rate
Potential Causes
- Executives, managers, and/or individual contributors feel it would be insulting to a presenter to challenge their conclusions
- Organization members erroneously trust data, statistics, and conclusions because of the individual presenting them (the corollary of the ad hominem logic error)
- Organizational leaders foster a workplace environment that does not challenge others, particularly seniors and peers
- Organization members rely on questionable sources of data or statistics to support their conclusions
- Employees are not trained in the proper qualification, verification, and validation of data sources (see StrategyDriven article, Human Performance Management Best Practice – Qualify, Verify, and Validate)
Final Thoughts…
Leaders should demand information, not data, during the decision-making process. Such a demand comes in the form of questions asked to challenge and explore the meaning behind the data presented, and it should address the data's relationships and quality. It is not so important for a leader to have all of the answers as it is to ask the right questions challenging the conclusions presented. (see StrategyDriven article, Business Performance Assessment Program Best Practice – Three Whys Deep)
Robust data interpretation requires multidimensional analysis of the data based on its interrelationships with other data sets. Experience should also be applied to the interpretation of data so as to identify otherwise not readily apparent relationships, causes, contributors, and insights. (see StrategyDriven article, Evaluation and Control Program Best Practice – Identify Data Relationships)[/wcm_restrict][wcm_nonmember plans=”41305, 25542, 25653″]
[/wcm_nonmember]
About the Author
Nathan Ives is a StrategyDriven Principal and Host of the StrategyDriven Podcast. For over twenty years, he has served as a trusted advisor to executives and managers at dozens of Fortune 500 and smaller companies in the areas of management effectiveness, organizational development, and process improvement. To read Nathan’s complete biography, click here.