“The central concern of administrative theory is with the boundary between rational and nonrational aspects of human social behavior.” — Herbert Simon, Administrative Behavior
Analytics typically involves sifting through mountains of data – that often send confusing or seemingly conflicting signals – in search of nuggets of insight that can be used to make better decisions. Though irksome, initially confusing analytical situations are often the ones that reveal deeper truths, valuable predictive patterns or competitive insights. Perplexity often breeds discovery; surface inconsistency can yield to deeper consistency; and challenge can unseat convention. Uncovering the realities that lie behind the data is what business analytics is all about. Precisely because they are hidden to the casual observer, they lend competitive advantages to the organizations that discover and implement them in business first. Ironically, something like this very process is needed to make sense of the mountains of literature that have accumulated around the field of business analytics itself.
Writings in the business and popular press, blog postings, and conversations with executives reflect many seemingly contradictory ideas. While it is often touted as one of today’s most important business—and even cultural—trends, analytics has been around in one form or another for many decades and, in a few cases, for well over a century. Depending on whom you speak with, analytics projects are technology projects, actuarial projects, applications of the latest developments in machine learning, exercises in applied statistics and/or operations research, or nothing more than Business Intelligence (BI) clothed in fresh marketing or journalistic spin. Even the generic nature of the newly fashionable term “analytics” itself might invite skepticism: After all, techniques that claim to apply to everything often end up solving nothing.
Yet there are important messages that should not get lost in the hype. While many subdisciplines and core techniques of analytics are not new, the competitive and operational necessity of analytics is. Hundreds of terabytes of information are now produced each day in forms as diverse as unstructured text, transactional records, Internet clicks, digital multimedia, and RFID and geospatial GPS signals. Analytically adept organizations are able to use all of this data to make refined operational decisions more economically, objectively, consistently and in greater quantities than ever before.1 Organizations not so adept are at risk of drowning in all of this data and falling behind competitively. Today’s business landscape has therefore changed in ways that put analytics “have-nots” at a substantial disadvantage relative to analytics “haves.”
But to form an effective strategy around analytics, it is first necessary to cut through the chatter surrounding the term, develop a clear understanding of what analytics is fundamentally about, and then structure and prioritize analytics strategies in a way appropriate to one’s larger business objectives and competitive situation. The use of information technology to convert raw data into actionable information is an important part of the story. But as we will see, it is only a part: Data and information technology account for only two dimensions of what is ultimately a three-dimensional topic. The third and crucially important dimension of analytics is the human and social one.
In Competing on Analytics, Tom Davenport and Jeanne Harris help clarify the multifaceted concept of analytics. According to them, analytics is: “the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions.”2 This definition is excellent. Still, the sudden ubiquity and seemingly airy generality of “analytics” might distract from the subject’s deep roots and the technical and managerial sophistication needed to master it.
Davenport and Harris correctly point out that using data to build statistical models is central to analytics. And indeed one of the earliest instances of business analytics was performed by an important figure in the history of statistics. The 18th-century mathematician and Unitarian minister Richard Price is best known for his early role in promulgating what is today known as Bayesian statistics, the science of using probabilities to quantify uncertain knowledge in the light of data. Less often recognized is the fact that Price applied his quantitative expertise by working as a consultant to the Equitable Life Assurance Society in London.3 By analyzing historical mortality patterns, Price was able to appropriately link the society’s insurance premiums to the age of the insured, thereby ensuring that it could adequately meet its future commitments. As a result, the Equitable flourished during the succeeding decades while a string of competitors came and went due to inadequate premium and loss reserves.
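Price’s core calculation can be sketched in a few lines. The mortality rates below are invented for illustration (they are not the table Price actually used), and the premium formula ignores interest and expenses; it simply grosses up the expected claim cost by a safety loading:

```python
# Hypothetical sketch of Price's idea: link premiums to the age of the insured.
# These mortality rates are illustrative inventions, not historical figures.
HYPOTHETICAL_MORTALITY = {30: 0.008, 40: 0.012, 50: 0.021, 60: 0.038}

def fair_annual_premium(age, benefit, loading=0.10):
    """One-year term premium: expected claim cost plus a safety loading."""
    q = HYPOTHETICAL_MORTALITY[age]      # probability of death within the year
    return benefit * q * (1 + loading)   # expected payout, grossed up

for age in sorted(HYPOTHETICAL_MORTALITY):
    print(age, round(fair_annual_premium(age, benefit=1000), 2))
```

The point of the sketch is Price’s insight itself: because the expected claim rises with age, a premium that ignores age systematically undercharges older lives and overcharges younger ones.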
We are not historians, but Price’s work—performed a quarter of a millennium ago—gets our vote as the first example of business analytics in action. Not coincidentally, it is also an early example of actuarial science. Price trained his nephew William Morgan, who went on to become the Equitable’s actuary and who is today regarded as the founding father of the actuarial profession.
A similar story comes from Victorian Britain. Charles Babbage is best known as the inventor, along with his associate Lady Lovelace, of the earliest prototype of the modern computer. (Because Babbage and Lovelace predated Edison, their computer was powered by a hand crank.) A man of protean talents, Babbage was also an early pioneer in operations research. By analyzing postal data, Babbage demonstrated that the administrative cost of varying the price of mail delivery by distance traveled exceeded the cost of transporting the mail itself. The post office acted upon Babbage’s analysis and became more efficient by charging a flat rate based on weight. The so-called “penny post” idea was subsequently adopted around the world and persists even today.4
While the term “analytics” is a new entry in the business vernacular, the subject itself is certainly not. So what accounts for its sudden ubiquity? The short answer is information technology. Computing power and data storage capacity have expanded at an exponential pace, and this trend seems unlikely to abate anytime soon.5
Several consequences of Moore’s Law collectively account for the newfound ubiquity of analytics. Herbert Simon anticipated perhaps the most important of them as early as 1971:
In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.6
Simon’s foresight is today’s reality: Analytics is increasingly regarded as a necessity to focus decision-makers’ attention—a scarce resource—on the predictive insights hidden in the depths of oceans of data.
Advances in information technology have therefore dramatically magnified both the practical availability and business necessity of data and analytics tools and methods. As a result, novel applications of analytics are taking root at an impressive clip. But it is important to remember that the core ideas and methods of modern analytics are rooted in fundamental statistical and optimization ideas that have been in wide use for decades.
Information technology is also crucial to analytics for a second reason: The best algorithm in the world provides no value sitting on a shelf. It must be implemented. A common goal of analytics projects is to create predictive models or other types of algorithms intended to improve key business processes and/or help human experts make more effective decisions. In practice this means that once the algorithm has been developed and validated by a team of analysts, it must be implemented in the organization’s information systems and used to automatically generate business rules, recommendations or messages tailored to individual cases. Without an effective IT plan in place, the ROI of such an analytics project is likely to be negative.
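What “implementation” means in practice can be sketched with a hypothetical churn model embedded in a customer-facing system. Every name, weight and threshold below is invented; the point is the separation between the validated scoring model and the business-rule layer that turns each score into an automatic, per-case action:

```python
# Hypothetical sketch: a validated predictive model deployed in an operational
# system, generating a tailored recommendation for each incoming case.
def churn_score(customer):
    """Stand-in for a validated predictive model (coefficients are invented)."""
    complaint_risk = 0.4 * customer["complaints"] / 10
    tenure_risk = 0.6 * (1 - customer["tenure_years"] / 20)
    return complaint_risk + tenure_risk

def recommend(customer):
    """Business-rule layer: translate a model score into an action."""
    score = churn_score(customer)
    if score > 0.7:
        return "route to retention specialist"
    if score > 0.4:
        return "send discount offer"
    return "no action"

print(recommend({"complaints": 8, "tenure_years": 1}))
```

Without this second, unglamorous layer wired into the organization’s systems, the model never touches a real decision, which is exactly why the ROI of an analytics project with no IT plan is likely to be negative.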
Because of its substantial IT component, people sometimes mistake analytics for a species of IT or software implementation project. But this confuses the delivery vehicle with what is being delivered. Predictive models, collaborative filtering algorithms, business process optimizations, pricing solutions and analytically driven collections of business rules are generally not off-the-shelf software products. Rather, they are developed by data scientists with expertise in fields such as statistics, operations research, computer science and machine learning, linguistics, actuarial science, marketing science and psychology. The general term “analytics” is such helpful shorthand precisely because of the huge variety of methods and applications that it encompasses. In short, while IT is indispensable to analytics, analytics projects should not be conceived as IT projects.
All of this background still doesn’t explain one of the most notable aspects of analytics: its impressive range of applicability. In the past decade, analytical methods have been adopted in fields as disparate as consumer business, human resources and talent management, insurance, banking, entertainment, professional sports, medicine, education, telecom, automotive, manufacturing and government. Consider these cases, culled from the popular and academic literature as well as our own project experience.
This list could easily go on for pages.12 But the diversity represented even in this short list should be enough to drive home the point that analytics is the ultimate transferable skill. In a striking array of domains, intelligent applications of analytics have provided organizations with enormous opportunities for efficiency gains, rationalized operational decision-making processes, expense containment and profitable growth. Borrowing an image from philosopher Daniel Dennett, analytics can usefully be thought of as a sort of “universal acid” in that it has the potential to eat through a virtually unlimited variety of decision and operational problems.13 This is because analytics is the science of better decision-making; and decision-making is the heart of business.14
Albert Einstein famously wrote that “the whole of science is a refinement of everyday thinking.”15 This insight contains the key to understanding the broad applicability of analytics and is an excellent thought for leaders to keep in mind as they guide their organizations in cultivating analytic competencies. To paraphrase Einstein, analytic initiatives are systemic refinements of an organization’s core operational decision-making processes.
To illustrate, consider the case of a human resources manager who is faced with the task of making offers to a small subset of a large group of applicants. Perhaps they are recent graduates applying for internships at a bank or applicants for flight attendant positions with an airline. How does the hiring manager go about choosing? In essence he combines the available information (from sources like resumes, interviews, references and perhaps even external sources like professional networking Web sites) about each candidate in a way that enables him to compare each candidate’s likely success on the job. How does the HR manager decide which factors to consider and how to combine them? Ideally, he generalizes from his prior experience as well as that of his colleagues and mentors. These professionals have perhaps made hundreds of hiring decisions and, ideally, know which factors are more or less important and by how much. Put another way, they generalize from examples: They use inductive reasoning.
Predictive modeling is a refinement of the very human process of inductive reasoning. It is not necessary for business leaders to understand the finer points of multivariate regression, classification and regression trees, artificial neural networks or support vector machines. It is important, however, for them to understand that these are all tools that “learn” from large databases of cases to arrive at general conclusions, in a way analogous to how the HR manager in our story learned from his and his colleagues’ experience.
Our story could easily be retold, changing the protagonist to a marketing professional, an insurance claims adjuster, a fraud investigator, an emergency room doctor, a movie studio executive selecting scripts, a psychologist, a social services case manager, a retail store executive or a university admissions officer going about their daily decision-making business. When such professionals use prior experience to arrive at decisions, they informally “build predictive models in their heads.” Analytics uses large databases containing both traditional and novel sources of information to formalize, improve and scale up this process. Business analytics is therefore widely applicable for precisely the same reason that everyday inductive reasoning is widely applicable. The former is a scientific refinement of the latter.
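The formalization described above can be sketched in miniature. The hiring records and the two features below are invented toy data; a real predictive model would learn from thousands of cases with proper statistical estimation rather than the simple averaged frequencies used here:

```python
# Toy illustration (invented data): formalizing the HR manager's inductive
# reasoning by estimating success rates from past hiring outcomes.
history = [
    {"internship": True,  "referred": True,  "succeeded": True},
    {"internship": True,  "referred": False, "succeeded": True},
    {"internship": False, "referred": True,  "succeeded": True},
    {"internship": False, "referred": False, "succeeded": False},
    {"internship": True,  "referred": True,  "succeeded": True},
    {"internship": False, "referred": False, "succeeded": False},
]

def success_rate(feature, value):
    """Empirical P(success | feature == value), learned from past cases."""
    matches = [r for r in history if r[feature] == value]
    return sum(r["succeeded"] for r in matches) / len(matches)

def score(candidate):
    """Crude predictive score: average the per-feature success rates."""
    rates = [success_rate(f, v) for f, v in candidate.items()]
    return sum(rates) / len(rates)

print(score({"internship": True, "referred": True}))
```

The manager in our story performs something like this tally informally and on a handful of remembered cases; analytics performs it explicitly, consistently and at scale.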
So far we have discussed predictive modeling in relation to Herbert Simon’s picture of a business decision-maker who makes “satisficing” rather than optimal decisions due to his or her bounded rationality. Simon pointed out that real-life decision-makers are not the idealized rational calculators—homo economicus—from classical economics.16 Predictive models, operations research solutions and analytically motivated business rules are therefore needed to serve as decision-making prostheses. Just as eyeglasses help correct myopia, predictive models can help correct bounded cognition.
But it turns out that actual decision-makers fall short of even Simon’s more realistic picture of satisficers making complex decisions in uncertain environments. In their recent book Nudge, the behavioral economists Richard Thaler and Cass Sunstein likened homo economicus to Mr. Spock from Star Trek. According to Thaler and Sunstein, not only do real-life decision-makers fall short of this ideal, they in fact have more in common with the cartoon everyman, Homer Simpson!17 Thaler, the Nobel laureate Daniel Kahneman and the late Amos Tversky are considered the founding figures of behavioral economics, a discipline that describes how human decision-makers predictably deviate from the rational ideal.
Dozens of surprising, yet utterly predictable, decision-making biases have been discovered. For example, the anchoring phenomenon teaches us that people’s estimates of an unknown quantity can be affected in often comical ways by arbitrary reference points. If a group of people are first asked to add 200 to the last three digits of their phone numbers and then asked when Attila the Hun invaded Europe, their answers are correlated with their phone numbers.18 Loss aversion is the well-documented fact that the pleasure of gaining an item is less intense than the pain of giving it up. In Predictably Irrational, Dan Ariely illustrated this by studying a group of basketball fans who had won tickets to a Duke Blue Devils game in a random lottery. Ariely found that the winners were willing to part with their tickets for an average of $2,400, while the losers were willing to pay only $175 on average. Not a single ticket changed hands.19 Many such cognitive biases have been repeatedly demonstrated in easily replicated experiments.
The fact that human cognition is not merely bounded but predictably biased has an important strategic implication: These shared biases result in inefficient markets. Decision-making anchored in the analysis of data rather than (biased) intuitions can therefore be used to profit from such markets. Thaler and Sunstein make this point in a discussion of Michael Lewis’ Moneyball:
Why do professional baseball executives, many of whom have spent their lives in the game, make so many colossal mistakes? They are paid well, and they are specialists. They have every incentive to evaluate talent correctly. So why do they blunder? In an intriguing passage, Lewis offers three clues. First, those who played the game seem to overgeneralize from personal experience: “People always thought their own experience was typical when it wasn’t.” Second, the professionals were unduly affected by how a player had performed most recently, even though recent performance is not always a good guide. Third, people were biased by what they saw, or thought they saw, with their own eyes. This is a real problem, because the human mind plays tricks, and because there is “a lot you couldn’t see when you watched a baseball game.”20
Sunstein and Thaler point out that the phenomena Lewis describes are not unique to the culture of baseball scouts; indeed they are central to behavioral economics. Sunstein and Thaler connect Lewis’ discussion to another central finding of behavioral economics: the availability heuristic.
As Daniel Kahneman and Amos Tversky have shown, people often assess the probability of an event by asking whether relevant examples are cognitively “available.” Thus, people are likely to think that more words, on a random page, end with the letters “ing” than have “n” as their next to last letter – even though a moment’s reflection will show that this could not possibly be the case.21
Sunstein and Thaler’s point is not that professionals are foolish or uneducated; it is simply that they are human. Of necessity, they rely on fallible intuitions, mental heuristics and tribal wisdom when processing information to make decisions. The problem is that cognitive biases such as anchoring, the availability heuristic, loss aversion and herd behavior can prevent markets from becoming efficient.
Sunstein and Thaler comment, “Even when the stakes are high, rational behavior does not always emerge. It takes time and effort to switch from simple intuitions to careful assessments of evidence.”22
“Switching from simple intuitions to careful assessments of evidence” is not a bad working definition of business analytics. With its fundamental postulate of the perfectly rational homo economicus, classical economics has long abetted the belief that real-life markets are efficient. The recent discoveries of scientists like Simon, Kahneman, Tversky and Thaler teach us that “it ain’t necessarily so.” The implication is that those organizations early to adopt predictive analytics can profit from market inefficiencies resulting from traditional decision-making practices that are infused with cognitive biases.
There is therefore more to analytics than software, data and models. Successful analytics projects do not begin with data and end with models; rather, they begin with strategy and end with models and analytic insights driving improved operational decision-making and business processes. The implication is that analytics projects require strong executive leadership along a number of avenues.
Because of its inherently technical nature, it is easy to underestimate the importance of the human and social dimensions of analytics. But recall that the true drama of Moneyball centered on culture change. The most challenging part of Billy Beane’s job was not hiring Paul DePodesta or overseeing his data analysis; it was convincing his recalcitrant scouts to trust DePodesta’s analysis rather than their unaided intuitions about which players to recruit.
Two other principles of behavioral economics are relevant in this connection: Status quo bias (the tendency to stick with the current situation even when better ones are available) and optimism bias (the tendency to be overly optimistic about one’s own abilities and the outcomes of one’s actions). Such cognitive biases—together with garden-variety turf wars and organizational politics—are major reasons why organizations’ traditional, intuitive decision-making cultures often resist analytics initiatives. Executives should not let their own “optimism bias” lead them to underestimate such risks. Generally speaking, a vigorous organizational acceptance of analytics does not just happen. Well-conceived strategies must be properly communicated, incentivized and executed in order for cultures of evidence-based decision-making to take root.
As computing power and data sets have increased exponentially, so has the embrace of analytic methods in business, government, medicine, education, entertainment and beyond. The revolution is here, and it has been a long time coming. As discussed above, analytics first took root in the insurance industry for a very good reason. Insurance is the rare business in which the producer does not know the cost of production at the time of sale. Two seemingly identical risks can in fact carry very different expected costs. Without actuarial science and advanced analytics, insurers can easily fall prey to adverse selection: an information asymmetry in which policyholders and competitors possess a superior understanding of the drivers of insurance claims.
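The adverse-selection dynamic can be made concrete with a stylized simulation. All figures are invented: a “naive” insurer charges everyone the population-average rate, a “savvy” competitor prices each risk individually, and customers simply take the cheaper quote:

```python
# Stylized simulation (all numbers invented) of adverse selection: the insurer
# that charges a single average rate loses its good risks to a risk-based
# competitor and is left covering the bad risks at a loss.
population = [{"expected_loss": 100}] * 50 + [{"expected_loss": 500}] * 50

average_premium = sum(p["expected_loss"] for p in population) / len(population)

def risk_based_premium(person, margin=0.05):
    """Savvy competitor: price each risk at its expected loss plus a margin."""
    return person["expected_loss"] * (1 + margin)

naive_book, savvy_book = [], []
for person in population:
    # Each customer buys from whichever insurer quotes the lower premium.
    if risk_based_premium(person) < average_premium:
        savvy_book.append(person)   # low risks: 105 beats the average 300
    else:
        naive_book.append(person)   # high risks: 525 loses to the average 300

# Naive insurer's expected underwriting result: premiums minus expected losses.
naive_result = average_premium * len(naive_book) - sum(
    p["expected_loss"] for p in naive_book
)
print(naive_result)
```

In this toy market the savvy insurer skims off all fifty low risks, and the naive insurer, now covering only high risks at the old average rate, runs a large expected loss: exactly the “skim the cream and select against them” dynamic the text describes.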
Two centuries after Richard Price consulted for the Equitable, the world is a different place. The business landscape is rapidly changing thanks to big computing, big data and correspondingly robust machine learning and optimization techniques. Forward-thinking organizations that are the first to embrace analytics can change the competitive dynamic by executing their core competencies and strategies better than their competitors.
What is an opportunity for analytically sophisticated organizations is simultaneously a threat to their competitors. The nature of insurance is such that analytically impaired insurance companies are not long for this world: Analytically sophisticated competitors will use their deeper understanding of risk factors to skim the cream and select against them. Progressive Insurance’s use of personal credit scores to more accurately price auto insurance is a recent chapter in this decades-old story.24 Stories such as Billy Beane exploiting an inefficient market for talent and Netflix using analytics to challenge traditional retail business models illustrate that this data-driven phenomenon is no longer specific to insurance. Because data is now everywhere, so is analytics. This fundamentally changes the competitive landscape in a way that makes analytics central to competitive strategy.
One therefore does not need a predictive model to foresee that ever more organizations in ever more sectors will realize that analytics must be as essential to their core operations as it has always been to insurers. Paraphrasing Milton Friedman, who was speaking in a very different context: “we are all actuaries now.”