Cognitive analytics
Cognitive analytics offers a way to bridge the gap between big data and the reality of practical decision making.
Artificial intelligence, machine learning, and natural language processing have moved from experimental concepts to potential business disruptors—harnessing Internet speed, cloud scale, and adaptive mastery of business processes to drive insights that aid real-time decision making. For organizations that want to improve their ability to sense and respond, cognitive analytics can be a powerful way to bridge the gap between the intent of big data and the reality of practical decision making.
For decades, companies have dealt with information in a familiar way—deliberately exploring known data sets to gain insights. Whether by queries, reports, or advanced analytical models, explicit rules have been applied to universes of data to answer questions and guide decision making. The underlying technologies for storage, visualization, statistical modeling, and business intelligence have continued to evolve, and we’re far from reaching the limits of these traditional techniques.
Today, analytical systems that enable better data-driven decisions are at a crossroads with respect to where the work gets done. While they leverage technology for data-handling and number-crunching, the hard work of forming and testing hypotheses, tuning models, and tweaking data structures is still reliant on people. Much of the grunt work is carried out by computers, while much of the thinking is dependent on specific human beings with specific skills and experience that are hard to replace and hard to scale.
For the first time in computing history, it’s possible for machines to learn from experience and penetrate the complexity of data to identify associations. The field is called cognitive analytics™—inspired by how the human brain processes information, draws conclusions, and codifies instincts and experience into learning. Instead of depending on predefined rules and structured queries to uncover answers, cognitive analytics relies on technology systems to generate hypotheses, drawing from a wide variety of potentially relevant information and connections. Possible answers are expressed as recommendations, along with the system’s self-assessed ranking of how confident it is in the accuracy of the response. Unlike in traditional analysis, the more data fed to a machine learning system, the more it can learn, resulting in higher-quality insights.
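The recommend-with-confidence pattern described here can be illustrated in a few lines of code. The sketch below is not any vendor’s engine; it simply trains a small probabilistic text classifier (using scikit-learn, with invented cases and labels) and has it rank candidate answers by its own confidence.

```python
# A minimal sketch of "hypotheses ranked by confidence," not any vendor's engine.
# The training cases, labels, and new case below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy "experience" the system learns from: past cases and their outcomes.
cases = [
    "fever cough fatigue",
    "fever cough loss of smell",
    "sneezing itchy eyes",
    "sneezing runny nose mild",
    "fever body aches chills",
]
labels = ["flu", "respiratory infection", "allergy", "allergy", "flu"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(cases, labels)

# New evidence: the system proposes ranked hypotheses with self-assessed confidence.
new_case = "fever chills cough"
probabilities = model.predict_proba([new_case])[0]
ranked = sorted(zip(model.classes_, probabilities), key=lambda p: p[1], reverse=True)
for hypothesis, confidence in ranked:
    print(f"{hypothesis}: {confidence:.2f}")
```

Feeding such a system more labeled experience sharpens both the hypotheses and the confidence scores, which is the behavior the passage above describes.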
Cognitive analytics can push past the limitations of human cognition, allowing us to process and understand big data in real time, undaunted by exploding volumes of data or wild fluctuations in form, structure, and quality. Context-based hypotheses can be formed by exploring massive numbers of permutations of potential relationships of influence and causality—leading to conclusions unconstrained by organizational biases. In academia, the techniques have been applied to the study of reading, learning, and language development. The Boltzmann machine1 and the Never-Ending Language Learning (NELL)2 projects are popular examples. In the consumer world, pieces of cognitive analytics form the core of artificial personal assistants such as Apple’s Siri® voice recognition software3 and the Google Now service, as well as the backbone for the Xbox® video game system’s verbal command interface in Kinect®.
Even more interesting use cases exist in the commercial realm. Early instances of cognitive analytics can be found in health care, where systems are being used to improve the quality of patient outcomes. A wide range of structured inputs, such as claims records, patient files, and outbreak statistics, are coupled with unstructured inputs such as medical journals and textbooks, clinician notes, and social media feeds. Patient diagnoses can incorporate new medical evidence and individual patient histories, removing economic and geographic constraints that can prevent access to leading medical knowledge.
In financial services, cognitive analytics is being used to advise and execute trading, as well as for advanced fraud detection and risk underwriting. In retail, cognitive systems operate as customer service agents, in-store kiosks, and digital store clerks—providing answers to customers’ questions about products, trends, recommendations, and support. Another promising area for cognitive analytics involves the concept of “tuning” complex global systems such as supply chains and cloud networks.
In practical terms, cognitive analytics is an extension of cognitive computing, which is made up of three main components: machine learning, natural language processing, and advancements in the enabling infrastructure.
Machine learning, including deep learning,4 is an artificial intelligence5 technique modeled after characteristics of the human brain. A machine learning system explores many divergent concepts for possible connections, expresses potential new ideas with relative confidence or certainty in their “correctness,” and adjusts the strength of its heuristics, intuition, or decision frameworks based on direct feedback to those ideas. Many of today’s implementations represent supervised learning, in which the machine needs to be trained, or taught, by humans. Users give feedback on the quality of the conclusions, and the system uses that feedback to tune its “thought process” and refine future hypotheses.
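As a rough illustration of that supervised feedback loop, the sketch below (assuming scikit-learn and entirely invented requests and labels) has the model propose an answer, receive a human correction, and fold that correction back into its future predictions.

```python
# A rough sketch of supervised learning with a human feedback loop.
# The requests, labels, and features are invented; scikit-learn is assumed.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**10)
model = SGDClassifier()
classes = ["approve", "review"]

# Initial training by human "teachers".
seed_texts = ["routine renewal low risk", "large claim missing documents"]
seed_labels = ["approve", "review"]
model.partial_fit(vectorizer.transform(seed_texts), seed_labels, classes=classes)

def human_feedback(text, correct_label):
    """The machine proposes an answer; the human's correction updates the model."""
    proposal = model.predict(vectorizer.transform([text]))[0]
    print(f"proposed: {proposal}; human says: {correct_label}")
    model.partial_fit(vectorizer.transform([text]), [correct_label])

human_feedback("routine renewal complete documents", "approve")
human_feedback("large claim unusual billing pattern", "review")
```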
Another important component of cognitive computing is natural language processing (NLP), or the ability to parse and understand unstructured data and conversational requests. NLP allows more data from more sources to be included in an analysis—allowing raw text, handwritten content, email, blog posts, mobile and sensor data, voice transcriptions, and more to be included as part of the learning. This is essential, especially because the volume of unstructured data is growing by 62 percent each year6 and is expected to reach nine times the volume of structured data by 2020.7 Instead of demanding that all information be scrubbed, interpreted, and translated into a common format, the hypothesis and confidence engines actively learn associations and the relative merits of various sources.
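One minimal way to picture how unstructured text joins the analysis is sketched below; it assumes pandas and scikit-learn, and the patient records, notes, and vocabulary are illustrative. Real NLP pipelines go far beyond counting phrases, but the principle of turning raw text into analyzable signals alongside structured fields is the same.

```python
# A minimal sketch of folding unstructured notes into an analysis next to
# structured fields. The records, notes, and vocabulary are illustrative;
# pandas and scikit-learn are assumed.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer

structured = pd.DataFrame(
    {"patient_id": [101, 102], "age": [54, 37], "claims_last_year": [3, 1]}
)
notes = [
    "patient reports chest pain and shortness of breath",
    "mild seasonal allergies, no medication changes",
]

# Turn free text into countable signals (a simple bag of phrases here).
vectorizer = CountVectorizer(
    vocabulary=["chest pain", "shortness of breath", "allergies"],
    ngram_range=(1, 3),
)
text_features = pd.DataFrame(
    vectorizer.transform(notes).toarray(),
    columns=vectorizer.get_feature_names_out(),
)

# Structured and unstructured signals side by side, ready for analysis.
combined = pd.concat([structured, text_features], axis=1)
print(combined)
```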
NLP can also simplify a person’s ability to interact with cognitive systems. Instead of forcing end users to learn querying or programming languages, cognitive computing allows spoken, natural exploration. Users can ask, “What are the sales projections for this quarter?” instead of writing complicated lookups and joins against databases and schemas.
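The sketch below illustrates the idea in the simplest possible terms: a single hand-written pattern stands in for a full natural-language layer and translates the spoken-style question into SQL against an in-memory sqlite table. The table, columns, and question pattern are assumptions made for the example.

```python
# A toy illustration of answering a spoken-style question with SQL underneath.
# A real system would parse intent and entities; this version recognizes one
# question shape. The table, data, and pattern are invented for the example.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projections (quarter TEXT, region TEXT, sales REAL)")
conn.executemany(
    "INSERT INTO projections VALUES (?, ?, ?)",
    [("2014-Q2", "EMEA", 1.4), ("2014-Q2", "APAC", 2.1)],
)

def answer(question: str):
    if re.search(r"sales projections .* this quarter", question, re.IGNORECASE):
        return conn.execute(
            "SELECT region, sales FROM projections WHERE quarter = ?",
            ("2014-Q2",),
        ).fetchall()
    return "Sorry, I don't understand that question yet."

print(answer("What are the sales projections for this quarter?"))
```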
Finally, cognitive computing depends on low-cost processing power and storage delivered at scale. That’s because it requires massively parallel processing, which allows different sets of data from different sources to be explored at the same time. It also requires places where massive amounts of data can be continuously collected and analyzed. Options include the cloud, large appliances and high-end servers, and distributed architectures that allow work to be mapped across a large collection of lower-end hardware and the partial results reduced back into a single answer.
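A toy version of that map-and-reduce division of labor is sketched below, using Python’s multiprocessing module on a handful of invented documents; a production system would spread the same pattern across many machines rather than local processes.

```python
# A toy map-and-reduce: counting terms across documents in parallel processes,
# then combining the partial counts. The documents are invented; a production
# system would spread the same pattern across many machines.
from collections import Counter
from functools import reduce
from multiprocessing import Pool

documents = [
    "claims claims outage sensor",
    "sensor outage outage",
    "claims sensor sensor sensor",
]

def map_counts(doc: str) -> Counter:
    """Map step: count terms in one document."""
    return Counter(doc.split())

if __name__ == "__main__":
    with Pool() as pool:
        partial_counts = pool.map(map_counts, documents)  # map, in parallel
    totals = reduce(lambda a, b: a + b, partial_counts)   # reduce
    print(totals.most_common())
```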
Cognitive analytics is the application of these technologies to enhance human decisions. It takes advantage of cognitive computing’s vast data-processing power and adds channels for data collection (such as sensing applications) and environmental context to provide practical business insights. If cognitive computing has changed the way in which information is processed, cognitive analytics is changing the way information is applied.
The breakthrough could not have come at a better time. As more human activity is being expressed digitally, data forms continue to evolve. Highly structured financial and transactional data remain at the forefront of many business applications, but the rise of unstructured information in voice, images, social channels, and video has created new opportunities for businesses to understand the world around them. For companies that want to use this information for real-time decision making, cognitive analytics is moving to center stage. It is both a complement to inventorying, cleansing, and curating ever-growing decision sources and a means for machine learning at Internet speed and cloud scale to automatically discover new correlations and patterns.
Cognitive analytics is still in its early stages, and it is by no means a replacement for traditional information and analytics programs. However, industries wrestling with massive amounts of unstructured data or struggling to meet growing demand for real-time visibility should consider taking a look.
A multinational consumer goods company wanted to evaluate new designs for its popular men’s personal care product. The company had sizeable market share, but its competitors were consistently developing and marketing new design features. To remain competitive, the company wanted to understand which features consumers valued.
Thousands of testers filled out surveys regarding the company’s new product variant. Although some of the survey results were quantitative (“Rate this feature on a scale from 1–5”), many were qualitative, free-form text (“Other comments”). This produced more text than humans could process efficiently and accurately.
The company used Luminoso’s text analytics software to analyze the responses by building a conceptual matrix of the respondents’ text—mapping the raw content onto subject and topic matters, statistical relationships, and contexts that were relevant to the business. Luminoso’s Insight Engine identified notable elements and patterns within the text, and measured the emotional and perceived effects of the product’s design and functionality.
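Luminoso’s engine is proprietary, so the sketch below is only a generic illustration of the underlying idea: free-form comments are mapped into a respondent-by-term matrix and grouped into recurring themes (here with scikit-learn’s TF-IDF and non-negative matrix factorization, on invented comments).

```python
# Luminoso's engine is proprietary; this is only a generic sketch of mapping
# free-form comments into a respondent-by-term matrix and grouping them into
# themes. The comments and the number of themes are invented.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

comments = [
    "love the new blue color, feels premium",
    "the grip is comfortable but the color is dull",
    "the blades are great, color could be brighter",
    "comfortable grip, easy to hold",
]

tfidf = TfidfVectorizer(stop_words="english")
matrix = tfidf.fit_transform(comments)  # respondents x terms

themes = NMF(n_components=2, random_state=0).fit(matrix)
terms = tfidf.get_feature_names_out()
for i, component in enumerate(themes.components_):
    top_terms = [terms[j] for j in component.argsort()[-3:][::-1]]
    print(f"theme {i}: {', '.join(top_terms)}")
```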
The discoveries were impressive and surprising. The company rapidly identified the design features important to consumers, which mapped closely to the numerical ratings testers had assigned. Unexpectedly, the product’s color strongly affected how emotionally attached a tester was to his product. When writing freely, testers frequently mentioned color’s significance to the product experience, but when faced with specific questions, they spoke only to the topic at hand. The company also found that the color findings held even among testers who did not specifically mention color.
Finally able to quantify color preference, the company conducted a study to select the preferred color. The product is now on the shelves of major supermarkets and convenience stores—in a new color, and selling more units.
Some of the building blocks of cognitive analytics have found homes in our pockets and purses. Intelligent personal assistants such as Apple’s Siri, Google Now, and Microsoft Cortana use natural language processing, predictive analytics, machine learning, and big data to provide personalized, seemingly prescient service. These are examples of complex technologies working together behind a deceptively simple interface—allowing users to quickly and easily find the information they need through conversational commands and contextual prompts based on location, activity, and a user’s history.
Such programs are first steps toward harnessing cognitive analytics for personal enhanced decision making. For example, Google Now can check your calendar to determine that you have a dentist appointment, or search your communication history to know that you are seeing a movie—contextually determining your destination.8 It can then use GPS to determine your current location, use Google Maps to check traffic conditions and determine the best driving route, and set a notification to let you know what time you should leave. And these systems are only getting better, because the programs can also learn your behaviors and preferences over time, leading to more accurate and targeted information.
In 2011, WellPoint, one of the nation’s largest health benefits companies, set out to design a world-class, integrated health care ecosystem that would link data on physical, financial, worksite, behavioral, and community health. By establishing a singular platform, WellPoint could enhance its ability to collaborate, share information, automate processes, and manage analytics. To do this, WellPoint needed an advanced solution, and therefore teamed with IBM to use the capabilities of Watson—IBM’s cognitive computing system.
“We decided to integrate our health care ecosystem to help our care management associates administer member benefits, while providing a seamless member experience and working to reduce costs,” said Gail Borgatti Croall, SVP of Care Management at WellPoint. “Cognitive analytics was important in creating a system that could drive effectiveness and efficiencies throughout our business.”
Today, WellPoint uses cognitive analytics as a tool for utilization management:9 specifically, in reviewing pre-authorization treatment requests—decisions that require knowledge of medical science, patient history, and the prescribing doctor’s rationale, among other factors. With its ability to read free-form textual information, Watson can synthesize huge amounts of data and create hypotheses on how to respond to case requests. In fact, WellPoint already has “taught” its cognitive engine to recognize medical policies and guidelines representing 54 percent of outpatient requests.
“It took us about a year to train our solution on our business, and the more we taught the faster the Watson cognitive platform learned,” said Croall. “Now it’s familiar with a huge volume of clinical information and professional literature. This reduces a significant amount of time needed for nurses to track down and assess the variables when making a well-informed decision on an authorization request.”
For each case reviewed, the system provides nurses with a recommendation and an overall confidence and accuracy rating for that recommendation. In some outpatient cases, the system can already auto-approve requests, reducing the timeframe for patient treatment recommendations from 72 hours to near-real time. As the cognitive system develops its knowledge base, the accuracy and confidence ratings will continue to rise, and the ability to approve greater numbers and types of cases in real time will become a reality.
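The routing logic implied here can be pictured as a simple confidence threshold: recommendations that clear it are auto-approved, and everything else goes to a nurse. The sketch below is illustrative only; the threshold value and cases are invented, not WellPoint’s actual policy.

```python
# A sketch of confidence-based routing: auto-approve when the recommendation
# clears a threshold, otherwise queue for nurse review. The threshold and
# cases are invented, not an actual policy.
from dataclasses import dataclass

AUTO_APPROVE_THRESHOLD = 0.95  # illustrative; the real cutoff is a policy decision

@dataclass
class Recommendation:
    case_id: str
    action: str        # e.g., "approve" or "deny"
    confidence: float  # the system's self-assessed confidence, 0 to 1

def route(rec: Recommendation) -> str:
    if rec.action == "approve" and rec.confidence >= AUTO_APPROVE_THRESHOLD:
        return "auto-approved"
    return "queued for nurse review"

for rec in [
    Recommendation("A-1", "approve", 0.98),
    Recommendation("A-2", "approve", 0.81),
    Recommendation("A-3", "deny", 0.97),
]:
    print(rec.case_id, route(rec))
```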
Furthermore, nurses have seen a 20 percent improvement in efficiency in specific workflows thanks to the one-stop-shop nature of the integrated platform. The platform will not only create efficiency savings but also improve the speed of response to provider requests.
WellPoint’s use of cognitive analytics for utilization management represents the tip of the iceberg. Its integrated health care ecosystem is a multiyear journey that the company approaches with iterative, small releases, keeping the effort on time and on budget. In the future, WellPoint may look into how the system can support identification and stratification for clinical programs or many other applications.
“We’d like to see how our system can support a more holistic, longitudinal patient record—for example, integrating electronic medical record (EMR) data with claims, lab, and pharmacy data,” said Croall. “We also see opportunities on the consumer side. Imagine using cognitive insights to create an online, interactive model that helps you, as a patient, understand treatment options and costs. We’ve barely scratched the surface with our cognitive analytics capabilities. It truly will change the way we perform utilization management and case management services.”
Each year, thousands of safety-related events occur around the world at nuclear power plants.10 The most severe events make headlines because of disastrous consequences including loss of life, environmental damage, and economic cost. Curtiss-Wright, a product manufacturer and service provider to the aerospace, defense, oil and gas, and nuclear energy industries, examines nuclear safety event data to determine patterns. These patterns can be used by energy clients to determine what occurred during a power plant event, understand the plant’s current status, and anticipate future events.11
Curtiss-Wright is taking its analysis a step further by developing an advanced analytics solution. The foundation of this solution is Saffron Technology’s cognitive computing platform, a predictive intelligence system that can recognize connections within disparate data sets.12 By feeding the platform structured operational metrics and decades of semi-structured nuclear event reports, Curtiss-Wright aims to foresee future issues and provide response recommendations for evolving situations.13 Ultimately, the company hopes to improve nuclear safety with a solution that not only enables energy companies to learn from the past but also gives them the opportunity to prepare for the future.
In 2011, I was given the opportunity to lead IBM’s Watson project and build a business around it. I am passionate about the process of “presentations to products to profits,” so this endeavor really excited me. The first decision I had to make was which markets and industries we should enter. We wanted to focus on information-intensive industries where multi-structured data are important to driving better decisions. Obvious choices such as insurance, health care, telecom, and banking were discussed. We chose to first focus on health care: a multitrillion-dollar industry in which our technology could help improve the quality of care delivered, drive toward significant cost reduction, and have a positive impact on society. In 2012, we reduced the footprint of our Watson system—then the size of a master bedroom—to a single server and took our first customer into production.
To be successful with cognitive computing, companies should be able to articulate how they will make better decisions and drive better outcomes. Companies will struggle if they approach it from the “technology in” angle instead of “business out.” The technology is no doubt fundamental but should be coupled with business domain knowledge—understanding the industry, learning the theoretical and practical experience of the field, and learning the nuances around a given problem set.
For example, in the health care industry, there are three primary aspects that make Watson’s solution scalable and repeatable. First, Watson is being trained by medical professionals to understand the context of the relevant health area and can present information in a way that is useful to clinicians. Second, when building the tools and platform, we created a model that can be reconfigured to apply to multiple functions within the industry so that learnings from one area can help accelerate mastery in related fields. Third, the delivery structure is scalable—able to tackle problems big or small. The more it learns about the industry, the better its confidence in responding to user questions or system queries and the quicker it can be deployed against new problems. With Watson for contact center, we aim to train the system for a new task in six weeks, with a goal of achieving business “break even” in six months.
However, cognitive computing may not always be the right solution. Sometimes businesses should start by improving and enhancing their existing analytics solutions. Companies considering cognitive computing should select use cases that will generate value and that have a roadmap compelling enough to “starburst” into additional scenarios that truly move the needle.
In terms of the talent needed to support cognitive solutions, I liken this to the early stages of the Internet and web page development, when people worried about the lack of HTML developers. Ultimately, systems arose to streamline the process and reduce the skill set required. With Watson, we have reduced the complexity required to do this type of work by a factor of 10 to 15 compared with where we were when we first started, and recent startups will continue to drive the curve down. So less-specialized people will be able to complete more complex tasks—PhDs and data scientists won’t be the only ones capable of implementing cognitive computing.
There are three things I consider important for an effective cognitive computing solution: C-suite buy-in to the vision of transforming the business over a 3–5 year journey; relevant use cases and a roadmap that are likely to lead to a compelling business outcome; and the content and talent to drive the use case and vision. If you approach a project purely from a technology standpoint, it will become a science project, and you can’t expect it to drive value.
Rather than having a team of data scientists creating algorithms to understand a particular business issue, cognitive analytics seeks to extract content, embed it into semantic models, discover hypotheses and interpret evidence, provide potential insights—and then continuously improve them. The data scientist’s job is to empower the cognitive tool, providing guidance, coaching, feedback, and new inputs along the way. As a tool moves closer to being able to replicate the human thought process, answers come more promptly and with greater consistency. Here are a few ways to get started:
Start small. It’s possible to pilot and prototype a cognitive analytics platform at low cost and low risk of abandonment using the cloud and open-source tools. A few early successes and valuable insights can make the learning phase also a launch phase.
Plant seeds. Analytics talent shortages are exacerbated in the cognitive world. The good news? Because the techniques are so new, your competitors are likely facing similar hurdles. Now is a good time to invest in your next-generation data scientists, anchored in refining and harnessing cognitive techniques. And remember, business domain experience is as critical as data science. Cast a wide net, and invest in developing the players from each of the disciplines. Consider crowdsourcing talent options for initial forays.14
Tools second. The tools are improving and evolving at a rapid pace, so don’t agonize over choices, and don’t overcommit to a single vendor. Start with what you have, supplement with open-source tools during the early days, and continue to explore the state of the possible as tools evolve and consolidate.
Context is king. Quick answers and consistency depend on more than processing power. They also depend on context. By starting with deep information for a particular sector, a cognitive analytics platform can short-circuit the learning curve and get to high-confidence hypotheses quickly. That’s why the machinery of cognitive computing—such as Watson from IBM—is rolling out sector by sector. Early applications involve health care management and customer service in banking and insurance. Decide which domains to target and begin working through a concept map—part entity and explicit relationship exercise, part understanding of influence and subtle interactions (a minimal sketch follows these suggestions).
Don’t scuttle your analytics ship. Far from making traditional approaches obsolete, cognitive analytics simply provides another layer—a potentially more powerful layer—for understanding complexity and driving real-time decisions. By tapping into broader sets of unstructured data such as social monitoring, deep demographics, and economic indicators, cognitive analytics can supplement traditional analytics with ever-increasing accuracy and speed.
Divide and conquer. Cognitive analytics initiatives can be broken into smaller, more accessible projects. Natural language processing can be an extension of visualization and other human-computer interaction efforts. Unstructured data can be tapped as a new signal in traditional analytics efforts. Distributed computing and cloud options for parallel processing of big data don’t require machine learning to yield new insights.
Know which questions you’re asking. Even modest initiatives need to be grounded in a business “so what.” An analytics journey should begin with questions, and the application of cognitive analytics is no exception. The difference, however, lies in the kinds of answers you’re looking for. When you need forward-looking insights that enable confident responses, cognitive analytics may be your best bet.
Explore ideas from others. Look outside your company and industry at what others are doing to explore the state of the possible. Interpret it in your own business context to identify the state of the practical and valuable.
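As noted under “Context is king,” a concept map can start out very simply: explicit entities and the relationships between them. The sketch below captures a few invented health-plan entities in a plain Python graph as a starting point for that exercise.

```python
# A starting point for the concept-map exercise under "Context is king":
# explicit entities and relationships captured as a plain graph. The domain,
# entities, and relations below are invented.
from collections import defaultdict

concept_map = defaultdict(list)

def relate(subject: str, relation: str, obj: str) -> None:
    concept_map[subject].append((relation, obj))

relate("pre-authorization request", "references", "medical policy")
relate("pre-authorization request", "submitted by", "provider")
relate("medical policy", "covers", "outpatient procedure")
relate("provider", "treats", "member")

def neighbors(entity: str):
    """List the explicit relationships recorded for one entity."""
    return concept_map.get(entity, [])

for relation, obj in neighbors("pre-authorization request"):
    print(f"pre-authorization request --{relation}--> {obj}")
```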
As the demand for real-time support in business decision making intensifies, cognitive analytics will likely move to the forefront in high-stakes sectors and functions: health care, financial services, supply chain, customer relationship management, telecommunications, and cyber security. In some of these areas, lagging response times can be a matter of life and death. In others, they simply represent missed opportunities.
Cognitive analytics can help address some key challenges. It can improve prediction accuracy, provide augmentation and scale to human cognition, and allow tasks to be performed more efficiently (and automatically) via context-based suggestions. For organizations that want to improve their ability to sense and respond, cognitive analytics offers a powerful way to bridge the gap between the promise of big data and the reality of practical decision making.