Decoding the path to purchase: Using autonomous analytics for customer mapping
Given the dizzying amounts of data hurtling across cyberspace each second, companies that wish to glean customer insights through analytics should explore adding autonomous analytics to their sales and marketing arsenal along with plain old intuition and artisanal analytics.
You could be forgiven for being overwhelmed by customer data. Every minute, for example:
- 3 million Google search queries are run
- 140 million emails are sent
- 5 million YouTube videos are viewed
- 300,000 Facebook users update their status
- 400,000 tweets are posted on Twitter
- Amazon.com sells 25,000 items¹
It’s likely that some of that activity involves your customers and your company. But translating all of it into implications for the customer experience is a dizzying prospect.
In the old days of database marketing and customer segmentation, we practiced what might be called “artisanal analytics.” The bulk of our activities involved generating queries and reports on what our customers had done in the past. On the rare occasions when we employed advanced analytics, we handcrafted our prediction or clustering equations based on human hypotheses and intuitions about what might be going on in the data. When the models didn’t fit quite as well as we liked, we tinkered with them over time (often lots of it) until they got better.
While some organizations and analysts still employ the artisanal approach, the current flood of data about customer behavior and interests makes artisanal analytics seem almost quaint, and certainly not adequate for understanding today’s customer experience. The data arrive far too quickly, and in far too great a volume, to allow for handcrafted analytics. If we want to respond to the almost infinite variations in customer attributes and preferences, we need a different approach.
When artisanal analytics can’t do the job, we have to adopt a new set of more autonomous analytical methods. These take a variety of forms, but the most common is machine learning. Machine learning uses algorithms that automatically try alternative variables (called “features” in machine learning) in combination, creating multiple models that optimize fit to the data. Often the models are trained on data for which we already know the outcome on the dependent variable of choice (for example, we know whether the customer actually bought something from us). Some versions of machine learning use traditional regression analysis, but they can create many models to fit many different data segments. Other forms use different statistical and mathematical model types. Neural networks and deep learning networks, for example, are another type of machine learning that can be used to categorize images and sounds.
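To make the supervised-learning idea above concrete, here is a minimal sketch of training a propensity model: a logistic regression fit by gradient descent on labeled purchase data. The feature names and training data are entirely hypothetical, and the hand-rolled regression stands in for whatever algorithm a real system would use.

```python
import math

def train_propensity_model(rows, labels, lr=0.1, epochs=500):
    """Fit a logistic-regression propensity model by simple gradient descent.

    rows: list of feature vectors (one per customer)
    labels: 1 if that customer bought, 0 if not (the known outcome)
    """
    n = len(rows[0])
    w = [0.0] * n  # one weight per feature
    b = 0.0        # intercept
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted propensity to buy
            err = p - y
            for i in range(n):
                w[i] -= lr * err * x[i]
            b -= lr * err
    return w, b

def score(w, b, x):
    """Propensity score (0..1) for a new customer's feature vector."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [site_visits_last_month, past_purchases]
X = [[0, 0], [1, 0], [8, 2], [10, 3], [2, 0], [9, 1]]
y = [0, 0, 1, 1, 0, 1]  # known outcomes the model is trained on

w, b = train_propensity_model(X, y)
print(score(w, b, [9, 2]))  # engaged customer: score well above 0.5
print(score(w, b, [1, 0]))  # disengaged customer: score well below 0.5
```

The point of the sketch is the division of labor: the analyst supplies labeled history and candidate features; the algorithm, not human intuition, finds the weights that best fit the data.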
Machine learning has a number of advantages compared to artisanal analytics, and some disadvantages. The primary advantage is that it greatly improves the productivity of quantitative analysis. If there are thousands of different customer segments at your company, machine learning can define them and create models to predict what they will buy.
One challenge with machine learning is that it requires a large amount of data. Its outputs are also not easily interpreted; they are generally something of a “black box.” Managers may therefore be reluctant to implement models they have no intuitive understanding of. This is a clear shortcoming of machine learning, but for many organizations the transparency may be worth sacrificing in order to deal with the massive amount of data. The tradeoff is particularly attractive when the cost of a suboptimal decision is relatively low, as with digital advertising.
Autonomous analytics at Cisco Systems
About a decade ago, a small analytics group within Cisco’s Strategic Marketing Organization began to help salespeople decide which accounts to focus on when selling particular products. Using artisanal analytics, a small team of analysts handcrafted models to predict what customers would be likely to buy. There were only a few of these “propensity models,” which meant that only broad attributes of customers (large company versus small business, for example) could be considered in them.
About five years ago, as machine learning began to be practical in standard computing environments, Cisco began to generate many more models using the approach. By 2014, the company was generating about 25,000 propensity models a quarter, using data on 160 million businesses around the world. Because of the industrial scale of the modeling, Cisco began to refer to the approach as a “propensity-to-buy factory.”
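The “factory” pattern described above amounts to training one small model per product-and-segment combination rather than a few broad ones. The sketch below illustrates the structure only; the product names, segment labels, and the trivial stand-in trainer (which just computes a segment’s base purchase rate) are all hypothetical, not Cisco’s actual method.

```python
from collections import defaultdict

def train_segment_model(rows, labels):
    """Stand-in trainer: returns the segment's base purchase rate.
    A real factory would fit a full propensity model per segment."""
    return sum(labels) / len(labels)

def build_factory(records):
    """records: iterable of (product, segment, features, bought) tuples.
    Groups the data and trains one model per (product, segment) pair."""
    grouped = defaultdict(lambda: ([], []))
    for product, segment, features, bought in records:
        rows, labels = grouped[(product, segment)]
        rows.append(features)
        labels.append(bought)
    return {key: train_segment_model(rows, labels)
            for key, (rows, labels) in grouped.items()}

# Hypothetical quarterly training data
data = [
    ("router", "smb", [1], 0), ("router", "smb", [2], 1),
    ("router", "enterprise", [5], 1), ("router", "enterprise", [6], 1),
    ("switch", "smb", [1], 0), ("switch", "smb", [1], 0),
]
models = build_factory(data)
print(len(models))  # → 3: one model per product x segment combination
print(models[("router", "enterprise")])
```

At industrial scale the loop is the same; the quarter-long retraining bottleneck Cisco hit comes from running tens of thousands of these per-segment fits, which is why faster hardware and parallel training paid off so directly.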
By 2016, the factory was generating 60,000 models a quarter. The more models employed, the more granular and accurate the analysis with regard to specific products, geographies, and business characteristics. The marketing analytics team comprised about 20 people, and they were spread thinly across the many models. Cisco’s computing capacity was also strained by this point. The sales force had come to depend on the propensity models, and it grew frustrated that training and scoring so many models consumed almost a month of each quarter. Then Cisco adopted new technology, an in-memory server cluster running open-source machine learning software, that sped up the analysis 15-fold. The work now takes a matter of hours, and Cisco is able to use a variety of machine learning algorithms. Depending on the situation, Cisco sees results three to seven times better than those achieved without the propensity models.
Autonomous analytics and the customer experience
Cisco uses autonomous propensity models for sales and marketing; other companies use “programmatic” digital advertising; and still others use the technology for fine-grained, accurate recommendation engines. Some firms are even beginning to use “deep learning” systems to recognize, classify, and act on customer photos from social media postings. Every aspect of the customer experience that involves large volumes of data and the need for a personalized approach is a potential candidate for autonomous analytics and machine learning.
None of this means, of course, that we no longer need to develop deep intuitive understandings of what customers want, or that we don’t have to develop creative ideas for marketing and selling to them. But the age of purely intuitive approaches to customers, and even of artisanal analytics for analyzing their data, is largely over. Vestiges of both may remain, but the companies that move rapidly to autonomous analytics to understand and structure the customer experience will be more successful in the marketplace.