When you consider how customers interact with organizations these days, it quickly becomes apparent that much of that interaction is through digital channels. “CX” suggests a customer experience via laptops or mobile devices, and that digital experience is driven entirely by data. The question is, how do we make it the most relevant and seamless experience possible, given the needs and objectives of the user, and what data can we leverage to do so? In addition to voice of the customer feedback through surveys and social media monitoring (which provide high-level themes), three principal ways of leveraging data can be used in order to create an excellent customer experience:
- The first approach is to validate core design choices by testing before deployment, and then continue to monitor user behaviors after deployment.
- The second approach is to use data from past transactions to adjust the experience according to equipment or products owned, segment, interests or demographic.
- The third approach is to use real-time data from interactions and responses, campaigns, and messaging to dynamically drive a personalized experience based on the user’s “digital body language.”
These three approaches are of increasing maturity and complexity. The first, designing and testing iteratively, is a reasonably well understood approach; however, certain nuances can make a meaningful difference. These include the actual phrasing of questions about the tasks being tested (“task phrasing”), terminology, and testing instructions. Web analytics should include a range of behavioral metrics as well as the contexts for those behaviors – the paths leading to actions such as bounces or abandoned carts, as well as actions that take place outside of the site. (For example, a customer who leaves a product detail page might do so to call a rep and place the order by phone. What looks like a bounce is really a conversion.)
The second approach, using enriched data about a customer’s profile – including industry, interests, role, or past purchases – can be applied either to creating preselected navigational paths tuned to that customer’s needs, or to presenting selections of products and content that persona research has supported. The downside lies in making assumptions that are not validated by testing, and in limiting choices incorrectly or unnecessarily. A mechanism for reverting to a less personalized experience should be evident to users (so they can make selections outside of what is being presented as “personalized”). I once purchased red shoes for my wife, and during subsequent searches, no matter what I was looking for, I was presented with red shoes (in case I had not bought enough red shoes and was still in the market). This rudimentary effort at personalizing my experience based on a past purchase appeared not only on the website itself but also in retargeting. Such actions can be very annoying to the user, since these repetitive promotions just get in the way of the current task.
Capturing and Acting on “Digital Body Language”
The third approach, using real time “digital body language,” is more complex and requires a level of maturity across multiple domains. In this case, a customer data platform or similar technology collects and harmonizes data from multiple touch points. Those signals inform content management systems and product information management systems to present offers and products in real time. This approach requires that content be broken up into reusable components that can be assembled into a template to change an offering, call to action, hero image or other piece of content. These combinations must be optimized over large numbers of experiments in order to get enough data to make accurate predictions.
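To make this concrete, here is a minimal sketch in Python – not tied to any particular CMS or CDP product – of assembling reusable content components into a page template and weighting experiment variants. The component names, variants, and weights are invented for illustration.

```python
# A minimal sketch (hypothetical component names and variants, not a real CMS API)
# of assembling reusable content components into a template, with the variant
# chosen according to weights learned from prior experiments.
import random

# Reusable content components, each with a version per experiment variant.
COMPONENTS = {
    "hero_image":     {"A": "hero_lab_equipment.png", "B": "hero_field_service.png"},
    "call_to_action": {"A": "Request a demo",         "B": "Download the buyer's guide"},
    "offer":          {"A": "10% off first order",    "B": "Free onboarding workshop"},
}

def assemble_page(variant: str) -> dict:
    """Fill each template slot with the component version for one variant."""
    return {slot: versions[variant] for slot, versions in COMPONENTS.items()}

def choose_variant(weights: dict[str, float]) -> str:
    """Pick a variant for this visitor; weights come from earlier experiments."""
    variants, probs = zip(*weights.items())
    return random.choices(variants, probs)[0]

# Variant B has performed slightly better so far, so it is shown more often.
page = assemble_page(choose_variant({"A": 0.4, "B": 0.6}))
print(page)
```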
Using this third approach also requires a model of the customer attributes that can signal interests, tastes, objectives, or derived metadata such as propensity to buy, loyalty score, churn risk, and lifetime value. Other, potentially hidden, attributes can be leveraged with machine learning algorithms such as Latent Dirichlet Allocation (LDA), a popular method for extracting relationships based on attributes that are less visible or apparent (at least to humans) and that become signals for surfacing content. Effective use of the customer model requires a nuanced understanding of customer needs expressed through detailed use cases and scenarios.
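As an illustration of the LDA technique mentioned above, the sketch below uses scikit-learn to extract latent topics from a handful of invented content descriptions; in practice the input would come from the content or product repository, and the resulting topic mixtures become additional signals on the content and customer models.

```python
# A minimal sketch of Latent Dirichlet Allocation surfacing latent ("hidden")
# themes from content descriptions. The sample texts are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "centrifuge rotor maintenance guide for lab managers",
    "buyer's guide to benchtop centrifuges and rotors",
    "case study: reducing churn with proactive field service",
    "field service scheduling best practices for equipment uptime",
]

vectorizer = CountVectorizer(stop_words="english")
term_matrix = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(term_matrix)   # per-document topic mixtures

# Top terms per latent topic become candidate signals for surfacing content.
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top = [terms[j] for j in weights.argsort()[-4:][::-1]]
    print(f"topic {i}: {top}")
```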
A sequence of actions by the customer becomes a set of signals – for example, a new user registration, followed by a search for a class of product, viewing a buyer’s guide, and then launching a product configurator are all registered as signals that can be acted upon. The identification of these signals as relevant is based on the expertise of a merchandizer or product manager. Another signal could be a response to an email campaign. The customer data platform (CDP) harvests these signals, which can be aligned with offering experiments on the website.
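A minimal sketch of this signal-matching pattern follows; the event names, the rule, and the resulting action are hypothetical stand-ins for what a merchandizer or product manager would actually define.

```python
# A minimal sketch of treating a customer's recent actions as an ordered set of
# signals and matching them against a merchandiser-defined pattern.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    required_signals: list[str]   # signals that must all appear, in order
    action: str                   # what to surface when the pattern matches

RULES = [
    Rule(
        name="ready_to_configure",
        required_signals=["registration", "search:centrifuge",
                          "view:buyers_guide", "open:configurator"],
        action="offer_live_chat_with_specialist",
    ),
]

def matches(events: list[str], required: list[str]) -> bool:
    """True if the required signals occur in order within the event stream."""
    it = iter(events)
    return all(signal in it for signal in required)

session = ["registration", "search:centrifuge", "view:pricing",
           "view:buyers_guide", "open:configurator"]
for rule in RULES:
    if matches(session, rule.required_signals):
        print(rule.name, "->", rule.action)
```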
With enough data, machine learning can derive rules for associating content and apply those derivations and variations to a wider range of use cases, extending the rules that were initially developed by product managers and customer persona experts. These more refined rules can improve the relevance of offers that are eventually made to the customer. ML algorithms both add derived rules to the ones initially developed by managers and provide more accurate predictions and better offers. The following table lists some of the requirements of such a solution.
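One simple way to derive such rules, sketched below with an invented training set, is to fit an interpretable model (here a small decision tree) to examples that managers have already tagged; the learned tree reads as a set of derived rules that can be reviewed before being applied to new audiences and content.

```python
# A minimal sketch of deriving offer-association rules from examples that were
# first curated by product managers. Features, labels, and data are invented.
from sklearn.tree import DecisionTreeClassifier, export_text

features = ["viewed_buyers_guide", "opened_configurator", "responded_to_campaign"]

# Each row: signal flags for one customer; label: the offer a manager associated.
X = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 1],
    [0, 1, 0],
    [1, 1, 1],
]
y = ["demo_request", "whitepaper", "whitepaper", "demo_request", "demo_request"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The fitted tree is itself a readable set of derived rules.
print(export_text(tree, feature_names=features))
```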
Getting into the Head of Your Customer: Understanding Their Mental Model
The objective of developing these models is to remove friction by simplifying and streamlining processes and to “reduce the cognitive load” on the user by surfacing specific content, products, or information as they need it. A great site takes the burden off the user, reducing the mental effort required to achieve their objective. When you go to a new website and love the experience, it’s because the site accurately replicates your “mental model” of what you want or need. Well-organized products or a great selection fit your way of thinking about the products you are looking for, your particular style, taste, or preferences. Things are organized in a way that makes sense – you know where to find things and can get what you need quickly and easily. The site gets your way of thinking – it (actually, the human designer of the site) understands and has organized information according to your mental model.
Scenarios, Use Cases, and User Tasks
No matter the level of maturity or capability, the prerequisite for any of these approaches is a detailed understanding of scenarios, use cases, and user tasks for a given type of user (represented by a persona – a representation of the role, with information about their background and preferences). These collectively provide insights about the user and their mental model – how they go about their tasks and how they think about the information they need. The typical pushback from program sponsors and engagement owners is “we can’t model all of that – it would be a monumental task – we can’t justify that effort – it isn’t worth it.” This reaction is usually based on a misinterpretation of how use cases are developed and extended, as well as a lack of understanding of the power of use cases.
The objective is to develop classes of use cases that represent large numbers of variations in tasks. You don’t need to model every task. A use case can represent hundreds or thousands of task variations. For example, a salesperson at an investment management firm working with institutional investors needs to retrieve recent thought leadership of interest to her client. This class of use case applies to an enormous number of task variations based on the topic, theme, audience, product, region, and other parameters. The objective of testing is to be sure the correct “about-ness” parameters (attributes) are being included so the salesperson can retrieve the specific item from a large pile of content.
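To show how a single use-case class can stand in for many task variations, here is a small sketch: one parameterized lookup whose “about-ness” attributes generate thousands of concrete tasks. The attribute names and sample content records are invented for illustration.

```python
# A minimal sketch of one use-case *class* parameterized by "about-ness"
# attributes: a single retrieval pattern covering many task variations.
from dataclasses import dataclass

@dataclass
class ThoughtLeadershipLookup:
    """Salesperson retrieves recent thought leadership relevant to a client."""
    topic: str | None = None
    audience: str | None = None
    region: str | None = None

    def matches(self, item: dict) -> bool:
        # An attribute left as None is unconstrained for this task variation.
        return all(
            value is None or item.get(attr) == value
            for attr, value in vars(self).items()
        )

CONTENT = [
    {"title": "Rate outlook Q3", "topic": "fixed income",
     "audience": "institutional", "region": "EMEA"},
    {"title": "ESG screening primer", "topic": "ESG",
     "audience": "institutional", "region": "APAC"},
]

# One class of use case, thousands of concrete variations by filling parameters.
task = ThoughtLeadershipLookup(topic="fixed income", audience="institutional")
print([c["title"] for c in CONTENT if task.matches(c)])
```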
Libraries of use cases become an asset of increasing value. They should be developed, expanded, and maintained over time. If properly managed, they become the gold standard against which design and user experience decisions are evaluated and scored. They represent accumulated knowledge of the needs, behaviors and preferences of users. The outcome is that they become the foundation for personalization capabilities.
Hand Crafted, Artisan Personalization vs Automated, Algorithm-Driven Personalization
Building customer-facing hierarchies for products on an ecommerce site (called “display hierarchies” as opposed to back-end product hierarchies that could be used for financial reporting and ERP systems) is a hand-crafted approach. Customer-facing hierarchies are based on human judgment and their development cannot be automated. Fine tuning display hierarchies for specific regions or customer segments is also a hand-crafted approach.
The disadvantage of hand-crafted approaches is that they do not scale well, so companies may want to transition to an automated approach. But how can they go from manual to automated development? These hand-crafted approaches form the foundation for algorithmically driven experiences. As this work reveals patterns and insights about user needs and behaviors, the patterns can be abstracted into more generalized principles that can then be automated. A process must be fully understood before it can be automated, however. Understanding how customers move through the sales funnel, make purchases, and consume content in support of their objective becomes a building block for automation. For example, the content or offer that is presented is informed by making the connection between a type of user, the activities that indicate they are in the early stage of investigation, and the “top of the funnel” content that will move them to the next stage of the buying cycle.
A CDP can consolidate signals indicating the customer is in the early stages of research and can then update attributes of the customer model indicating this stage of the customer journey. Now, content associated with that stage (and tagged as supporting that stage) can be presented. This is one way that a personalization algorithm can be trained – starting with manually curated content and tagging, and then extending to new audiences and content.
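Sketched below, with hypothetical stage names, signals, and tags, is the flow just described: consolidate signals, update a journey-stage attribute on the customer model, and then present content tagged as supporting that stage.

```python
# A minimal sketch of a CDP-style flow: signals -> journey stage -> stage-tagged content.
# Stage names, signal names, and tags are hypothetical.
EARLY_RESEARCH_SIGNALS = {"search:category", "view:buyers_guide", "download:whitepaper"}

def infer_stage(signals: set[str]) -> str:
    """Very rough stage inference: enough early-research signals => 'research'."""
    return "research" if len(signals & EARLY_RESEARCH_SIGNALS) >= 2 else "unknown"

CONTENT = [
    {"title": "Choosing the right centrifuge", "stage": "research"},
    {"title": "Volume pricing and financing options", "stage": "purchase"},
]

customer = {"id": "c-123", "signals": {"search:category", "download:whitepaper"}}
customer["journey_stage"] = infer_stage(customer["signals"])   # attribute update on the customer model

recommended = [c["title"] for c in CONTENT if c["stage"] == customer["journey_stage"]]
print(customer["journey_stage"], recommended)
```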
There are many details behind the various approaches for acting on customer signals, depending on the tech stack. The main takeaway is to begin with a deep understanding of customer needs across the lifecycle and of the types of information and content that will support their journey.
Efficiency versus Competitive Advantage
Standardization of data and processes leads to efficiencies, which is a positive thing. Many organizations follow the lead of their competitors and strive to align their naming and organization approaches with those that are typical of the industry. However, differentiation is the source of competitive advantage. Looking and acting like the competition means that the only differentiation is based on price – which is a race to the bottom. Customers will buy more and pay more if they get better service, easier processes, better selection, and a better overall experience. Harvesting data about user behavior and modeling their use cases and scenarios tells the organization how to provide the user with the best possible experience, which is the best differentiator for competitive advantage. The more an organization understands its customers, the better it can serve them. Harvesting, interpreting, and acting on what we know and continue to learn about the customer provides the building blocks for delivering a continually evolving customer experience.