
The CX Principles that can guarantee a successful program – Part 4

Written by Garreth Chandler | June 4, 2019

At The Evolved Group, there are a number of CX Principles we follow to ensure we build successful CX programs. Essentially these principles are divided into four important groups – employees and customers, program owners, stakeholders, and the leadership group – and I’ll explore them in detail over the course of this year.

This month’s focus is bringing issues to life – the second principle for program owners.

The best way to convey my thoughts on this principle is to go back to how things were, contrast them with how things are now, and consider the reasons for those changes and their implications.

A good place to start is the beginning of my career in insights (or market research as it was quaintly known back then). My job was as a telephone interviewer at Reark Research, a company at the apogee of the research industry in the 1990s. At the time it delivered the largest single CX program of its day to Telstra – which at that stage was coming to terms with a new player in the field called Optus.

The Telstra customer research program was called TELCATS. Many things were different back then. For a start there was no such thing as CX or insights – these are relatively new terms. The discipline was called Customer Satisfaction. The internet was in its infancy, so all surveys were completed by telephone …and there were a lot of them. I can still recite the surveys we ran almost verbatim. They spanned every aspect of being a Telstra customer and every component was known by an acronym. For example, Operator Assisted Services (OAS) and Call Quality Monitor (CQM).

These days each component would be called an episode and our goal would be to create an understanding of the customer journey – how the accrued experiences of customers influenced loyalty and spend. Back then, such a view was Nirvana (the place, not the band who were really big at the time).

Piled around the office like white stalagmites were towering stacks of paper called ‘computer tables’. Printed on spooled A3 paper from a noisy printer that ran ceaselessly, they explained why so many of the older generation of researchers are myopic. Each computer table was a series of cross tabulations of the survey questions (e.g. overall satisfaction) and a banner (a categorical variable such as customer age, gender, product type, etc.). Typical tables were 300 or more pages long depending on the number of questions and the length of the banner. The researcher’s job was to pore over these tables to find significant differences.

Other people in the ‘coding team’ (often aspiring researchers who looked at the tables with envy) read thousands of open-ended comments from people who were asked why they said they were satisfied or dissatisfied and coded them into a code frame. This allowed us to say 23% were dissatisfied because the call took too long to answer.

The process of reporting worked through four steps (a rough code sketch of the analysis follows the list):

1. What & When

Representing the results by calling out key insights, supported by engaging data visualisation, e.g. 83% of customers were satisfied, a 2% increase over the last wave

2. Who

Segmenting the data and identifying sub-groups with divergent experiences, e.g. dissatisfied customers were more likely to be in NSW than Victoria

3. Why

Identifying root causes based on service delivery processes and touchpoint delivery, e.g. they were 20% more likely to mention poor call quality in the evenings

4. What

Recommending and making improvements, e.g. improving call quality in NSW by repairing pits where water entry caused line crackling
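As promised above, here is a minimal sketch of steps 1–3 in modern terms, using pandas. The survey file and column names are invented purely for illustration and are not drawn from any real program.

```python
import pandas as pd

# Hypothetical survey extract, one row per respondent.
# File and column names are invented for illustration only.
df = pd.read_csv("survey_waves.csv")  # wave, state, satisfied (0/1), reason

# 1. What & when: headline satisfaction, this wave vs last
by_wave = df.groupby("wave")["satisfied"].mean().mul(100).round(1)
print(by_wave.tail(2))

# 2. Who: sub-groups with divergent experiences
latest = df[df["wave"] == df["wave"].max()]
by_state = latest.groupby("state")["satisfied"].mean().mul(100).round(1)
print(by_state.sort_values())

# 3. Why: coded reasons among dissatisfied customers
reasons = (latest.loc[latest["satisfied"] == 0, "reason"]
           .value_counts(normalize=True).mul(100).round(1))
print(reasons.head())  # e.g. "call took too long to answer" near the top
```

Back then, of course, every one of those numbers was produced by hand from paper tables.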

The results were compiled into a ring-bound Word report that ran for hundreds of pages, with an Executive Summary that attempted to make sense of what it all meant. The report was assembled with the help of chartists and typists and overseen by the Research Director, whose job it was to pull it all together. Kind of like Mad Men, but neither interesting nor fun.

Presentations sounded like a Gregorian chant – overall satisfaction has increased by 2%, satisfaction with time to answer has decreased by 1%, satisfaction among customers in NSW has increased.

Things changed and progressed, driven by technology and a general recognition that the way the industry worked was inefficient and not terribly effective at actually driving change. As it happened, I worked at Telstra about 10 years after this and recall a meeting in a colleague’s office where some of the reports we produced were sitting neatly on a shelf …gathering dust. Impressive, monumental and just another number in a place full of numbers.

Fast forward to now. The underlying goals of a good CX program haven’t really changed – I need to know how I am doing, why, and what I should do to improve.

However, the process of getting that information has improved vastly. Most research is now conducted online; results are available in real time and we can drill into the data to understand what people are saying with advanced text analytics. We can use sophisticated tools in our platform to identify cause and effect and determine the relationships between performance and customer experience. Our platform can even alert you to issues as soon as they manifest.

Even so, the ability to determine what it all means, at the highest level, is still largely a human endeavour. It requires understanding the business and the customers’ needs, and pulling it all together into a story that people can relate to and galvanise around. The ability to communicate the story with punch, and to gain buy-in and action, is both art and science. Most people who receive CX data are short on time and we are competing for their attention. We have to get our point across quickly, effectively and with a focus on what needs to happen.
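As a toy illustration of the alerting idea mentioned above, the sketch below flags a sharp drop in a daily satisfaction score against a rolling baseline. It is not how our platform actually works; the file name, columns and threshold are all invented for the example.

```python
import pandas as pd

# Naive illustration of issue alerting: flag a sharp drop against a rolling baseline.
# File, column names and threshold are invented for illustration only.
daily = (pd.read_csv("daily_satisfaction.csv", parse_dates=["date"])
         .sort_values("date").set_index("date"))

baseline = daily["satisfaction"].rolling("28D").mean()  # 28-day rolling average
latest_score = daily["satisfaction"].iloc[-1]

if latest_score < baseline.iloc[-1] - 5:  # assumed threshold: 5 points below baseline
    print(f"Alert: satisfaction {latest_score:.1f} vs baseline {baseline.iloc[-1]:.1f}")
```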

That 300-page Word report is now an online dashboard and the operations report is a snappy 10-page slide deck. The presentation is now a workshop, and less time is spent on why (good insights providers are trusted to get it right) and more on what. The mantra today is what, so what, now what.

What
The challenge here is to convey a lot of information to people in a way that is not boring, and which allows us to drill into the interesting and useful bits quickly.

The key tool is the online dashboard. We design online dashboards using Human-Focused Design principles. They work like the dashboard of a plane: tell me what I need to know and tell me what’s happening, using the simplest and most direct means available. This level of data is consumed by operational roles and is an important part of the concept of closing the loop – I need to know how I did last week so that I can determine what impact my work is having on the customer.

At the next level up, understanding what is happening at a strategic level is the task of data integration. Most customer relationships span multiple experiences – active and reactive, at a point in time and over time. To truly understand the customer, we need to see how they experience the relationship across all touchpoints and over time. This is where an application like HumanListening’s Data Wrangler is so important. It meshes internal data and data from multiple surveys into a coherent picture of the real customer experience, and of cause and effect within it. We still seek to determine who is satisfied and why, but doing so can now be largely automated and data-driven.
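To make the idea concrete, the sketch below shows the kind of joining involved, written in plain pandas rather than Data Wrangler itself; the file and column names are invented for illustration.

```python
import pandas as pd

# Sketch of the data-integration idea in plain pandas. This is not Data Wrangler's
# interface; the file and column names are invented for illustration.
billing  = pd.read_csv("billing.csv")          # customer_id, tenure_months, monthly_spend
contacts = pd.read_csv("contact_centre.csv")   # customer_id, calls_last_90d
surveys  = pd.read_csv("episode_surveys.csv")  # customer_id, episode, satisfaction

# One integrated view: survey outcomes joined to operational context
customer_view = (surveys
                 .merge(billing, on="customer_id", how="left")
                 .merge(contacts, on="customer_id", how="left"))

# A first look at cause and effect: experience by contact-centre load
bands = pd.cut(customer_view["calls_last_90d"], bins=[0, 1, 3, 10])
print(customer_view.groupby(bands, observed=True)["satisfaction"].mean())
```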

So what
The next layer is to understand return on investment – this is the so what. So what if customers have poor call quality? Whether it matters can depend on what they are paying and on their expectations. Ultimately, its impact and importance can be understood in terms of the behavioural effect on the customer relationship. This ability is developed by linking financial data and customer experience data so that we can understand the commercial impact both now and in the future. Businesses run lean, and decisions on investment need to be supported by an understanding of the relative ROI. It is par for the course now to look at results and understand their impact in terms of conversion. If I do this, will it increase satisfaction, which will increase advocacy and decrease churn? How much will we benefit?
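A back-of-envelope version of that “how much will we benefit” question might look like the sketch below. Every figure in it is an invented assumption; in a real program the links between satisfaction, churn and revenue are estimated from linked CX and financial data.

```python
# Back-of-envelope "so what" calculation. Every number here is an invented
# assumption for illustration, not a benchmark.
customers = 500_000
avg_annual_revenue = 720.0            # $ per customer per year
baseline_churn = 0.14                 # annual churn rate

sat_lift_points = 5                   # assumed satisfaction lift from the investment
churn_cut_per_point = 0.002           # assumed: each satisfaction point cuts churn by 0.2 pp

new_churn = baseline_churn - sat_lift_points * churn_cut_per_point
customers_retained = customers * (baseline_churn - new_churn)
annual_benefit = customers_retained * avg_annual_revenue

investment = 1_500_000                # assumed cost of the improvement
print(f"Retained customers: {customers_retained:,.0f}")
print(f"Annual benefit: ${annual_benefit:,.0f}  ROI: {annual_benefit / investment:.1f}x")
```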

Now what
Communicating the story of what and so what is the foundation for now what. A big part of our work as consultants is providing good advice on how our clients can drive improved customer experience. The underlying principles that were in play 20 years ago really have not changed. Generally, driving positive change from CX data is about identifying either a failure of process or a failure in the application of a process. The details that help us diagnose which is which sit in the open-ended feedback. Our use of conversational chatbots is pivotal to eliciting that feedback, prompting customers to reflect more deeply on their experiences, and better text analytics let us describe the causes.

Our HumanListening technology empowers us as never before. If a customer says they had a poor experience onboarding a new product because of the form, our chatbot technology probes to identify the process, the form and the specific issue – e.g. a lack of clear instructions or poor formatting. Our analytics captures and consolidates similar feedback and estimates the impact it is having on customer experience. We can tie a specific issue to its exact cause and its exact outcome. This is modern insights.
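As a simplified illustration of that consolidation step (plain pandas, not our actual analytics, with invented column names and issue labels):

```python
import pandas as pd

# Illustration of consolidating probed feedback and sizing its impact.
# File, columns and the issue label are invented for illustration only.
fb = pd.read_csv("probed_feedback.csv")  # customer_id, process, issue, satisfaction

# Consolidate: how often each specific issue is raised within each process
issue_counts = fb.groupby(["process", "issue"]).size().sort_values(ascending=False)
print(issue_counts.head())  # e.g. ("onboarding form", "unclear instructions") near the top

# Size it: satisfaction of customers who hit the issue vs everyone else
hit = fb["issue"] == "unclear instructions"
gap = fb.loc[hit, "satisfaction"].mean() - fb.loc[~hit, "satisfaction"].mean()
print(f"Satisfaction gap for this issue: {gap:.1f} points")
```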

Ultimately though, this level of detail still requires human-to-human interaction to package up the story. It isn’t about the form. It is about the narrative of the customer experience and how it is shaped by needs and expectations, and what happens when you deliver to them …or not.

 

Find out more

Check out our customer experience page to learn more.