Understanding Customer Effort Score (CES) & How to Measure It

Team MaestroQA
August 23, 2021

Should you stop trying to delight customers? That question seems absurd, especially for customer service leaders. But the answer might surprise you. Hear us out.

Making customer service interactions easier — not fast or delightful — is the key to keeping customers loyal and satisfied. And there’s data to prove it.

96% of customers who experience a high-effort customer service interaction become less loyal to the brand, compared with only 9% of those who have low-effort interactions, according to research from Gartner.

This insight is a wake-up call for CX teams. Coaching agents to remove obstacles and lower customer effort not only increases loyalty — it saves money and improves agent retention, too. The first step is knowing your customer effort score (CES).

What Is Customer Effort Score (CES)?

Customer effort score is a customer service metric that indicates how easy (or difficult) it is for a customer to resolve an issue or find the information they need from a brand. CES is influenced by factors like the duration of interactions, having to switch between agents, or the lack of quality information in a self-service hub.

CX teams determine CES by sending a single CES survey question to customers after they interact with a support agent, such as, “How much effort did it take to resolve your question today?”

The sooner the survey is sent after the interaction, the more accurate the results.

3 Reasons Every CX Team Needs to Track Customer Effort Scores

In a study to see how customer effort levels impact people’s perception of organizations, the global research firm Gartner gleaned three key insights that should be on every CX manager’s radar:

1. Customer effort predicts customer loyalty

94% of customers who have low-effort interactions intend to buy from a brand again, compared to just 4% of customers who have high-effort service interactions.

CX teams can use customer experience metrics like customer satisfaction (CSAT) scores and Net Promoter Score (NPS) to gauge customer loyalty and retention, but CES is usually a more reliable indicator of future purchase behavior — and there’s evidence to back it up. “Customer effort is 40% more accurate at predicting customer loyalty as opposed to customer satisfaction [CSAT],” says Andrew Schumacher, senior principal, Advisory, Gartner.

2. Low-effort interactions save money

Low-effort interactions are 37% less expensive than high-effort interactions, according to Gartner. That’s because low-effort customer experiences save considerable agent time: specifically, they eliminate up to 50% of escalations, 40% of customer call-backs, and 54% of channel switching (for example, when a customer tries self-service but ends up calling an agent instead).

3. Low-effort interactions improve agent retention

Agents build confidence when they consistently remove barriers and simplify customer interactions. As a result, agents’ intent to stick around increases by up to 17% when they deliver better customer experiences.

How to Create a Customer Effort Score Survey

Customer effort scores are derived from customer feedback surveys. There are two components of a CES survey: the question and the type of scale.

1. Choose your survey question

Customer effort score questions have to reflect the insights you want. For example, if you want to know how much effort customers exert when they return a product, you’d ask: “Was our return process easy for you to complete?”

Here are a few additional examples of CES survey questions CX teams might ask:

  • How easy was it to resolve your issue today?
  • Do you agree or disagree that [company] made it easy to handle your issue?
  • How much effort did it take to exchange your order?

Answers to CES survey questions can’t be a simple “yes” or “no,” so you need to choose a scale.

2. Choose a type of scale for your CES survey

There are two primary types of scales to use for CES surveys:

Likert Scale: This five-point scale lets customers express how much they agree or disagree with a statement. Here’s an example:

How much do you agree or disagree with the following statement: [Company] made it easy to resolve my issue today?

1 = strongly disagree
2 = disagree
3 = neutral
4 = agree
5 = strongly agree

Numerical Scale: Here, customers express their sentiment on a scale of 1-5, with 1-2 being negative and 4-5 being positive. Here’s an example:

How much effort did it take to resolve your issue today?

1 = too much effort
2 = decent amount of effort
3 = neutral
4 = minimal effort
5 = no effort

Note that for both options, you can also expand your scale to 1-10 instead of 1-5. Once you receive all of your responses, you’re ready to plug the numbers into an easy formula to get your CES.
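For teams tallying survey results programmatically, the Likert labels above map directly onto numeric scores. Here is a minimal Python sketch of that mapping; the exact label strings are illustrative, not a standard:

```python
# Map Likert answers onto the 1-5 numeric scale described above so
# responses can feed the CES calculation. Labels are assumptions.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

answers = ["agree", "strongly agree", "neutral"]  # sample responses
scores = [LIKERT[a] for a in answers]
print(scores)  # [4, 5, 3]
```

The same dictionary approach extends to a 1-10 scale — only the mapping changes, not the downstream math.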

How to Calculate Customer Effort Score

Calculating CES is quick and easy:

  1. Add up all CES survey scores.
  2. Divide the sum by the number of responses.

So, if 50 people responded to your 1-5 scale CES survey and the sum of their scores is 211, your equation looks like this:

211 ÷ 50 = 4.22

Your CES is 4.22 out of 5, which is great.
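In code, the two steps above reduce to a simple average. Here is a small Python sketch of the calculation, fed with a hypothetical response set that matches the worked example (50 responses summing to 211):

```python
def customer_effort_score(responses):
    """Average the survey scores to get the CES.

    `responses` holds one numeric score per customer, on whatever
    scale the survey used (1-5 or 1-10).
    """
    if not responses:
        raise ValueError("need at least one survey response")
    return sum(responses) / len(responses)

# 50 hypothetical responses that sum to 211, as in the example above
responses = [5] * 11 + [4] * 39
print(round(customer_effort_score(responses), 2))  # 4.22
```

Most spreadsheet tools and survey platforms compute this average for you, but the formula is worth knowing when comparing scores across scales.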

Customer Effort Score Benchmarks: What Is a Good CES?

Since CES questions and scales differ across industries and organizations, there isn’t a clear CES benchmark to aim for. That said, aim for your average CES to fall into the “positive” category. So, on a 1-5 scale, that means 4 or higher. Or if you use a 1-10 scale, 7 or higher.

No matter what your CES is, it doesn’t hurt to aim higher. Remember, better CES scores correlate with loyalty, agent retention, and money saved — there’s no such thing as a CES that’s “too good.”

How to Improve Your Customer Effort Score: 4 Tips

Reducing customer effort is a team sport — coaches, QA managers, and agents all have responsibilities. These four tips ensure every level of the CX team does their part.

1. Coach agents to focus on first call resolution (FCR)

The primary cause of excessive customer effort is having to reach out to support multiple times to resolve an issue, according to Harvard Business Review. This echoes Microsoft’s finding that the most important aspect of a customer service experience is fixing the issue in a single interaction — no matter how long it takes.

Customer service coaches can combat this by working with agents to improve their first call resolution (FCR) rates. That requires addressing the root causes of issues (not the symptoms) and preempting problems before they arise. Agents develop these skills by studying past interactions during coaching sessions to spot patterns.

2. Analyze CES in tandem with quality assurance (QA) scores

CES tells you how much effort a customer exerted during a support interaction, but it doesn’t tell you why. That’s where QA scores come in.

Let’s say several customers "disagree" with the following statement after working with the same agent: "The customer support agent made it easy to cancel my subscription."

Instead of simply hoping the agent does better next time, a manager or coach can look at the agent’s QA scorecard from that interaction to see what, exactly, made the experience difficult for the customer.

Maybe the agent didn’t adhere to the approved process or didn’t gather the correct information from the customer upfront. Whatever the case, analyzing CES alongside QA contextualizes sub-par scores, guiding future coaching sessions and improving future customer feedback.
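This pairing of low CES responses with QA scorecards can be sketched in a few lines of Python. Everything here (ticket IDs, agent names, data shapes) is hypothetical; a real QA platform would expose this data through its own exports or API:

```python
# Hypothetical data: CES survey results and QA review notes per ticket.
ces_surveys = [  # (ticket_id, agent, CES score on a 1-5 scale)
    ("T-101", "dana", 2),
    ("T-102", "dana", 5),
    ("T-103", "lee", 1),
]
qa_scorecards = {  # ticket_id -> rubric criteria the agent missed
    "T-101": ["did not confirm account details upfront"],
    "T-102": [],
    "T-103": ["skipped the approved cancellation process"],
}

LOW_CES = 2  # on a 1-5 scale, treat 1-2 as a high-effort interaction

# Attach the QA context to every low-CES interaction so a coach can
# see what made the experience difficult, not just that it was.
flagged = [
    (agent, ticket, score, qa_scorecards.get(ticket, []))
    for ticket, agent, score in ces_surveys
    if score <= LOW_CES
]
for agent, ticket, score, issues in flagged:
    print(f"{agent} / {ticket}: CES {score}; QA notes: {issues}")
```

The output gives a coach a per-agent starting point: not just which interactions scored poorly, but which rubric criteria to revisit in the next coaching session.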

3. Give customers options for issue resolution

“Why can’t I just talk to a real human?” We’ve all asked that question before — and it’s a valid one. Customers may exert excessive effort trying to navigate self-service options like online help centers, which can hurt CES. That’s why it’s important to offer multiple support channels.

For example, a well-built support hub might offer quick answers via a knowledge base, video tutorials, a community forum, webinars, and more — with 24/7 personalized support available for customers who need it.

4. Turn negative customer feedback into learning material

Poor customer experiences are often more instructive for CX teams than positive ones when it comes to CES.

“Many companies conduct postcall surveys to measure internal performance,” write Matthew Dixon and his co-authors in Harvard Business Review. “However, they may neglect to use the data they collect to learn from unhappy customers.”

Of course, the first priority is fixing the customer’s problem and reducing their effort, not peppering them with more questions. But it’s important to follow up after a high-effort service interaction to get their side of the story. Then use those insights to inform agent training and prevent the same mistakes from happening again.

The Myth of Going the Extra Mile

Conventional wisdom holds that customers stay loyal to brands that “go the extra mile” to wow them. But as Harvard Business Review points out, exceeding customer expectations during service interactions “makes customers only marginally more loyal than simply meeting their needs.”

Don’t overthink it: the quickest path to customer satisfaction is making interactions as easy as possible.

“Few things generate more goodwill and repeat business than being effortless to deal with,” says Matt Watkinson, author of The Ten Principles Behind Great Customer Experiences.
