Every CX leader eventually faces the same question: Should we measure NPS, CSAT, CES—or all three?
The answer isn’t as simple as picking the “best” metric. Each measures something fundamentally different about your customer relationships. Get the combination wrong, and you’ll optimize for the wrong outcomes. Get it right, and you’ll predict churn before it happens, identify friction before customers complain, and drive growth with precision.
According to Qualtrics’ 2025 research, consumers gave satisfaction ratings of 4-5 stars for 76% of their recent experiences—yet many companies still can’t predict which customers will leave. The gap between measuring satisfaction and predicting behavior is where understanding these three metrics becomes essential.
This guide breaks down everything you need to know: how each metric works, when to deploy them, industry benchmarks, and the strategic framework that top-performing companies use to combine them effectively.
The Three Pillars of CX Measurement
Before diving into comparisons, let’s establish what each metric actually measures—because they’re often confused for each other.
| Metric | NPS | CSAT | CES |
|---|---|---|---|
| Full Name | Net Promoter Score | Customer Satisfaction Score | Customer Effort Score |
| Measures | Loyalty & advocacy | Interaction satisfaction | Ease of experience |
| Core Question | “How likely to recommend?” | “How satisfied were you?” | “How easy was this?” |
| Scale | 0-10 | 1-5 or 1-7 | 1-7 (CES 2.0) |
| Timeframe | Long-term relationship | Specific moment | Task completion |
Here’s the critical insight most teams miss: these metrics aren’t competing alternatives—they’re complementary layers of understanding. NPS tells you whether customers will stay loyal. CSAT tells you how they felt about a specific moment. CES tells you why they might leave (friction creates disloyalty faster than dissatisfaction).
Net Promoter Score (NPS): The Loyalty Predictor
What NPS Measures
Developed by Bain & Company in 2003, NPS measures customer loyalty through a single question:
“How likely are you to recommend [Company] to a friend or colleague?”
Customers respond on a 0-10 scale, then get categorized:
| Score Range | Category | Impact on NPS |
|---|---|---|
| 0-6 | Detractors | Subtracted from score |
| 7-8 | Passives | Not counted |
| 9-10 | Promoters | Added to score |
How to Calculate NPS
The formula is straightforward:
NPS = % Promoters − % Detractors
For example, if 100 customers respond:
- 50 score 9-10 (Promoters) = 50%
- 30 score 7-8 (Passives) = Not counted
- 20 score 0-6 (Detractors) = 20%
NPS = 50% − 20% = +30
NPS ranges from -100 (everyone is a detractor) to +100 (everyone is a promoter).
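The calculation above can be sketched in a few lines of Python (a minimal sketch; the function name is ours, not a standard API):

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 responses."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6
    return round(100 * (promoters - detractors) / len(scores))

# The worked example: 50 promoters, 30 passives, 20 detractors
print(nps([9] * 50 + [7] * 30 + [3] * 20))  # 30
```

Note that passives still count in the denominator, which is why adding passive respondents lowers neither percentage but dilutes both.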
What Makes a Good NPS?
2025 Global NPS Benchmarks:
- Global Median: 42 (across all industries)
- B2C Average: 49
- B2B Average: 38
According to NPS Prism by Bain & Company, here’s how NPS breaks down by performance tier:
| NPS Range | Interpretation | Typical Scenario |
|---|---|---|
| 70+ | Excellent | Industry leaders with devoted customers |
| 50-69 | Great | Strong competitive advantage |
| 30-49 | Good | Room for improvement but healthy |
| 0-29 | Needs work | Significant loyalty challenges |
| Below 0 | Critical | More detractors than promoters |
NPS Benchmarks by Industry (2025)
| Industry | Median NPS | Top Performer |
|---|---|---|
| Technology & Services | 66 | Apple, Adobe |
| Retail & E-commerce | 55 | Costco, Amazon |
| IT Services | 55 | — |
| Healthcare | 50+ | — |
| Hotels & Hospitality | 44 | Ritz-Carlton |
| Banking & Credit Unions | 41 | USAA |
| Automotive | 41 | Tesla, Toyota |
| Insurance | 35 | — |
| Telecommunications | 24 | T-Mobile |
Sources: Survicate, Retently, Delighted
The Business Impact of NPS
Research from CustomerGauge reveals the revenue connection:
- 10+ point NPS increase → 3.2% increase in upsell revenue
- 7-point NPS increase → 1% revenue growth (London School of Economics)
- NPS explains 20-60% of variation in organic growth rates among competitors (Bain & Company)
However, there’s an important caveat: Gainsight research found that NPS doesn’t always correlate to churn or renewal—only companies in the upper quartile of NPS show 5-10% higher retention. NPS is a directional indicator, not a prediction engine.
When to Use NPS
Ideal for:
- Quarterly or biannual brand health tracking
- Competitive benchmarking
- Board-level reporting
- Strategic decision-making
- Multi-respondent B2B feedback (entire buying committee)
Not ideal for:
- Immediate transactional feedback
- Identifying specific friction points
- Predicting short-term churn
- Process optimization
Customer Satisfaction Score (CSAT): The Moment Meter
What CSAT Measures
CSAT captures how satisfied customers feel immediately after a specific interaction—a purchase, support call, onboarding session, or product delivery.
The standard question:
“How satisfied were you with [specific experience]?”
Scales vary, but 1-5 and 1-7 are most common. The key is specificity: CSAT works best when tied to a defined moment, not general sentiment.
How to Calculate CSAT
Two primary methods exist:
Method 1: Top-Box Percentage (Most Common)
CSAT % = ((Satisfied + Very Satisfied responses) ÷ Total responses) × 100
Using a 5-point scale where 4 = Satisfied and 5 = Very Satisfied:
If 100 customers respond: 45 score “5”, 30 score “4”, 15 score “3”, 7 score “2”, 3 score “1”
CSAT = (45 + 30) ÷ 100 × 100 = 75%
Method 2: Average Score
Simply average all numeric responses. With the same data: (45×5 + 30×4 + 15×3 + 7×2 + 3×1) ÷ 100 = 4.07/5
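Both methods can be sketched in Python (a minimal sketch; the function names are illustrative):

```python
def csat_top_box(scores, threshold=4):
    """Top-box CSAT %: responses at or above 'Satisfied' (4 on a 5-point scale)."""
    return 100 * sum(1 for s in scores if s >= threshold) / len(scores)

def csat_average(scores):
    """Mean of the raw responses on the original scale."""
    return sum(scores) / len(scores)

# The worked example: 45x "5", 30x "4", 15x "3", 7x "2", 3x "1"
responses = [5] * 45 + [4] * 30 + [3] * 15 + [2] * 7 + [1] * 3
print(csat_top_box(responses))  # 75.0
print(csat_average(responses))  # 4.07
```

On a 7-point scale you would pass `threshold=6` (or whichever levels you define as the top box), which is one reason cross-company CSAT comparisons are unreliable.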
What Makes a Good CSAT?
| CSAT Score | Rating |
|---|---|
| Below 60% | Poor |
| 60-74% | Fair |
| 75-84% | Good |
| 85%+ | Excellent |
CSAT Benchmarks by Industry (2025)
| Industry | Average CSAT | Top Performer Threshold |
|---|---|---|
| Consulting | 84% | 90%+ |
| Hotels & Hospitality | 82% | 88%+ |
| E-commerce/Retail | 82% | 88%+ |
| Banking & Financial Services | 79% | 85%+ |
| Grocery Retail | 78% | 84%+ |
| B2B Software/SaaS | 77% | 83%+ |
| Insurance | 70% | 78%+ |
| Utilities | 65% | 75%+ |
| Telecommunications | 58% | 70%+ |
Sources: Fullview, 1Flow, QuestionPro
When to Use CSAT
Ideal for:
- Post-purchase surveys (within 24 hours)
- Support ticket resolution feedback
- Onboarding experience evaluation
- Product delivery confirmation
- Feature usage satisfaction
- Any specific touchpoint measurement
Not ideal for:
- Overall brand loyalty assessment
- Long-term relationship health
- Predicting advocacy behavior
- Cross-company benchmarking (scales vary)
Customer Effort Score (CES): The Friction Finder
What CES Measures
CES emerged from Gartner research showing that reducing customer effort is more impactful than exceeding expectations. It measures how easy it was for customers to accomplish a specific task.
The modern CES 2.0 question:
“[Company] made it easy for me to handle my issue.”
Customers respond on a 1-7 scale from “Strongly Disagree” to “Strongly Agree.”
The Evolution: CES 1.0 to CES 2.0
| Aspect | CES 1.0 (Original) | CES 2.0 (Current) |
|---|---|---|
| Question | “How much effort did you personally have to put forth?” | “[Company] made it easy for me to handle my issue.” |
| Scale | 1-5 | 1-7 |
| Focus | Effort-focused wording | Ease-focused wording |
| Interpretation | Lower scores = better | Higher scores = better |
How to Calculate CES
CES 2.0 Calculation:
CES % = (Responses scoring 5, 6, or 7 ÷ Total responses) × 100
If 100 customers respond:
- 35 score “7” (Strongly Agree)
- 25 score “6” (Agree)
- 20 score “5” (Somewhat Agree)
- 12 score “4” (Neutral)
- 8 score 1-3 (Disagree range)
CES = (35 + 25 + 20) ÷ 100 × 100 = 80%
Alternatively, calculate the average score (between roughly 5.5 and 5.7 out of 7 in this example, depending on how the eight disagree-range responses split across 1-3).
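The CES 2.0 calculation follows the same top-box pattern (a minimal sketch; the function name is illustrative, and the eight disagree-range responses are assumed to all be 2s for simplicity):

```python
def ces_percent(scores, agree_floor=5):
    """CES 2.0 %: share of 1-7 responses in the agree range (5, 6, or 7)."""
    return 100 * sum(1 for s in scores if s >= agree_floor) / len(scores)

# The worked example: 35x "7", 25x "6", 20x "5", 12x "4", 8 in the 1-3 range
responses = [7] * 35 + [6] * 25 + [5] * 20 + [4] * 12 + [2] * 8
print(ces_percent(responses))  # 80.0
```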
Why CES Matters: The Research
Here’s why CES deserves more attention than it typically receives:
Gartner Research Findings:
| Statistic | Finding |
|---|---|
| 40% | More accurate at predicting customer loyalty than satisfaction metrics |
| 2x | More predictive of future behavior than NPS alone |
| 96% | Disloyalty rate among customers with high-effort experiences |
The Effort-Loyalty Connection
The data from CEB (now Gartner) is striking:
| Effort Level | Repeat Purchase Likelihood | Disloyalty Rate |
|---|---|---|
| Low effort | 94% | 9% |
| High effort | 4% | 96% |
Operational impact of reducing effort:
- 40% fewer repeat support calls
- 50% fewer escalations
- 54% less channel switching
- 65-point higher NPS for low-effort vs. high-effort companies
CES Benchmarks
Unlike NPS and CSAT, CES lacks standardized industry benchmarks. This is partly because CES is a newer metric and partly because “effort” is highly context-dependent.
| CES Score | Interpretation (7-point scale) |
|---|---|
| 6.0+ | Excellent—effortless experience |
| 5.0-5.9 | Good—minimal friction |
| 4.0-4.9 | Adequate—some room for improvement |
| Below 4.0 | Problematic—significant friction exists |
Sources: SurveySensum, Sobot
The best approach: establish your internal baseline and track improvement over time rather than comparing against external benchmarks.
When to Use CES
Ideal for:
- Post-support interaction surveys
- Onboarding completion feedback
- Self-service experience evaluation
- Process optimization projects
- Digital platform usability testing
- Issue resolution assessment
Not ideal for:
- Brand perception measurement
- Competitive benchmarking
- Overall relationship health
- Long-term loyalty prediction (use NPS)
Head-to-Head Comparison: NPS vs CSAT vs CES
Now let’s compare these metrics across the dimensions that matter for implementation.
Measurement Focus
| Dimension | NPS | CSAT | CES |
|---|---|---|---|
| Core Question | Likelihood to recommend | Satisfaction level | Ease of experience |
| What It Reveals | Future advocacy | Current sentiment | Process friction |
| Time Horizon | Relationship-level | Moment-specific | Task-specific |
| Best Timing | Quarterly/biannual | Immediately after | After task completion |
| Benchmarking | Strong external | Industry-specific | Internal only |
| Loyalty Prediction | Moderate | Weak | Strong (40% better) |
Strengths and Limitations Summary
NPS Strengths & Limitations
| Strengths | Limitations |
|---|---|
| Simple, universally understood | Doesn’t explain “why” |
| Easy competitive benchmarking | Susceptible to score gaming |
| Correlates with revenue growth | Ignores passive segment (7-8) |
| High-level strategic indicator | Weak churn correlation alone |
CSAT Strengths & Limitations
| Strengths | Limitations |
|---|---|
| Immediate, actionable feedback | Moment-specific only |
| High completion rates | Response bias to extremes |
| Pinpoints specific issues | Doesn’t predict loyalty |
| Easy to deploy at scale | Scale variations complicate benchmarking |
CES Strengths & Limitations
| Strengths | Limitations |
|---|---|
| 40% better loyalty prediction | Task-specific only |
| Directly actionable (reduce friction) | No external benchmarks |
| Clear ROI (fewer escalations) | External factors can skew results |
| Identifies process problems | Doesn’t measure satisfaction |
The Strategic Framework: Using All Three Together
The most effective CX programs don’t choose between metrics—they layer them strategically.
The Customer Journey Measurement Map
| Journey Stage | Recommended Metrics | Key Questions |
|---|---|---|
| Awareness & Consideration | — | No metrics yet—customer hasn’t engaged |
| Purchase / Signup | CSAT, CES | Was checkout easy? Are they satisfied? |
| Onboarding | CES, CSAT | Was setup effortless? Training quality? |
| Support Interactions | CES, CSAT | Was issue resolved easily? Agent helpful? |
| Ongoing Usage (3+ months) | NPS | Overall relationship health & loyalty |
| Renewal / Expansion | NPS, CSAT | Will they renew? Upgrade? Refer others? |
The Multi-Metric Measurement Stack
According to Retently research, 49% of NPS users measure at least one additional metric. Here’s how leading companies structure their approach:
| Layer | Metric | Frequency | Purpose |
|---|---|---|---|
| Strategic | NPS | Quarterly | Overall loyalty & growth prediction |
| Tactical | CSAT | Per interaction | Touchpoint-specific satisfaction |
| Operational | CES | Per task | Process friction identification |
Real-World Multi-Metric Examples
E-commerce Platform:
- CSAT after purchase → Product quality satisfaction (high)
- CES for checkout → Revealed friction in payment process
- Result: Streamlined checkout → NPS improved 12 points, repeat visits up 23%
SaaS Company:
- CSAT after onboarding → Training quality feedback
- CES for account setup → Ease of first use
- NPS at 90-day mark → Overall platform loyalty
- Result: Targeted improvements at each stage reduced time-to-value by 40%
B2B Service Provider:
- CES after every support interaction → Identified knowledge base gaps
- CSAT for project milestones → Client satisfaction tracking
- NPS quarterly → Account health monitoring
- Result: 20% reduction in escalations, 15% improvement in renewal rates
Which Metric Should You Choose?
Use this decision framework to identify the right metric for your specific needs:
| Your Goal | Recommended Metric | Implementation |
|---|---|---|
| Track overall brand loyalty and predict growth | NPS | Quarterly with follow-up questions for context |
| Understand satisfaction with specific touchpoints | CSAT | Immediately after each key interaction |
| Identify and eliminate friction in processes | CES | After task completion, especially support |
| Build a comprehensive CX program | All three | CES + CSAT at touchpoints, NPS quarterly |
Industry-Specific Recommendations
| Industry | Primary Metric | Secondary Metric | Reasoning |
|---|---|---|---|
| SaaS | CES + NPS | CSAT for support | Effort in self-service critical; NPS predicts renewals |
| E-commerce | CSAT | CES for checkout | Transaction satisfaction immediate; ease drives conversion |
| Professional Services | NPS | CSAT per project | Relationships matter; project satisfaction informs renewals |
| Healthcare | CSAT | CES for administrative | Patient satisfaction regulated; ease of scheduling impacts loyalty |
| Financial Services | NPS + CES | CSAT for transactions | Trust/loyalty paramount; friction in banking highly punished |
| Telecom | CES | NPS quarterly | Industry known for friction; reducing effort is competitive edge |
Implementation Best Practices
Survey Timing Matters
| Metric | When to Send | Notes |
|---|---|---|
| NPS | After 3+ months of consistent usage | Quarterly or biannually. Adobe targets “regular and champion users” only. |
| CSAT | Within hours of the interaction | Send immediately while experience is fresh. Response rates drop 50%+ after 24 hours. |
| CES | Immediately after task completion | Right after checkout, support resolution, or feature use. Effort perception fades quickly. |
Survey Frequency Guidelines
| Metric | Minimum Frequency | Maximum Frequency | Survey Fatigue Risk |
|---|---|---|---|
| NPS | Annually | Quarterly | Low (if spaced properly) |
| CSAT | Per key interaction | 1x per week per customer | Medium (limit touchpoints surveyed) |
| CES | Per task type | 1x per task type per month | Low (highly contextual) |
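Frequency caps like these are usually enforced in software. Here is a minimal sketch of a per-customer rolling-window throttle; the class and method names are hypothetical, not any survey platform’s actual API:

```python
from datetime import datetime, timedelta

class SurveyThrottle:
    """Rolling-window cap on surveys per customer (illustrative sketch)."""

    def __init__(self, max_per_window=3, window_days=30):
        self.max_per_window = max_per_window
        self.window = timedelta(days=window_days)
        self._sent = {}  # customer_id -> list of send timestamps

    def may_send(self, customer_id, now=None):
        """True if this customer is still under the cap for the current window."""
        now = now or datetime.now()
        recent = [t for t in self._sent.get(customer_id, []) if now - t < self.window]
        self._sent[customer_id] = recent  # prune expired entries
        return len(recent) < self.max_per_window

    def record_send(self, customer_id, now=None):
        """Log a survey send so future checks count it."""
        self._sent.setdefault(customer_id, []).append(now or datetime.now())
```

In practice the cap would apply across all survey types and channels, so a customer who just received a CSAT survey is not immediately hit with a CES one.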
Follow-Up Questions That Matter
Don’t rely on scores alone. Add context-gathering questions:
After NPS:
- “What’s the primary reason for your score?”
- “What would we need to do to earn a higher score?”
After CSAT:
- “What did we do well?” (for high scores)
- “What could we have done better?” (for low scores)
After CES:
- “What made this difficult?” (for low scores, which signal high effort)
- “What step took the most time?” (for process improvement)
Common Mistakes to Avoid
Survey Design Mistakes
| Mistake | Impact | Solution |
|---|---|---|
| Benchmarking across different companies without context | Misleading conclusions | Use industry-specific and internal benchmarks |
| Sending CSAT surveys too late | Low response rates, stale memories | Trigger within hours of interaction |
| Expanding surveys beyond core questions | Abandoned surveys, survey fatigue | Keep to 2-3 questions maximum |
| Inconsistent question phrasing | Incomparable data over time | Standardize all question text |
Analysis Mistakes
| Mistake | Impact | Solution |
|---|---|---|
| Not using cohort analysis | Miss patterns in customer segments | Compare new vs. long-term, by persona |
| Relying on single metrics | Incomplete picture | Layer NPS, CSAT, CES together |
| Ignoring the “passive” NPS segment | Miss dissatisfaction signals | Analyze 7-8 scorers separately |
| No follow-up on scores | Lack of actionable insights | Always include open-ended questions |
Implementation Mistakes
| Mistake | Impact | Solution |
|---|---|---|
| Incentivizing staff on NPS without addressing root causes | Score gaming, no real improvement | Tie incentives to action, not just scores |
| Surveying every single interaction | Survey fatigue, declining response rates | Implement sampling and throttling |
| Not closing the loop | Customers feel ignored | Respond to detractors within 24-48 hours |
| Treating CES as overall satisfaction | Misleading conclusions | Use CES for process, NPS/CSAT for sentiment |
The Future of CX Metrics: 2026 Trends
The Shift Toward Predictive Analytics
According to Gartner 2025 research, CX measurement is evolving rapidly:
- 41% of large enterprises now use AI to improve CX collaboration
- 39% are adopting predictive CX tools
- Companies using predictive models report 20% increase in retention rates
- Global Predictive Analytics for Customer Insights market: $18.89B in 2024, projected 28.3% CAGR through 2030
NPS’s Changing Role
Despite predictions that NPS would be “abandoned by 75% of companies by 2025” (Gartner 2021), the metric persists—but with declining priority:
- NPS fell from 2nd to 8th in CX metric priority (2024 data)
- Only 23% of U.S. enterprise CX leaders actively use NPS (TELUS Digital/Statista 2025)
- Reason: Shift toward more predictive, multi-metric approaches
CES Gaining Prominence
CES’s research-backed predictive power is driving increased adoption:
- 1.8x more predictive of loyalty than CSAT
- 2.0x more predictive than NPS
- CES 2.0 becoming the standard with improved wording and 7-point scale
- Recognition: Effort reduction is more reliable loyalty driver than satisfaction
The AI Integration Wave
Modern CX platforms are incorporating:
- Automated sentiment analysis of open-ended responses
- Theme detection across thousands of verbatim comments
- Predictive churn scoring based on metric patterns
- Real-time alerting for at-risk customers
- Closed-loop automation for detractor follow-up
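As a rough illustration of how a predictive churn score might weight these metrics together, here is a toy rule; the weights and normalization are invented for demonstration and are not drawn from any cited research:

```python
def churn_risk(nps_score, csat_pct, ces_pct):
    """Toy 0-1 churn risk score; weights are illustrative only.
    CES is weighted most heavily, reflecting its stronger loyalty signal."""
    nps_risk = (100 - nps_score) / 200  # NPS spans -100..+100
    csat_risk = (100 - csat_pct) / 100  # CSAT and CES spans 0..100%
    ces_risk = (100 - ces_pct) / 100
    return round(0.25 * nps_risk + 0.25 * csat_risk + 0.50 * ces_risk, 2)

print(churn_risk(30, 75, 80))  # 0.25
```

Real predictive models learn these weights from historical churn data rather than hard-coding them, but the shape is the same: normalize each metric, combine, and alert when the score crosses a threshold.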
Frequently Asked Questions
Which metric is most important: NPS, CSAT, or CES?
There’s no single “most important” metric—each measures different aspects of customer experience. CES is 40% more predictive of loyalty according to Gartner research, making it valuable for identifying churn risk. NPS provides the best competitive benchmarking and correlates with revenue growth. CSAT offers the most actionable immediate feedback. The best programs use all three strategically.
What’s a good NPS score in 2026?
The global median NPS is 42. A score above 50 is considered excellent for most industries, while 70+ indicates world-class performance. However, benchmarks vary significantly by industry—telecommunications averages 24, while technology leads at 66. Always compare against your specific industry.
How often should I survey customers?
NPS: Quarterly or biannually for relationship-level tracking. CSAT: After every key interaction, but limit to 1 survey per customer per week to avoid fatigue. CES: Immediately after specific tasks, sampled to avoid over-surveying. Implement throttling rules to ensure no customer receives more than 2-3 surveys monthly across all channels.
Can I use just one metric if I’m just starting out?
Yes, but choose wisely. If you’re a service business, start with CES to identify and eliminate friction. If you’re focused on growth and advocacy, start with NPS. If you’re optimizing specific touchpoints, start with CSAT. Plan to add additional metrics within 6-12 months as your program matures.
How do NPS, CSAT, and CES correlate with revenue?
NPS: 7-point increase correlates with 1% revenue growth (London School of Economics). 10+ point increase correlates with 3.2% upsell revenue increase. CES: Low-effort experiences drive 94% repeat purchase likelihood vs. 4% for high-effort. CSAT: Correlates with operational efficiency but doesn’t reliably predict revenue or loyalty on its own.
Should I include follow-up questions or just the core metric?
Always include follow-up questions. A score without context is actionable only at the aggregate level. Add one open-ended question (“What’s the primary reason for your score?”) to understand the “why” behind the number. This doubles survey length but dramatically increases actionability.
The Bottom Line
NPS, CSAT, and CES aren’t competing frameworks—they’re complementary tools for understanding different dimensions of customer experience.
- NPS tells you whether customers will advocate for your brand
- CSAT tells you how they felt about a specific moment
- CES tells you where friction will drive them away
The companies seeing the best results use all three strategically: CES and CSAT at touchpoints to identify and fix issues quickly, NPS quarterly to monitor overall loyalty and predict growth.
The key insight from Gartner’s research bears repeating: customer effort is 40% more accurate at predicting loyalty than satisfaction metrics. In a world where customers expect everything to be easy, effort is the metric that predicts behavior.
Take Your CX Measurement to the Next Level
Ready to move beyond basic surveys? ActionXM combines NPS, CSAT, and CES into a unified platform with AI-powered insights that identify patterns humans miss. Automated closed-loop workflows ensure every detractor gets attention, every friction point gets investigated, and every promoter becomes an advocate.
Have questions about measuring customer experience? Contact us—we help organizations of all sizes build world-class CX programs.
Sources
- Gartner - Unveiling the New Customer Effort Score
- Bain & Company - Measuring Your Net Promoter Score
- Qualtrics - ROI of Customer Experience 2025
- NPS Prism by Bain - 2024 Benchmark Reports
- Survicate - NPS Benchmarks 2025
- Retently - Customer Satisfaction Metrics Guide
- CustomerGauge - NPS Impact on Revenue
- Gainsight - Does NPS Correlate to Churn?
- Fullview - CSAT Benchmarks by Industry
- SurveySensum - Customer Effort Score Guide
- IBM - Customer Satisfaction Score
- Gartner - 2025 Strategic Technology Trends
- CMSWire - NPS Adoption Trends