None Escalated Correctly – AI customer service systems systematically fail to escalate complex issues to humans, creating frustration and eroding trust.

Key Data Points:
- 39% of AI customer service bots were pulled back or reworked due to errors in 2024
- Customer complaints about AI service rose 56.3% year-over-year in China
- Resolution rates vary from 17% for billing issues to 58% for returns
- 75% of customers feel chatbots struggle with complex issues
- 85% of consumers believe their issues require human assistance
- Global trust in AI dropped from 62% in 2019 to 54% in 2024
Last week I tested a subscription billing discrepancy. Same issue. Five different AI chatbots. Different platforms. All claimed they could help.
Not one correctly escalated to a human.
This isn’t just my frustration. It’s a $47 billion problem that’s getting worse, not better. Companies invested that much in AI customer service in the first half of 2025 alone. 89% of it delivered minimal returns.
The numbers tell a story the marketing brochures don’t. Customer complaints about AI service jumped 56.3% year-over-year. 75% of users say chatbots struggle with complex issues. When dealing with billing problems specifically, AI resolution rates drop to just 17%.
Here’s what actually happened in my tests. Each bot confidently stated it understood my issue. Each provided generic troubleshooting steps. Each failed to recognize when the problem required human judgment. The escalation triggers that vendors promise? They didn’t fire.
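To make the failure concrete: an escalation trigger is usually just a rule layer sitting on top of the bot's intent classifier. Here's a minimal sketch in Python of the kind of gate vendors describe. The thresholds, intent categories, and field names are my own assumptions for illustration, not any vendor's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical escalation gate: names, categories, and thresholds here
# are illustrative assumptions, not any vendor's real implementation.

HIGH_RISK_INTENTS = {"billing_dispute", "refund", "legal", "account_security"}
CONFIDENCE_FLOOR = 0.75   # below this, the bot shouldn't pretend it understands
MAX_FAILED_TURNS = 2      # repeated non-resolution should force a handoff

@dataclass
class Turn:
    intent: str        # classifier's predicted intent for this turn
    confidence: float  # classifier's confidence in that prediction
    resolved: bool     # did this turn actually resolve anything?

def should_escalate(history: list[Turn]) -> bool:
    """Return True when the conversation should be handed to a human."""
    if not history:
        return False
    latest = history[-1]
    # 1. Low classifier confidence: the bot doesn't know what this is.
    if latest.confidence < CONFIDENCE_FLOOR:
        return True
    # 2. High-risk intent: human judgment required regardless of confidence.
    if latest.intent in HIGH_RISK_INTENTS:
        return True
    # 3. Looping: several turns without resolution means the bot is stuck.
    failed = sum(1 for t in history if not t.resolved)
    return failed > MAX_FAILED_TURNS

# My billing test should have tripped rules 2 and 3 almost immediately:
turns = [
    Turn("billing_dispute", 0.91, resolved=False),
    Turn("billing_dispute", 0.88, resolved=False),
    Turn("billing_dispute", 0.85, resolved=False),
]
print(should_escalate(turns))  # True
```

Any one of those three rules would have caught my billing test. Five bots and zero handoffs suggests the rules either don't exist or are tuned never to fire.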
Air Canada learned this the expensive way. Its chatbot invented a bereavement fare policy that didn't exist. A customer relied on it. The company argued the bot was a separate legal entity responsible for its own statements. The B.C. Civil Resolution Tribunal disagreed. Air Canada had to honor the fabricated policy.
AI hallucinates between 3% and 27% of the time. That’s not an edge case. It’s a design limitation. Companies are deploying these systems anyway, then burying the “speak to human” option three menus deep.
Trust is collapsing. Global confidence in AI customer service dropped from 62% in 2019 to 54% in 2024. In the US, it fell even harder, from 50% to 35%. Users aren’t stupid. They know when they’re being deflected.
The enterprise deployment stats are worse. Only 5% of enterprise-grade generative AI systems actually make it to production. Between 70% and 85% of projects fail outright. Gartner expects 40% of current agentic AI projects to be scrapped by 2027.
The companies getting this right treat AI as assistance, not replacement. They use it to handle routine questions while keeping humans accessible. They measure escalation accuracy, not just deflection rates. They train bots to recognize complexity, not pretend to solve everything.
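Measuring the right thing isn't complicated; the hard part is choosing to look. Here's a toy Python example of why deflection rate flatters a bot that escalation recall exposes. The conversation labels and schema are invented for illustration.

```python
# Illustrative metric comparison, assuming conversations have been labeled
# by human reviewers. The schema and data are hypothetical.

conversations = [
    # (bot_escalated, should_have_escalated, resolved_by_bot)
    (False, True,  False),  # complex billing issue the bot held onto
    (True,  True,  False),  # correct handoff
    (False, False, True),   # routine question, correctly handled
    (False, True,  False),  # another missed escalation
    (False, False, True),   # routine question, correctly handled
]

# Deflection rate: share of conversations never handed to a human.
deflected = sum(1 for esc, _, _ in conversations if not esc)
deflection_rate = deflected / len(conversations)

# Escalation recall: of conversations that needed a human, how many
# actually got one?
needed = [c for c in conversations if c[1]]
caught = sum(1 for esc, _, _ in needed if esc)
escalation_recall = caught / len(needed)

print(f"Deflection rate:   {deflection_rate:.0%}")   # 80% -- looks great
print(f"Escalation recall: {escalation_recall:.0%}") # 33% -- the real story
```

A bot that never escalates scores a perfect deflection rate and zero escalation recall. Behavior follows whichever number goes on the dashboard.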
Most companies aren’t doing that. They’re using AI as a barrier. Response times slow down. Contact options get hidden. Bots loop through the same unhelpful suggestions. By the time you reach a human, you’re already angry.
This isn’t inevitable. The technology can work when companies prioritize customer outcomes over cost cutting. But right now, 85% of consumers believe their issues require human assistance. They’re probably right.
Key Sources & Citations
Investment & Failure Rates
- CMSWire: Organizations invested $47B in AI initiatives in H1 2025; 89% delivered minimal returns
- ASAPP/MIT: Only 5% of enterprise-grade generative AI systems reach production
- Gartner: 40% of agentic AI projects will be scrapped by 2027
- Various sources: 70-85% of AI projects fail
Customer Complaints & Trust
- China Daily: 6,969 complaints about AI customer service in 2024, up 56.3% year-over-year
- Sobot: Global trust in AI dropped from 62% (2019) to 54% (2024)
- Salesforce: Customer trust fell from 58% to 42% (2023-2024)
- Plivo: 75% of customers feel chatbots struggle with complex issues
- Plivo: 85% of consumers believe issues require human assistance
Resolution Rates & Bot Performance
- Plivo: Resolution rates vary from 17% for billing to 58% for returns/cancellations
- Fullview: 39% of AI customer service bots were pulled back/reworked due to errors in 2024
AI Hallucinations
- CMSWire: AI chatbots hallucinate 3-27% of the time
- EdStellar: 77% of businesses worry about AI hallucinations
Air Canada Case
- Moffatt v. Air Canada, 2024 BCCRT 149
- Multiple legal analysis sources: McCarthy, American Bar Association, Lexology

