Why do AI models feel like they're regressing?

Simple: mass adoption of wrapper tools (OpenClaw etc.) = a flood of low-value requests soaking up frontier compute.

Estimate: 80-90% of frontier-model tokens are wasted on (back-of-envelope after the list):

• Re-steering

• Debugging

• Arguing with the model

• Tech support

• Guardrail gymnastics
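Back-of-envelope on that estimate (midpoint assumed for illustration): at 85% overhead, only ~15% of tokens do useful work, so each unit of real output burns ~1/0.15 ≈ 6.7x the compute it should. That multiplier is the whole shrinkflation mechanism.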

Models improved massively YoY, but serving infrastructure is choking on the load. Providers throttle → perceived performance drops.

This isn't the models getting dumber. It's digital shrinkflation at scale.

If you're serious about AI alpha, stop using bloated wrappers. Go direct to the APIs or self-host where possible. Minimal sketch below.
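To make "go direct" concrete, here's a minimal sketch of calling an OpenAI-compatible chat endpoint yourself, with no wrapper in the loop. The base URL, env var names, and model name are placeholder assumptions for illustration, not recommendations; point them at whatever provider or self-hosted server you actually use.

```python
# Minimal sketch: talk to an OpenAI-compatible /v1/chat/completions
# endpoint directly. BASE_URL, LLM_API_KEY, and MODEL are placeholders
# (assumptions for illustration), not specific recommendations.
import os

import requests

BASE_URL = os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1")
API_KEY = os.environ["LLM_API_KEY"]   # your provider (or local server) key
MODEL = "gpt-4o-mini"                 # placeholder model name


def chat(prompt: str) -> str:
    """Send one user message, return the assistant's reply text."""
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.2,
        },
        timeout=60,
    )
    resp.raise_for_status()  # surface throttling/5xx instead of hiding it
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("One sentence: why call the API directly?"))
```

Self-hosting works the same way: vLLM and llama.cpp's server both expose an OpenAI-compatible /v1 endpoint, so pointing LLM_BASE_URL at your own box is the only change.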