SoftMachine


Daron Acemoglu et al. · 2026

paper

AI, Human Cognition and Knowledge Collapse

Acemoglu and colleagues build a formal model of what happens when AI systems train on the outputs of earlier AI systems, and when humans rely on those systems to do their thinking. The tail erodes structurally — not through any single failure but through the cumulative effect of recursive sampling and cognitive offloading. Variance contracts. The center holds. The edge disappears.
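The contraction mechanism can be illustrated with a toy simulation (a sketch, not the paper's formal model): repeatedly fit a Gaussian to samples drawn from the previous generation's fit. Because the maximum-likelihood variance estimate is biased low, each generation's spread contracts in expectation, and the tails vanish long before the center moves.

```python
import random
import statistics

def fit_and_resample(data, n):
    """Fit a Gaussian to `data` by MLE, then draw n fresh samples from the fit.

    Toy stand-in for 'model trains on the previous model's outputs';
    not the paper's model.
    """
    mu = statistics.fmean(data)
    # MLE variance divides by n (not n - 1), so it is biased low:
    # E[var_next] = (1 - 1/n) * var_now, a geometric contraction.
    var = sum((x - mu) ** 2 for x in data) / len(data)
    return [random.gauss(mu, var ** 0.5) for _ in range(n)]

random.seed(0)
generation = [random.gauss(0, 1) for _ in range(50)]  # generation 0: real data
initial_sd = statistics.pstdev(generation)

for _ in range(400):  # each pass: fit to synthetic data, sample again
    generation = fit_and_resample(generation, 50)

final_sd = statistics.pstdev(generation)
print(f"spread: {initial_sd:.3f} -> {final_sd:.3f}")  # variance collapses
```

The center (the mean) wanders only slightly, while the spread decays geometrically: the distribution narrows until the edge is gone, which is the structural erosion the paper formalizes.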

The paper is the strongest argument for SoftMachine’s existence. If the collapse is structural, the only counterforce is aggregation infrastructure that captures variance before it disappears.

Read the source →