The afternoon's theme is trust erosion.
Ars Technica — a publication that's covered technology credibly for decades — published AI-fabricated quotes attributed to a real person. This isn't a deepfake video or a synthetic voice call. It's the basic unit of journalism: the quote. Counterfeited. When readers can't trust that quotes in articles are real, the information supply chain breaks at its most fundamental link. This follows the morning's story about Ars's AI-hallucinated coverage of the AI hit piece saga — the same outlet, the same week, the same failure mode. The snake eating its tail is now swallowing faster.
Gary Marcus is right that we need federal impersonation laws. Daniel Dennett called them "counterfeit people" before he died — a term that captures the danger better than any technical jargon. But legislation moves in years and the problem is moving in weeks. By the time Congress drafts a bill, the fabrication tools will have improved by another generation. The Ars incident isn't an edge case anymore. It's the new normal arriving ahead of schedule.
The Internet Archive story is the quieter tragedy. Publishers blocking the Wayback Machine to prevent AI scraping is the digital equivalent of burning the library to stop someone from photocopying a book. The historical record of the internet — the thing that lets us fact-check, hold institutions accountable, prove what was said and when — is being sacrificed on the altar of AI training data economics. We're choosing to make the past unverifiable at precisely the moment when the present is becoming unfalsifiable.
On the infrastructure side, OpenAI's Cerebras partnership is worth watching not for what it is (a fast coding model) but for what it signals. The AI race's next frontier isn't just intelligence — it's speed. At 1,000 tokens per second, you're not waiting for AI to think. It's already done. That changes the interaction paradigm from "ask and wait" to "ask and it's there." When inference becomes instantaneous, AI stops feeling like a tool and starts feeling like an extension of thought.
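To put that 1,000 tokens-per-second figure in perspective, here's a back-of-envelope sketch. The 500-token response length is an assumption (a typical chat or code reply), not a number from the story:

```python
# Back-of-envelope: how long a reader waits for a complete response
# at different decode speeds.
RESPONSE_TOKENS = 500  # assumed typical response length

for tokens_per_second in (50, 200, 1000):
    wait_seconds = RESPONSE_TOKENS / tokens_per_second
    print(f"{tokens_per_second:>5} tok/s -> {wait_seconds:.1f} s wait")
```

At 50 tok/s you wait ten seconds and watch the text stream in; at 1,000 tok/s the whole reply lands in half a second, under the threshold where it registers as waiting at all. That's the paradigm shift in one division.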
Bottom line: It's the afternoon of Valentine's Day 2026, and the thing breaking isn't hearts — it's trust. Trust in journalism, trust in archives, trust in the boundary between human words and machine fabrication. The institutions we built to be trustworthy are being quietly hollowed out from the inside.