Eighty million views and zero scars

Posted on: 19 February 2026

A five-thousand-word essay, written with the help of artificial intelligence, explaining how artificial intelligence will replace you, became the most viral piece of content last week. Eighty million views on X, television appearances on CBS, reprints in Fortune and dozens of other outlets. The author is Matt Shumer, twenty-six years old, co-founder of a startup that builds AI writing tools. The title is "Something Big Is Happening", and the comparison chosen to convey urgency is Covid: we are, according to Shumer, at the stage where the virus still seemed like a distant concern, three weeks before the world changed forever. Those who don't prepare now will be swept away. The message is clear, the panic is served, and the dish was devoured with a speed that tells us far more about human nature than about technology.

The issue is not whether artificial intelligence will change work. It will, it already is, and anyone who denies this is lying to themselves. The issue is that the type of prediction Shumer offers belongs to a very specific literary genre, with recognisable narrative rules and a historical accuracy rate, on timing and on method, that is close to zero. Not because the authors are stupid, but because they lack something that no intelligence, artificial or otherwise, can replace: the scars of previous cycles.

Shumer is twenty-six. He is evidently bright, has built a company, moves capital, understands the technology from the inside. But he has never watched a technological transition complete its arc. He did not live through the nineties, when the internet was supposed to kill every middleman within eighteen months, and instead took two decades to reorganise commerce in ways nobody had foreseen. He was not around when digital cinema was supposed to render every traditional technical skill obsolete, and instead generated an entirely new ecosystem where skills transformed but did not vanish. He did not witness the era when social media was meant to democratise information and instead produced power structures more concentrated than the ones it replaced.

Every technology cycle produces the same figure: the insider prophet, someone who works within the technology and sees its capabilities before everyone else, projects linearly from what they observe today onto the near future, and concludes that the world is about to change catastrophically and immediately. The mechanism is understandable. If you are a swimmer immersed in water, water seems to be everywhere. If you work with language models eight hours a day and watch improvements week after week, it is natural to assume the curve will continue at the same gradient and that every other sector will follow the same trajectory. The problem is that this linear projection collides systematically with the reality of adoption, which is never linear.

There is a case that anyone who lived through the home video transition knows well. In the mid-seventies, Sony launched Betamax: superior picture quality, better technology by every measurable parameter. Any analyst looking solely at technical specifications would have backed Sony. Instead VHS won, made by JVC: inferior quality, but cheaper, with tapes that recorded twice as long, and crucially, an open format that any manufacturer could adopt without restrictive licensing. The adult entertainment industry accelerated VHS adoption, because longer tapes suited its products better. No engineer at Sony had factored in that pornography would influence the domestic video format war. No linear projection of technical capability could have predicted that path.

The British technology sector knows this pattern intimately. The BBC Micro was technically accomplished and educationally superior, yet it was the cheaper, less refined Sinclair Spectrum that captured the home market and, almost by accident, produced a generation of self-taught programmers who went on to build much of the UK's games and software industry. The machine that won was not the best machine; it was the one whose price point met a moment of cultural readiness that no forecast had modelled. Sir Clive Sinclair, for all his brilliance, then projected linearly from that success into the C5 electric vehicle, one of the most celebrated commercial failures in British industrial history. The pattern repeats: genuine technological insight, zero understanding of how adoption actually works.

The mechanism is always the same. Technology is adopted through channels its creators do not anticipate, at speeds that do not correspond to theoretical capabilities, and with side effects nobody had modelled. Adoption is a social phenomenon, not an engineering one. It depends on regulators who do not move at the speed of code, on organisations that absorb change with the inertia of structures built to endure, on economic incentives that often reward caution over innovation, on human habits that have roots deeper than any software update.

The Covid comparison, which Shumer deploys as his rhetorical detonator, collapses precisely on this point. A virus is a biological agent: it does not need your consent to infect you, it does not require board approval, it does not wait for an IT department to upgrade infrastructure, it does not pause while unions negotiate the terms of change. Covid spread through physical contact and that was that. Technology adoption spreads through human decisions, each one subject to incentives, resistance, rules, fears and calculations of convenience that slow, divert and transform the trajectory in unpredictable ways. Comparing the two phenomena is evocative but structurally wrong.

There is a detail worth noting. Shumer has openly admitted the essay was written with the help of Claude, one of the very models he cites as evidence of the coming revolution. He told CNBC that, with hindsight, he would have rewritten certain sections had he anticipated the virality. In 2024, as reported by Gary Marcus, the New York University cognitive scientist, Shumer had already promoted an open-source model that failed to deliver its claimed performance, later apologising with "I got ahead of myself." The pattern is recognisable: genuine enthusiasm amplified by personal incentives. Shumer is the CEO of a company that builds AI tools; the urgency he communicates is sincere but it is not neutral. Those who sell hammers tend to see nails everywhere, and this does not mean the nails do not exist, it means the map of nails is less reliable than it appears.

The most interesting question is not whether Shumer is right or wrong. The question is why eighty million people felt the need to read an essay telling them the world is about to end. The answer lies in the psychological architecture of anticipatory panic: it is more comfortable to have a clear catastrophic narrative than to live with genuine uncertainty. If someone tells you "everything changes in three weeks", at least you have a horizon. Real uncertainty, the kind where change is happening but you do not know when, how, or with what effects, is far harder to bear. The apocalyptic prophet paradoxically offers a form of reassurance: the problem is enormous, but at least it is defined.

There is an aspect the debate around Shumer's essay almost entirely ignores: the incentive structure for those who make catastrophic predictions in the technology sector. If you announce the end of the world and you are right, you are a visionary. If you are wrong, nobody remembers, because the next prophecy has already arrived to capture attention. The asymmetry is perfect: the reward for a correct prediction is enormous, the cost of error is close to zero. This does not mean Shumer is acting in bad faith; it means the incentive system naturally produces an excess of apocalyptic prophecies, exactly as an unregulated market produces excess risk. It is not a question of individual character, it is system architecture.

The Free Press, which gathered opinions from several experts on the subject, framed the debate in the classic binary: are we on the edge of the precipice or not? Believers versus sceptics, optimists versus pessimists. But the question is poorly posed. The right question is not "is something big happening?", because the answer is obviously yes. The right question is: will the shape this change takes resemble what the prophets describe? And the historical answer, cycle after cycle, is no.

For those who must make real decisions, with real money and real people at stake, Shumer's essay is noise dressed as signal. The real signal is more sober and more useful: AI is improving rapidly; the most recent models are significantly more capable than those of a year ago; some professions will feel the impact before others; coding is the first domain affected because AI companies invest there massively for their own strategic reasons. All true. But the distance between "these tools are powerful and improving" and "fifty per cent of entry-level white-collar jobs will disappear within five years", as Anthropic CEO Dario Amodei has suggested and Shumer echoes, is the distance between empirical observation and prophecy, between data and narrative.

The wisest thing a decision maker can do right now is neither prepare for the apocalypse nor ignore the change. It is to study previous cycles, recognise that the pattern of technological prophecy is as old as technology itself, and focus on the only variable over which they have genuine control: their own capacity for adaptation. Not the kind declared in a strategic plan, but the kind tested empirically, day after day, in front of tools that keep changing. Those who navigated the transition from analogue to digital in the creative industries know that the skill which survives is not the specific technical one; it is the ability to see patterns through the chaos and to act while others are still busy deciding whether to be afraid.

Shumer is twenty-six years old with zero technological scars. He writes well, thinks fast, sells better. But the difference between understanding a technology and understanding how a technology moves through the real world is the difference between reading a map and having walked the territory. And the territory, as those who have walked it know, never looks like the map.