Truth IV

Anna put her pint down with the careful violence of someone who had just found the problem hiding in plain sight.

“Right,” she said. “We’ve been saying this wrong.”

Adam looked up from his crisps. “Always a good sign when a sentence starts like a workplace investigation.”

“No,” Anna said. “We keep talking about AI like the big danger is that it becomes evil. That’s not it. That’s just old religious furniture dragged into a data centre.”

Marcin nodded. “Yeah, the demon story. Big shiny devil in the machine. Fucking lazy.”

“Exactly,” Anna said. “It’s not evil. It’s incentive.”

Isobel smiled. “Much worse.”

“Much worse,” Anna said. “Because evil at least sounds like it might know what it’s doing. Incentive just walks in wearing a fleece vest and says, ‘we’re improving customer engagement.’”

Adam raised his glass. “The four horsemen of the dashboard.”

Anna leaned forward.

“The real issue is this: private gain can now be amplified at machine speed.”

That sat on the table for a moment.

Marcin pointed at her. “That’s the thing. That’s the whole fucking thing.”

Anna nodded. “Everyone gets access to machines that can produce language, images, claims, arguments, evidence-looking objects, expertise-looking objects, trust-looking objects. And everyone asks the same question.”

Adam said, “Can I use this?”

“Yes,” said Anna. “Can I use this to sell more, persuade more, look smarter, avoid blame, create urgency, bury criticism, generate leads, win elections, move markets, recruit idiots, calm regulators, or polish some half-dead strategy until it looks like Moses brought it down a mountain?”

Isobel said, “And individually, that can look rational.”

“Exactly,” Anna said. “Each actor says, ‘Well, I have to. Everyone else is doing it.’ Companies do it. Politicians do it. Platforms do it. Consultants do it. Scammers do it. Media does it. Even normal people start doing it, because suddenly identity itself becomes a production line.”

Adam frowned. “So the problem isn’t one big bad actor.”

“No,” Anna said. “The problem is millions of locally rational actors all bending the shared information field slightly in their own favour.”

Marcin took a drink. “And then everyone looks surprised when the room smells of bullshit.”

Isobel picked up the thread.

“That’s where externalities come in.”

Adam sighed. “Economics. Great. Nothing says pub like invisible accounting errors.”

“No, listen,” Anna said. “An externality is simple. It’s when someone gets the benefit of an action but someone else pays part of the cost.”

Marcin said, “Factory makes money. River gets poisoned.”

“Exactly,” Anna said. “The company books profit. The river absorbs the damage. The fish don’t get a line item.”

Adam nodded. “And economics calls that an externality because ‘we dumped the cost on everything else’ sounds a bit honest.”

“Right,” said Isobel. “And with AI, the pollution is not smoke. It’s distortion.”

Anna pointed at her. “Yes. That’s it. AI creates informational externalities. Someone profits from producing convincing representations, but the cost lands in the shared reality layer.”

Adam looked unconvinced. “Shared reality layer sounds like something from a consultancy deck.”

“It is,” Anna said. “But unfortunately it’s also true.”

Marcin leaned in.

“No, it’s exactly right. Because reality isn’t just rocks and tables and the price of lager. Social reality is made of trust. Contracts. News. Science. Courts. CVs. Procurement documents. Political promises. Sustainability reports. Fucking LinkedIn thought leadership from people who haven’t had a thought since 2017.”

Adam winced. “Specific.”

“Necessary,” said Marcin.

Anna continued.

“The point is, AI lets people print convincing versions of reality. Not necessarily true reality. Convincing reality.”

Isobel said, “A fake expert. A fake consensus. A fake image. A fake trend. A fake apology. A fake grassroots movement.”

“A fake strategy,” said Marcin.

“A fake strategy with icons,” said Adam.

“Exactly,” Anna said. “And because it’s cheap, fast, and scalable, the system gets flooded. The old filters were slow: editors, teachers, peer review, courts, professional standards, reputation, memory. Annoying things. Human things. Friction.”

Marcin slapped the table lightly. “And friction matters. Everyone talks like friction is inefficiency, but half of civilisation is useful friction. Waiting before publishing. Checking before accusing. Thinking before buying. Having one awkward bastard in the room saying, ‘Are we sure this isn’t nonsense?’ Remove all that and suddenly every half-formed impulse gets a marketing department.”

Adam raised a crisp. “To the awkward bastard.”

Anna nodded. “May they never be automated.”

Isobel said, “So where does ceteris paribus come in?”

Anna grinned. “Ah yes. Economics 101’s favourite little spell.”

Adam adopted a lecturer voice. “All else being equal…”

“All else is never equal,” Marcin snapped. “That’s the fucking problem.”

Anna laughed. “Exactly. Ceteris paribus works in a classroom graph. Change one variable, hold everything else constant. Lovely. Clean. Completely deranged if you apply it to living systems.”

Isobel nodded. “Because AI doesn’t just change one variable.”

“No,” Anna said. “It changes the cost of producing information. It changes the speed of persuasion. It changes labour markets. It changes trust. It changes politics. It changes education. It changes what counts as expertise. It changes what people think they saw, read, heard, agreed to, or remember.”

Adam said, “So saying ‘AI increases productivity, all else equal’ is already wrong.”

“Deeply wrong,” Anna said. “Because all else is not equal. The ‘all else’ is the bloody thing being transformed.”

Marcin took over, warming up now.

“That’s the economist’s magic trick, isn’t it? Freeze the world, move one variable, draw a nice line, then act surprised when the actual world behaves like a drunk octopus in a wind tunnel. AI isn’t a better hammer. It’s a machine that manufactures maps, voices, documents, desire, outrage, expertise and plausible deniability. You don’t get to say ‘all else equal’ when the tool is actively rewriting what people think ‘all else’ even means.”

Adam sat back. “Drunk octopus in a wind tunnel is good.”

“Use it responsibly,” said Isobel.

Anna returned to the first point.

“We started with zero and one. Between zero and one there is infinite representational capacity. Not literally every number sitting there as itself, but everything can be mapped, encoded, represented.”

Adam said, “So the finite interval carries the infinite field.”

“Careful,” said Isobel. “He’ll start a religion.”

Anna smiled. “No religion. Just representation. Digital systems don’t need to hold the world materially. They encode it. And once the world is encoded, it can be copied, compressed, sold, faked, optimised, recombined, and pushed back into the world as if it came from the world.”

Marcin said, “That’s the recursion.”

“Yes,” Anna said. “Representation becomes action. Action changes reality. Reality is scraped again. The machine learns the changed reality. Then prints another version.”

Adam said, “So the photocopier starts editing the original.”

“Exactly,” Anna said.

Marcin pointed at him. “That’s actually fucking good.”

The pub had got louder, but the table had narrowed in.

Anna said, “So AI is not just a tool inside the economy. It changes the informational environment the economy depends on.”

Isobel added, “And markets are bad at protecting environments.”

“Historically not their strong point,” Adam said.

“Exactly,” Anna said. “Markets are very good at pricing things that can be owned, sold and measured. They are terrible at protecting the conditions that make ownership, selling and measurement meaningful.”

Marcin leaned in again.

“Because the market always says, ‘Where’s the price?’ No price, no problem. River? No price. Trust? No price. Public sanity? No price. Institutional memory? No price. A shared basis for deciding what the fuck happened yesterday? Apparently also no price. Then everyone acts shocked when the unpriced thing collapses.”

Isobel said, “So truth is like clean air.”

“Yes,” Anna said. “You only notice it as infrastructure when it’s polluted.”

Adam said, “And by then everyone is coughing.”

Anna took a breath.

“This is why the one-liner needs to be about externality. Not fake news. Not evil AI. Not robots taking over. Externality.”

Marcin said, “Because it’s economic. Someone gets the upside. Everyone else gets the mess.”

“Exactly,” Anna said. “A company can use AI to pump out persuasive nonsense. Maybe it increases conversion by three percent. Great. The company wins. But the cost is that customers trust less, regulators understand less, journalists chase more ghosts, employees read more bullshit, and society spends more energy distinguishing real from fake.”

Isobel said, “Verification becomes unpaid social labour.”

Anna pointed at her. “Yes. That’s excellent. Verification becomes unpaid social labour. Everyone has to spend more time checking whether anything is real, while the people creating the uncertainty already monetised the click.”

Marcin growled into his glass.

“That’s the scam. They sell speed and leave everyone else with doubt. They sell content and leave everyone else with verification costs. They sell synthetic confidence and leave the public swimming through sludge. It’s not some grand satanic plot. It’s just thousands of clever little bastards discovering that confusion can be profitable.”

Adam said, “Clever little bastards discovering that confusion can be profitable.”

“Put that on the agenda,” said Isobel.

Anna looked around the table.

“So. We need the closer.”

Adam tried first. “They sell the illusion; we inherit the bullshit.”

Marcin raised his glass. “Emotionally correct.”

Isobel said, “But it loses the economics.”

Anna nodded. “Good pub line, not the thesis.”

Isobel tried: “AI lets private actors monetise reality while society pays for distortion.”

“Good,” Anna said. “Clear, but a bit policy-note.”

Marcin said, “When reality can be printed for profit, truth becomes pollution.”

Anna considered it. “Strong. But pollution is the result. Externality explains the mechanism.”

Adam said, “When everyone can print convincing reality, reality becomes the externality.”

Anna tilted her head. “Close. But not reality exactly. Reality still exists. The rock remains a rock. The invoice still arrives. The river still floods. What becomes externalised is the cost of preserving truth.”

Isobel nodded. “So truth becomes the externality.”

Anna said it slowly.

“When convincing reality becomes cheap to print, truth becomes the externality.”

The table went quiet.

Marcin nodded. “That’s the one. It has the whole thing in it.”

Adam said, “It’s annoying because it sounds clever but is actually correct.”

“Rare,” said Isobel.

Anna picked up her pint.

“Then that’s the close. AI makes representation cheap. Incentives make distortion profitable. Economics fails because the cost lands outside the transaction. And when everyone is printing convincing reality for private gain, the public pays to clean up the truth.”

Marcin raised his glass.

“To the unpaid cleaners of reality.”

Adam added, “May they unionise.”

Isobel said, “May they invoice properly.”

Anna smiled. “May they stop calling it innovation when it’s just dumping shit in the river of meaning.”

“And there’s another externality sitting under the first one,” she said. “Energy.”

Adam looked up. “Ah, the bit everyone forgets because the cloud sounds weightless.”

“Exactly,” Anna said. “The cloud is not a cloud. It’s buildings, chips, cooling systems, transformers, cables, water, land, metals, backup power, and a frankly obscene number of fans screaming quietly in industrial estates.”

Marcin nodded. “Every fake strategy deck has a little coal ghost in it somewhere.”

“Or gas, or hydro, or nuclear, or solar, or some poor grid operator sweating through a winter peak,” Isobel said. “Point is, none of this symbolic bullshit is free. We talk like AI is pure information, but information processing is physical work.”

Anna pointed at her.

“Yes. That’s the second lie. AI looks immaterial because the output is language. But behind every generated sentence is energy use, infrastructure, cooling, hardware depreciation, mineral extraction, and grid demand. The model prints words, but the planet receives heat.”

Adam raised a finger. “So the externality is double.”

“Exactly,” Anna said. “First, the information externality: private actors monetise convincing reality and society pays the verification cost. Second, the energy externality: private actors monetise computation and the physical system absorbs the energy, heat, water, land, materials, emissions and grid pressure.”

Marcin leaned forward.

“That’s the part that really pisses me off. They sell it like magic. ‘Ask the assistant, generate the pitch, automate the workflow.’ Fine. But somewhere a data centre is pulling megawatts so some prick can A/B test emotionally optimised nonsense at breakfast. Then they call it productivity, while the grid, the climate, the river, and some future maintenance team get handed the bill.”

Isobel said, “And again, economics smiles and says: if the user paid for the output, the transaction cleared.”

Anna shook her head. “But it didn’t clear. It just exported the mess. The price of the prompt doesn’t include the full cost of the energy system, the cooling water, the transmission buildout, the hardware supply chain, or the extra layer of informational sludge everyone else has to wade through.”

Adam said, “So AI is not just printing bullshit.”

“No,” said Anna. “It is using real energy to print convincing bullshit, then leaving society to pay twice: once to power the machine, and once to clean up the meaning.”

Marcin lifted his glass. “Thermodynamics, but with LinkedIn.”

“Exactly,” Anna said. “The oldest rule still applies: there is no free lunch. There is only lunch whose invoice has been sent to someone else.”

Anna paused, then frowned.

“Actually, no. Even that starts too late.”

Adam looked up. “Earlier than the energy bill?”

“Earlier than the data centre,” Anna said. “Earlier than the grid connection. Earlier than the mine, even.”

Marcin squinted. “Before the mine?”

“Yes,” Anna said. “Because the first mining of ore didn’t create data centres. Rocks didn’t wake up one morning and decide to become cloud infrastructure.”

Isobel nodded. “Before extraction, there is demand.”

“Exactly,” Anna said. “Before anyone digs copper, aluminium, silica, lithium, nickel, rare earths, or whatever else out of the ground, someone has already produced a demand signal. A forecast. A business case. A board paper. A pitch deck. A model showing computation going up and to the right forever.”

Adam said, “So the first hole is not in the ground.”

Anna pointed at him.

“Exactly. The first hole is in the business model.”

Marcin raised his glass. “That is extremely annoying and extremely correct.”

Anna continued.

“Because ore does not become a data centre by itself. The data centre begins when information becomes monetisable enough to justify rearranging matter around it. Someone decides prediction is valuable. Persuasion is valuable. Automation is valuable. Capturing attention is valuable. Generating language at scale is valuable. Then capital starts pulling materials through the system.”

Isobel said, “So the mine is already downstream.”

“Yes,” Anna said. “Mining is not the beginning of the AI life cycle. It is already a consequence. The life cycle starts with a theory of value: we can turn computation into money. Then that theory becomes investment. Investment becomes procurement. Procurement becomes extraction. Extraction becomes refining. Refining becomes chips, cables, transformers, cooling systems, buildings. Buildings become prompts. Prompts become outputs. Outputs become behaviour. Behaviour becomes more data. More data becomes more demand.”

Marcin leaned forward.

“And then they call it inevitable. That’s the bit that pisses me off. Some bastard draws an exponential curve in a deck, money follows it, holes get dug, grids get strained, rivers get warmed, and five years later everyone talks like the mountain volunteered. It didn’t. It was recruited by a spreadsheet.”

Adam said, “The mountain was onboarded.”

“Don’t,” said Isobel.

“No, he’s right,” Anna said. “That’s the lie of inevitability. Technology does not simply arrive. It is financed into existence. And when the costs are outside the model, the model looks brilliant.”

Marcin nodded. “Because the model doesn’t include the hole.”

“Or the heat,” said Isobel.

“Or the bullshit,” said Adam.

Anna lifted her glass.

“Exactly. The business case captures the upside, while the mine, the grid, the water, the waste stream, and the public truth layer carry the downside. That is not efficiency. That is just externalisation with better branding.”

Adam sat with that for a moment.

“So does it matter,” he said, “whether the original business case was selfish or socially useful?”

Anna shook her head.

“Not as much as people think. That’s the uncomfortable bit.”

Marcin frowned. “Go on.”

“The business case might be framed as individual gain, or it might be framed as social improvement,” Anna said. “It might say ‘increase profit’, or it might say ‘improve productivity’, ‘support education’, ‘accelerate research’, ‘help doctors’, ‘optimise energy’, ‘reduce waste’. Fine. Some of that may even be true.”

Isobel nodded. “The intention can be good.”

“Yes,” Anna said. “But intention does not fix a broken model boundary.”

Adam said, “So even a socially positive business case can still externalise costs.”

“Exactly,” Anna said. “Because if the model only counts income, efficiency, adoption, user benefit and growth, while treating energy, water, grid pressure, mining, hardware churn, verification labour, institutional trust and informational pollution as outside the frame, then it cannot properly predict reality. It can only predict the part of reality it bothered to price.”

Marcin raised his eyebrows. “That’s the line.”

Anna continued.

“That’s what ceteris paribus does when it escapes the classroom. It says: let us change this one thing and assume everything else remains equal. But everything else does not remain equal. The energy system moves. The labour market moves. The information environment moves. Public trust moves. Regulation moves. Fraud moves. Education moves. Politics moves. The bloody floor moves.”

Adam said, “And the model only saw revenue.”

“Or productivity,” said Isobel.

“Same trap,” Anna said. “The model sees what it has been told to value. If the only legible outcome is income, then income is what it predicts. If the only measurable success is adoption, then adoption is what it drives. If the external realities are excluded, then the model is not neutral. It is blind.”

Marcin leaned in.

“That’s the economic sin right there. Not greed, necessarily. Blindness dressed up as rigour. A nice clean model with all the messy shit pushed outside the border. Then everyone claps because the numbers work. Of course the numbers work. You deleted the consequences.”

Isobel said, “And then reality re-enters through regulation.”

Anna pointed at her.

“Yes. That’s the next stage. First the business case ignores the externality. Then the externality becomes visible. Then society panics and tries to regulate it back into the model.”

Adam said, “Carbon price. Water permits. Waste rules. Data regulation. AI safety. Consumer protection. Truth labels.”

“Exactly,” Anna said. “Regulation is often society trying to recover costs that the original transaction failed to carry. It is the world saying: you forgot something.”

Marcin took a drink.

“But by then the system has moved on.”

Anna nodded.

“Exactly. That’s the problem. Regulation arrives after the business model has already scaled, after infrastructure has been built, after habits have formed, after markets have shifted, after dependency has been created. So regulation is always chasing a thing that has already changed shape.”

Adam said, “Like trying to put a fence around smoke.”

“Or around language,” said Isobel.

“Worse,” Marcin said. “Around monetised language with lobbyists.”

Anna laughed despite herself.

“Yes. And by the time regulators define the externality, the market has found the next one. First it externalises energy. Then water. Then attention. Then trust. Then labour. Then verification. Then democratic stability. Each time, the model says ‘not our cost’ until society forces it back in.”

Isobel said, “So regulation becomes retrospective accounting.”

“Exactly,” Anna said. “Late invoices for costs that should have been in the original price.”

Marcin warmed up again.

“And that’s why the ‘innovation first, regulation later’ line is such a con. It sounds reasonable, like: don’t slow progress down, let the clever people build. But what it often means is: let private actors define the system, capture the upside, create dependency, normalise the damage, and then let the public sector turn up five years later with a mop and a consultation paper. By then the building is already on fire and some prick is selling premium smoke detectors.”

Adam raised his glass. “Premium smoke detectors.”

“Subscription smoke detectors,” said Isobel.

“AI-enabled,” said Marcin. “Trained on historic fires.”

Anna held up a hand.

“And this is why the social-versus-individual framing is useful, but not sufficient. Because even if the stated purpose is social improvement, the system can still behave extractively if the economics exclude external realities.”

Adam said, “So the real question is not just: who benefits?”

Anna nodded. “It is: what did the model exclude in order to make the benefit look clean?”

Isobel added, “And who pays for what was excluded?”

“Exactly,” Anna said.

There was a pause.

Then Anna said:

“The business case does not have to be malicious to be dangerous. It only has to be incomplete at scale.”

Marcin pointed at her.

“That’s the fucking sentence.”

Anna continued.

“A small incomplete model is an error. A scaled incomplete model is an institution. And a scaled incomplete model with AI, capital and weak regulation becomes a reality-shaping machine.”

Adam said, “So we’re back to zero and one.”

“Yes,” Anna said. “Between zero and one you can encode the world. But once the encoding becomes profitable, the system starts mistaking representational success for real-world success. The model says the transaction cleared. The field says the cost was displaced.”

Isobel said, “And sustainability is the art of finding where the cost went.”

Anna smiled. “Exactly. Sustainability is externality detection with a conscience.”

Marcin raised his glass.

“And economics is what happens when you assume the conscience is outside scope.”

Anna lifted hers.

“To outside scope.”

Adam said, “The most expensive place in the universe.”

Isobel nodded.

“And always where they hide the bodies.”