The sales pitch was elegant enough to fool serious people.
Newsrooms would use AI to reduce bias. Human inconsistency out, machine objectivity in. The copy practically wrote itself. No mood. No ego. No reporter bringing too much personal weather into the frame. Just cleaner summaries, faster output, and better informational hygiene for the modern reader.
That story was always a little too flattering.
What most outlets actually built was not neutrality. They built acceleration. They trained systems on their own archives, their own habits, their own blind spots, their own institutional cowardice, and then acted surprised when the machine started speaking in a smoother version of the house voice. The bias did not disappear. It got laundered into process.
That matters because machine bias lands differently. Human bias can at least be attached to a person, a byline, a history, a set of choices somebody might have to defend later. Machine bias arrives looking like procedure. It wears the face of inevitability. The sentence is flatter, faster, more hygienic, and somehow harder to argue with precisely because it is less alive.
That is where the con was hiding.
The AI-news turn was never primarily about epistemology. It was about cost, scale, liability, and managerial fantasy. Executives wanted content velocity without content payroll. Platforms wanted endless synthesis without endless wages. Institutions wanted the authority of the newsroom without quite so much newsroom inside it. "Reducing bias" turned out to be a very convenient phrase for all of that.
And of course the archives matter. If a system is trained on years of official-source deference, class bias, imperial phrasing, investor-friendly euphemism, or sanitized violence, then those instincts do not vanish just because the byline becomes computational. They harden. The machine learns which bodies receive full humanity, which deaths get abstractions, which crimes become "instability," which market harms become "corrections," which people get described as emotional while others inherit the voice of procedure.
That is why the whole thing becomes clearest around war, policing, and empire. When the stakes are low, bias can still hide inside tone. When the stakes are high, it starts shaping body counts and granting moral permission. A newsroom AI trained on old institutional habits does not transcend them in a crisis. It reproduces them at machine speed while everybody involved keeps using the language of efficiency.