My analysis of echo chambers as trust-manipulators is now available in two different versions! First, there was Escaping the Echo Chamber, the short version written for a general audience. And now, fresh off the presses, there’s Echo Chambers and Epistemic Bubbles, the long scholarly version written for philosophers and social scientists, full of citations and more careful versions of all the arguments.
In Escaping the Echo Chamber (published in Aeon Magazine), I claim that the whole discussion about this stuff has been confusing two very different social phenomena. An epistemic bubble is a structure that limits what you see. When all your friends on Facebook share your politics, and you don’t get exposed to the other side’s arguments, that’s just a bubble. An echo chamber, on the other hand, is a structure that manipulates trust. Members of echo chambers are taught to distrust everybody on the outside. An echo chamber functions more like a cult. It isolates its members, not by restricting their access to the world, but by alienating them from the outside world.
In epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined.
This is crucial, because you have to know the disease to pick the right cure. Epistemic bubbles can be broken by simple exposure. But echo chambers cannot; members of echo chambers have been prepared to resist exposure to evidence from the outside, and their trust in insiders has been radically inflated.
Crucially, this thing that people are calling “post-truth” – where people just ignore the outside evidence? Epistemic bubbles can’t explain that. Only echo chamber effects can explain it. And if that’s what’s actually going on, then the solution isn’t just to wave “the evidence” or “the facts” in an echo chamber member’s face. They’ve been given a basis for rejecting such outside evidence as corrupted, malignant. The only way to fix an echo chamber is by repairing the broken trust at its root.
In Echo Chambers and Epistemic Bubbles (published in Episteme), I offer extended versions of all of the above arguments. This is the scholarly director’s cut. The definitions are more carefully fleshed out (and, admittedly, much longer and uglier and less memorable). The arguments are laid out in more detail, with citations. There’s also an extended discussion of the social science literature, where I point out a lot of places where people have conflated these concepts. I target a number of recent papers which claim to have disproved the existence of echo chambers and epistemic bubbles, and point out that they’ve studied only exposure, and not distrust. Finally, there’s a much longer discussion of who’s responsible for the beliefs of echo chamber members. I take on Quassim Cassam’s story about epistemic vice and laziness in conspiracy theorists. His view is that, basically, all conspiracy theorists are just lazy and corrupt. I argue the opposite: the echo chambers story shows how a person could be blameless, because they were caught in a bad social network.
If you’re really interested in going all the way down the rabbit-hole, my analysis here is based on some earlier work. In Cognitive Islands and Runaway Echo Chambers, I analyze those domains where you need the help of experts, but you can only find experts by exercising your own abilities. This opens the door to a harmful sort of runaway bootstrapping, where people with bad beliefs use them to pick bad experts, and this only compounds their error. In Expertise and the Fragmentation of Intellectual Autonomy, I lay out the case for why we have to trust in experts, and why perfect intellectual autonomy is no longer possible, given the massive sprawl of scientific knowledge.