What Is Wrong with Epistemic Trespassing?
Epistemic Trespassing and the Division of Cognitive Labour
When an epidemiologist appears on a news programme and declares that lockdowns are worth any economic cost, or a psychologist asserts that social media is the primary cause of rising mental health problems amongst young people, we may have what philosophers call "epistemic trespassing"—experts speaking outside their domains of expertise. It is natural to think of epistemic trespassing as a moral failing, or at least as revealing a character flaw. The epistemic trespasser is arrogant, or just likes the publicity. This may often be the case. It takes a certain kind of person, some might say an arrogant person, to make confident pronouncements like this. But I want to suggest a different way of thinking about the problem with epistemic trespassing. We can view it as a symptom of a malfunctioning division of cognitive labour—and as a sign that there is a deep problem with the role of expertise in complex democratic societies.
I’m trying to avoid the academic’s tendency to hedge and qualify in this post. But I want to make one thing clear from the outset. While some people use the idea in annoying ways, epistemic trespassing is clearly A Real Thing. First, while people may abuse the status it bestows, expertise is real: some people genuinely have knowledge and skills that others lack. Second, one way in which experts abuse the status is precisely by leveraging it to give unearned weight to their pronouncements on matters outside their area of expertise. But that’s what epistemic trespassing is. The question I want to answer in this post is what is wrong with it.
The Division of Cognitive Labour
While he didn’t invent the idea of solving a complex problem by dividing up tasks, the philosopher of science Philip Kitcher is credited with developing an important account of the division of cognitive labour. His basic insight is that a modern democratic society such as the US or UK is organised so that people take on roles that match their competences and skills. Meteorologists study weather patterns so we don’t need to rely on simple intuition or experience to predict the weather. Epidemiologists study disease transmission so we have a way of predicting the spread of a new disease. These experts use their specialised knowledge to solve the problems that we, as a society, need solving.
Obviously, this is an idealisation. Kitcher wasn’t blind to the problems with our actual divisions of cognitive labour or our actual societies. His point was rather that, to the extent that complex societies manage to solve the problems they face, they typically do so as a result of a functioning division of cognitive labour. When the conditions are right, we get the benefits of specialisation (deep expertise in particular domains) combined with the benefits of coordination (different areas of expertise contributing to our overall knowledge and to solving many of the problems we face). But whether we get these benefits depends on whether the division of labour is working as it should.
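An aside for the formally minded: Kitcher’s original treatment (in his 1990 paper “The Division of Cognitive Labor”) is a mathematical model of how a scientific community should distribute researchers across rival approaches to a problem. The sketch below is my own toy illustration, not Kitcher’s actual model; the success curves and all the numbers are invented for the example. It conveys the flavour of his result: a community can do better by splitting its labour, even sending some researchers down a less promising path, than by having everyone pile onto the single most promising approach.

```python
# Toy illustration of a Kitcher-style division of cognitive labour.
# Assumptions (mine, purely for illustration): two rival approaches, A and B,
# each with a concave, saturating probability of success as headcount grows,
# and a community of 100 researchers that wants to maximise the chance that
# at least one approach cracks the problem.

def p_success(n, ceiling, rate):
    """Chance an approach succeeds with n workers: rises with n, saturates at `ceiling`."""
    return ceiling * (1 - (1 - rate) ** n)

def community_success(n_a, n_b):
    """Chance that at least one of the two approaches succeeds."""
    p_a = p_success(n_a, ceiling=0.9, rate=0.05)  # the promising approach
    p_b = p_success(n_b, ceiling=0.5, rate=0.05)  # the long shot
    return 1 - (1 - p_a) * (1 - p_b)

N = 100
best_a = max(range(N + 1), key=lambda n_a: community_success(n_a, N - n_a))
print(f"best split: {best_a} on A, {N - best_a} on B, "
      f"p(solved) = {community_success(best_a, N - best_a):.3f}")
print(f"everyone on A: p(solved) = {community_success(N, 0):.3f}")
```

With these made-up numbers, the optimal community keeps a substantial minority working on the long shot and outperforms the everyone-on-A allocation, which gestures at Kitcher’s point: how a community distributes its cognitive labour matters over and above how good its individual members are.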
Why Epistemic Trespassing Is Inevitable
Any division of cognitive labour will break down if people don’t stick to their assigned roles. If meteorologists start trying to be epidemiologists, the whole thing will come crashing down. But blatant examples of epistemic trespassing—a physicist moonlighting as a philosopher, a psychologist moonlighting as an international relations analyst—are usually fairly easy to spot. Precisely because they are easy to spot, they are less insidious than subtler cases of epistemic trespassing.
One kind of complex case occurs when an expert in some area combines their expertise in that area with a set of values that yield a recommendation about what we should do. This is what our epidemiologist is doing. They have genuine expertise in the distribution, patterns, and causes of diseases and health conditions in a population. What they probably don’t have is expertise in how to weigh the health costs of letting a virus spread widely through a population against the economic costs of shutting society down. When they confidently proclaim that the scientific case for lockdowns is clear, they pretend to a kind of expertise that they most likely lack.
Another kind of complex case occurs when an expert in one area can make a credible case that their expertise extends into another, related area. A psychologist having views about the causes of mental health problems in teenagers is not that surprising. Some psychologists may well hold views on this that reflect genuine expertise in several different but related areas. But it is not hard to call to mind cases where someone really does seem to be “reaching”. It’s not that what they have to say about mental health has no value. It’s just that we shouldn’t view their pronouncements as invested with the sort of authority that expertise typically bestows. They offer a conversation starter, an idea to consider, not something that should be taken as remotely authoritative.
But remember: I don’t want to psychologise the epistemic trespasser. I want to understand the problems with our systems for dividing cognitive labour that lead to epistemic trespassing. Let me highlight two sets of problems.
The first set of problems concerns the venues in which experts communicate with the public. Most of these venues operate with quite severe time and format constraints. With the exception of the long-form podcast, interviews with experts, whether online, on TV or in print, tend to be brief and to the point. There is little time or space for the expert to qualify their assertions or their authority to make those assertions. There is often little incentive for the expert to make any qualifications at all. Complex trade-offs and uncertainty don't make for compelling viewing or listening, even when they're the most accurate representation of the current state of knowledge.
The second set concerns the nexus between experts, policy makers and politicians. One of the features of a technocratic society such as ours is that elected officials typically present themselves as deferring to “expert opinion” on complex matters, such as whether to impose a national lockdown in the face of a new virus. Our politicians often find this convenient: they can outsource difficult decisions to scientists rather than taking responsibility for their own choices. For someone faced with the decision whether to order a nationwide lockdown, it's politically expedient to defer to a scientist who has a model showing that only a lockdown will avoid millions of deaths. If the decision proves right, the politician can claim credit; if it proves wrong, they can say they were simply "following the science."
This environment creates some problematic political dynamics. One of them is what Jeffrey Friedman calls the "spiral of conviction"—a system that systematically selects for experts willing to exceed their competence. Media, politicians and policymakers often don't want careful, qualified advice that acknowledges uncertainty; they want definitive recommendations that can justify their decisions and provide political cover. This means that experts who are willing to speak beyond their expertise—who can provide the clear, confident answers that politicians need or the public want—are more likely to be consulted, quoted, and elevated to positions of influence. An epidemiologist who says "the data on school closures is mixed and involves complex trade-offs" is less useful to a politician than one who says "schools must close to save lives" (or "there is no reason at all to close schools"). Meanwhile, experts who carefully qualify their statements or acknowledge the limits of their knowledge often find themselves marginalised in policy debates, or in public discussions more generally.
These features of our media and political environment can put experts in difficult positions, where the roles they're asked to play don't match well with the division of cognitive labour as ideally conceived. Of course, some experts might decide to “play the system”, exacerbating all of the problems I have discussed. But to my mind the more important point is that these problems exist irrespective of the motivations of individual experts. This is why, in my view, the problem of epistemic trespassing is inevitable in a society like ours. It doesn’t exist because our expert class are particularly power hungry and venal. It exists because we have a system that incentivises and rewards epistemic trespassing.
How Politicisation Makes Things Worse
I’ve already explained why I think epistemic trespassing is inevitable. There’s another political dynamic that makes things even worse. Many have commented on the increasingly politicised nature of scientific and other “knowledge” institutions. I have written about this dynamic before, and the point I want to make about it is quite subtle. The main reason why politicisation is often a bad thing is that it produces a particularly destructive dynamic. Once something, like an institution or a view on some scientific issue, becomes politicised, it is very hard to depoliticise it. If you don’t like the political valence it has acquired, your only options are to surrender or to fight a political battle against it. If you don’t mind the political valence but would rather it didn’t attach to the institution or view, it is hard for your defences of the view not to be interpreted by your opponents as moves in the political battle. (Presumably some people wanted it to be politicised in the first place, so they’ll be happy.)
We can apply this to politicised science. Universities and other knowledge-producing institutions have become sites of political battles and, once that happens, it is very hard to defuse the situation. Backing down means letting the "other side" win. There are, of course, those who want to take a more high-minded approach, the equivalent of unilateral nuclear disarmament. But the campaign for nuclear disarmament has not been very successful. Just as the unilateral disarmer leaves themselves vulnerable to those who have not disarmed, the researcher or institution that refuses to engage in political battles may find themselves sidelined or irrelevant in increasingly partisan debates.
The upshot is that, in a politicised environment, expertise becomes a resource to be captured rather than something to be respected. Different political factions seek out experts who align with their preferred positions. Experts may feel pressure to align their public statements with their own political commitments, or with the policy preferences of powerful individuals or groups. This creates a dynamic in which the boundary between expertise and advocacy blurs, simply because people are responding to the incentives in ways that are broadly instrumentally rational.
The pandemic illustrates many of these points. In many (though not all) countries public health experts found themselves embroiled in heated political battles about individual freedom, masks, lockdowns, school closures and vaccinations. Once they became embroiled in these battles, it was very hard for any individual to avoid their statements about the pandemic becoming more fodder for the political fight. A careful, qualified statement about mask effectiveness or the weak evidence base for school closures might be seen as giving ammunition to those who wanted to remove mask mandates or re-open all the schools. A more definitive statement might be taken as justifying those who wanted to keep—or even tighten—existing restrictions. This makes it difficult to engage in public health communication in a way that is not strategic and political. If what you say will be interpreted as a move in a “political game” regardless of what you do, why not play the game yourself?
A Way Forward?
The politicisation of expertise is a fundamental challenge to the division of cognitive labour that Kitcher envisions. Indeed, to the extent that Kitcher gives us a good model for how things are meant to work, it is a fundamental challenge to democratic governance. When the division of cognitive labour breaks down—when experts are systematically incentivised to speak beyond their competence—it undermines public trust in expertise and makes the dream of science-informed public policy even harder to achieve. What to do about this?
I’m not big on solutions to complex social and political problems. The style of analysis I favour tends towards explaining why the most important problems are hard to solve, which makes it difficult to really believe in any of the solutions that are typically proffered. To my mind, the most serious problem is the point about political responsibility. I don’t know when it started, or to what extent things were always this way, but our political leaders simply refuse to take responsibility for the decisions that they make. One way in which they do this is by outsourcing these decisions to the experts. But this means that experts are forced into a role that they should not be playing in a democratic society. Unless we can find a way to get our political leaders to take more responsibility for their actions, I suspect we are, to put it frankly, fucked.
Comments
What to do about this? You point out that solving the problem is hard, but I would argue that it is framed as "impossible" because the axioms that define the system are in conflict. The interplay of expertise, politicians, and mass media, at least in Western societies, rules out the possibility of the division of labor Kitcher proposes; your descriptive evidence of this phenomenon demonstrates it well. If we want his division to work, I would look at whether other systems manage to apply it with fewer contradictions.
I think looking at Eastern cultures suggests the problem is rooted in these system-defining beliefs. In South Korea, experts routinely front-loaded uncertainty (e.g. "this information could change within days") while retaining majority public trust during COVID (https://pmc.ncbi.nlm.nih.gov/articles/PMC7952821/). Not perfect! But indicative that people have varying tolerances for uncertain experts.
Social identity theory seems to suggest that when we face major crises, group solidarities intensify, and that this should have measurable effects on a society's behavior. A crisis also strips away a lot of complication: because so many people are uniformly focused on a single threat, there are fewer confounding variables to consider.
This leaves me thinking of two possible directions for a solution:
- Attempt to inject some Eastern collectivist behaviors into our culture (I'm not sure which ones might be causal or most compatible with Kitcher; it would take deeper analysis)
- Lean into the spiral of conviction and experience a shock so severe that we realize our society cannot continue behaving this way, then rebuild from the rubble (kind of like your "I suspect we are fucked" point).
I found all of this illuminating, except that I’m confused about the mental health example. If not the psychologist, who is actually the expert on mental health?