Survey Respondents Give Awful Estimates, part n of N

Two recent blog posts/stories caught my eye as they pick up on a theme that fascinates me, but about which I have very little expertise: hilariously, and perhaps depressingly, bad estimates of social phenomena by survey respondents. The first example is from a recent SocImages post: Americans Way (Like, Way) Overestimate the Number of Gays and Lesbians. The title of the post gives away the main finding pretty well: while the best estimates of GLB-identified individuals are about 3.5%, “more than a third of Americans believe that more than one out of every four people identifies as gay or lesbian” and the median answer was in the 20-25% range. Even broadening out to the notion of attraction or behavior, and not just self-identification, survey estimates put the number closer to 10% or 11% (though these blog posts don’t report the broadest possible number, which would include all individuals answering yes to at least one of attraction, behavior, or identity). Interestingly, these estimates are significantly up from 2008 (when only a quarter responded that more than 25% of Americans are gay).

The second comes from the Inequalities blog, and concerns fraud in the welfare system in Britain. According to the survey, the average British person “believes that 30 in 100 disability claimants and 35 in 100 unemployment claimants are falsely claiming.” The authors go on to say that this figure is a tremendous overestimate:

Even if we put together fraud AND customer error, the latest figures show that 3.3% of unemployment claims are ‘false claims’, and a mere 1.1-1.2% of disability benefit claims. (I’ve talked about fraud figures previously on the blog here). So for people on average to think that 30-40% of claims are false is a massive, massive over-estimate – far more than could be explained by the difficulty in getting good estimates of fraud rates.

Both of these findings remind me of an old, and depressing, set of survey results about American opinion about foreign aid. Foreign aid is debated a lot in the public sphere, and it’s a constant target of conservative claims about government waste and the need to cut back. Survey results find that Americans want to cut foreign aid, but also that they massively overstate its size in the budget. Here’s a PBS story from 2010:

The survey… asked the question: “What percentage of the federal budget goes to foreign aid?”

The median answer was roughly 25 percent, according to the poll of 848 Americans. In reality, about 1 percent of the budget is allotted to foreign aid.

More broadly, another recent poll shows that Americans have just an awful understanding of how much money is spent on many different parts of the budget. For example, the median estimate was that public broadcasting accounts for 5% of the federal budget, rather than the 0.1% it actually does.

I should say right here that I wouldn’t have known the exact answer to most of these questions if you had asked me before I looked at the data. The federal budget is massive and complicated. I understand its broad trends – social security, medicaid/medicare, defense, and interest on the debt are the biggies, most other social assistance or public good provision programs are tiny – but if you asked me right now for an estimate of how much the federal government spends on highways, for example, I would have only a wild guess. So, given that someone who would bother to follow polling data on federal budget priorities doesn’t have the outline of the budget memorized, what does that imply for our expectations of the average survey respondent?

Another way of asking this question is to step back and ask, what kind of knowledge do these questions give us? Most respondents are never asked to make decisions explicitly about the federal budget. I have no control over how much Congress spends on foreign aid or PBS. Mostly, I have a vote and some limited capacity to donate to candidates or write letters or whatnot. To do that, how informed do I need to be? This is a sort of rational inattention argument, and it makes some sense for the budget data – although obviously, it suggests that manipulation is relatively easy, as people’s priors are weak and so if the Republican party spends years saying that foreign aid is creating our crushing tax burden, people don’t have a knee-jerk “That’s mathematically impossible!” reaction as they perhaps ought to.

But that argument doesn’t hold up, I think, for the sexuality question. The federal budget is a freakishly complicated process that allocates an almost unimaginable amount of money, from a personal standpoint – trillions and trillions of dollars. But our estimates of the percentage of gays and lesbians are presumably informed by our relatively large sampling of known individuals – we all know thousands of people – among other things. It makes me wonder, to start with, where people think all these gays and lesbians are! Do midwesterners think that New York and San Francisco are 90% GLB or something?
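To put a rough number on that intuition, here is a minimal simulation sketch (the 3.5% figure is from the post; the acquaintance-network size of 500 is my own illustrative assumption). It asks: if the true GLB proportion is about 3.5%, how often would someone’s personal sample of acquaintances happen to be 20% or more GLB?

```python
import random

random.seed(42)

TRUE_RATE = 0.035      # rough GLB self-identification estimate cited in the post
ACQUAINTANCES = 500    # assumed size of a personal network (illustrative)
TRIALS = 10_000

# Count simulated "respondents" whose own acquaintance sample would
# justify an estimate of 20% or more.
high_estimates = 0
for _ in range(TRIALS):
    glb = sum(random.random() < TRUE_RATE for _ in range(ACQUAINTANCES))
    if glb / ACQUAINTANCES >= 0.20:
        high_estimates += 1

print(high_estimates / TRIALS)  # essentially zero
```

With a true rate of 3.5% and 500 acquaintances, the expected count is about 17.5 with a standard deviation of roughly 4; a sample of 100 or more (20%) is dozens of standard deviations away, so sampling error alone can’t produce the survey answers.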

I don’t have good answers, but I am hoping some of you do. How should we interpret these sorts of trend/proportion knowledge questions in surveys? If we frame these questions as “public understanding of social facts” or “public understanding of social science” (to riff on the equally poorly-acronym’d “public understanding of science”), what kind of meaningful information do they give us? These reports are usually framed in an enlightenment social problems vein: if only we actually knew the real numbers, we wouldn’t care so much about (fraud, foreign aid, etc.). But that’s deeply unsatisfying. What’s a better way?



  1. Good stuff Dan. And given that people are really wrong and over-confident when they answer questions whose actual answers it is quite possible to learn, how do they answer more personal and subjective questions? I’d like to see more experiments where the questions are asked differently, with incentives for correctness, etc.

  2. Hey Dan,

    You know this is one of my big peeves. I agree that the usual reaction to these situations is unsatisfying — especially when it regards only the issue at hand. I think a lot of people throw their hands up, think “everyone is so dumb about this issue!”, and that’s the end of the story.

    If we think systemically (as you have done by connecting the dots), then that highlights that people aren’t just missing one thing, or two things, or three things. They’re missing a lot of things. In fact, there is a widespread misunderstanding of social facts when it comes to our best known basic demographics. Furthermore, there is a fundamental misunderstanding about these facts. I think a lot of people buy the “damn lies” theory of stats and believe that all numbers are false. Rarely do they realize that the worst statistic might be off by a relatively large percent (say 25%), but that this number is still *far off* what they believe (consider the gay/lesbian proportion). Practically, I tend toward the enlightenment view (as you put it), and feel this is very depressing. I really do believe that if people knew these facts they would lead far better lives. As old Sam Johnson once said: “Little would be wanting to the happiness of life, if every man could conform to the right as soon as he was shown it.”

    BUT, I have two following thoughts that go beyond the normal hand-waving: 1) WE STILL don’t know what’s good for us, despite the profusion of knowledge about how we might improve society. What this tells me is that we already have enough evidence to conclude that knowledge alone is not enough for society to be better. This should lay to rest any opinion that we are going to solve society’s problems through the advancement of science. In fact, I think that this should lay to rest the opinion that we will ever solve society’s problems at all. 2) We DO need more research on why people don’t want to take up this carefully crafted empirical knowledge about human society. Why don’t people know the real proportion of gays and lesbians, and more importantly, why don’t they care to find out? Which people have the largest misestimates across the board (i.e. how many social facts do the least informed get wrong)? There may be many interesting findings. Ultimately, I think any answer will essentially boil down to a fundamental lack of curiosity. All mediating explanations aside, if people were more curious, they would seek out the best knowledge on any subject. Obviously they don’t.
