Whether Black people’s political knowledge is accurately reported in survey research depends on the race and skin tone of the interviewer conducting the assessment, according to a new study co-authored by a Georgia State University researcher.
The findings raise questions about how to ensure that political science research accurately captures the knowledge and attitudes of diverse survey respondents.
“For decades now, we’ve been underestimating the political knowledge of non-white respondents,” said Judd Thornton, associate professor in Georgia State’s Department of Political Science.
Thornton and co-author Adam Enders of the University of Louisville published their findings in the Journal of Race, Ethnicity, and Politics.
Past research offers evidence that Black respondents answer fewer political questions correctly when asked by white interviewers than by Black interviewers. However, this discrepancy in performance is often attributed to respondents’ behavior without much consideration of interviewers’ contributions.
Looking to examine interviewers’ role in this effect, the authors analyzed data from the 2012 American National Election Study (ANES). The nonpartisan research group serves social scientists, teachers, students, policy makers and journalists by producing data drawn from surveys it conducts on voting, public opinion and political participation, according to its website.
During pre- and post-election surveys, interviewers administered an objective knowledge assessment in the form of political questions (such as “Who is the U.S. Attorney General?”) and subjectively rated respondents’ overall political knowledge on a five-point scale. Interviewers also reported their own racial identity and skin tone on a 10-point scale, as well as the respondent’s race and skin tone.
In their analyses, the authors first looked at differences in Black respondents’ knowledge ratings based on interviewer race. In theory, trained interviewers’ assessments of respondent knowledge should be highly aligned with performance on the objective assessment. As a respondent gets more answers correct, the interviewer should rate the respondent as being more knowledgeable.
However, the authors found that interviewers’ assessments of respondents’ political knowledge were more closely correlated with interviewer race than with the factual accuracy of respondents’ answers. Overall, Black respondents were rated as more politically knowledgeable by Black interviewers than by white interviewers, even when controlling for objective performance.
“Not only is racial bias systematic,” the researchers wrote, “but it is seemingly capable of overriding other considerations that should guide interviewers’ assessments of political knowledge.”
The researchers then employed a measure of relative skin tone calculated by finding the difference between the interviewer and respondent skin tones. Whether the interviewer was Black or white — and regardless of the respondent’s performance — respondents with darker skin tones than the interviewer received lower scores.
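To make the relative skin tone measure and the "controlling for objective knowledge" step concrete, here is a minimal illustrative sketch. It is not the authors' code or the actual ANES variable coding; the variable names, toy values, and regression specification are assumptions used only to show how a difference score and a regression that adjusts for objective performance might be set up.

```python
# Hypothetical sketch -- not the study's actual data or model.
# Assumes one row per interview, with interviewer and respondent skin tones
# on a 10-point scale, the respondent's objective knowledge score, and the
# interviewer's subjective 1-5 knowledge rating.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "interviewer_tone":  [3, 4, 7, 2, 5, 8],   # hypothetical values
    "respondent_tone":   [6, 4, 3, 8, 5, 2],
    "objective_score":   [4, 5, 3, 4, 2, 5],   # number of correct answers
    "subjective_rating": [2, 4, 4, 1, 2, 5],   # interviewer's 1-5 rating
})

# Relative skin tone: positive when the respondent is darker than the interviewer.
df["relative_tone"] = df["respondent_tone"] - df["interviewer_tone"]

# Regress the subjective rating on relative skin tone while controlling for
# objective performance, mirroring the kind of adjustment described above.
model = smf.ols("subjective_rating ~ relative_tone + objective_score", data=df).fit()
print(model.summary())
```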
“The interviewers … are rating darker-skinned respondents as less knowledgeable, even controlling for objective knowledge,” said Thornton.
While previous studies have asserted that fear of being perceived negatively by white interviewers may depress Black respondents’ performance, these findings provide evidence of interviewer bias in subjective assessments of factual knowledge.
Given that interviewer assessments are often used to gauge political sophistication (also known as “political awareness” or “political expertise”), and given the interviewer bias in reported political knowledge documented here, such surveys may fail to accurately capture Black public opinion and political involvement. As a result, Black individuals are not accurately represented in the influential political research that relies on this information.
“It seems like so many things are driven by political sophistication, but if we’re not properly classifying how sophisticated individuals are, then our conclusions are going to be mistaken,” said Thornton.