
Parsing Fact and Perception in Ethnography

May 3, 2021
National Public Radio uses this four-quadrant card in training its journalists in fact-checking. Ethnographers should go ahead and use material that is “Controversial/Hard to Verify” so long as they clearly identify it. (Graphic: NPR)

The Annual Review of Sociology is a very big deal. Published since 1975, it was initiated by the American Sociological Association in collaboration with the nonprofit publisher Annual Reviews for the purpose of “synthesizing and integrating knowledge for the progress of science and the benefit of society.” With an impressive impact factor, the Annual Review of Sociology now covers “major theoretical and methodological developments as well as current research in the major subfields.” Because submissions are by invitation only, and then subject to peer review, placing a piece in the Annual Review is a major accomplishment for academic authors. It therefore came as a surprise to me – and really, a bit of an honor – to discover that an article in the forthcoming 2021 Annual Review of Sociology devotes almost an entire section to responding to my 2017 book, Interrogating Ethnography, in which I critiqued some common practices in ethnography and suggested how they could be made more rigorous.


The article is “Ethnography, Data Transparency, and the Information Age,” by Alexandra Murphy, Colin Jerolmack, and DeAnna Smith. Now available on the journal’s (paywalled) website, and kindly sent to me by one of the authors, the essay points to a “growing recognition that the advancement of social science hinges upon scholars being more transparent.” Ethnographers, they explain, are facing a “reckoning over the long-standing conventions they follow to gather, write about, and store their data.” The article is comprehensive and insightful, focusing on four important issues related to transparency: data collection, anonymization, verification, and data sharing. Congratulations to Professors Murphy (University of Michigan), Jerolmack (NYU), and Smith (University of Michigan) for a job well done.

The engagement with my work is mostly (although not entirely) in the section titled “Data Verification,” where the authors note my efforts to fact check more than 50 urban ethnographies. “Lubet,” they say, “accuses ethnographers of treating subjects’ retellings of events as fact.” Although I would have preferred a different verb – perhaps “Lubet reports” or “Lubet relates” – it is fair enough to say that my research revealed factual problems in about half of the ethnographies I investigated. To put it gently, I found that ethnographies often resort to hearsay, folklore, and rumor, without distinguishing between first-hand observations and second-hand accounts.

The authors present three sets of responses to my work – all of them oddly negative, given that they pretty much end up agreeing with me – beginning with a rather silly assertion by the UCLA sociologist Stefan Timmermans. According to Timmermans, the authors explain, the only way to show that an ethnographer got something wrong is “to put in the work to make repeated first-hand observations in the setting under question.” It should be obvious that such a standard could almost never be met – meaning that nothing in an ethnography could ever be disproven – even in the unlikely circumstance that a subsequent researcher had the time and inclination to make multiple trips to an ethnographer’s research site.

Many ethnographies, including the most controversial ones, make a point of masking their research locales, thus making it impossible for a fact-checker to visit the “setting under question.” The authors note in a different section that site-masking can prevent direct comparisons and, to say the least, “makes it difficult to conduct ethnographic, archival, virtual, and quantitative reanalysis.” Nonetheless, they appear to endorse Timmermans’ unrealistic claim, at least partially, by noting that “officials and their records may not comport with what happens on the ground.” While that is undoubtedly true, it does not justify the wholesale rejection of all public records – as has been urged by some ethnographers – along with the uncritical acceptance of research subjects’ accounts.

The second response is that ethnographers do in fact “verify their data more thoroughly than Lubet suggests.” Maybe so. The authors cite examples of ethnographers’ self-verification, including a few taken directly from my book, but that does not mean the practice is widespread, rigorous, or transparent, much less universal. Nor do the authors recognize the inconsistency of commending some ethnographers for cross-checking their work against public sources, after having discounted the value of such sources only one paragraph earlier. In any case, my larger critique has been that ethnographers have no robust tradition of fact-checking one another’s work.

This brings us to the third set of responses, which really gets to the heart of the matter. “Some ethnographers, however, reject the premise that fact checking – at least as commonly understood in journalism – is essential to the ethnographic enterprise.” The problem, according to the authors, is the risk of “privileging the ‘truth’ as constructed by records, experts, and other outsiders over our subjects’ perceptions and experiences.” The centrality of reporting “participants’ worldview,” they argue, provides “a convincing rebuttal to the Lubet-style fact-checking approach.”

As much as I appreciate the eponym, this appears to be a serious misunderstanding of the nature of evidence. There is no contradiction between locating “truth” – with or without scare quotes, and however determined – and valuing perceptions and experiences. Fact and perception are simply different categories, neither of which is necessarily more important than the other. The challenge for ethnographers – and the major shortcoming I identified – lies in making clear and careful distinctions between what they have actually seen and what they have only heard about.

Murphy, Jerolmack, and Smith attempt to take both sides. They disdain “the fetishization of journalistic fact-checking,” while still conceding “that there are many reasons why participants’ accounts should not be taken at face value.” Recognizing that “when it comes to verifying data, there is room for improvement,” the authors allow – tepidly, for a reckoning – that “some kind of verification of accounts seems warranted.”

An ethnographer embedded in the MAGA world would repeatedly hear that President Biden intends to restrict consumption of beef to one hamburger a month. The readiness in some quarters to accept such a claim could provide important insights in an ethnography, but only if it were clearly identified as a reflection of the subjects’ perceptions or worldview. A responsible social scientist would never simply report the impending burger limitation as fact, especially without tracking down the source of the story (which turns out to have been a rumor spread by Donald Trump, Jr. and others). That isn’t fetishization; it’s research.

Steven Lubet is Williams Memorial Professor at the Northwestern University Pritzker School of Law and author of Interrogating Ethnography: Why Evidence Matters, as well as other books, including The “Colored Hero” of Harper’s Ferry: John Anthony Copeland and the War Against Slavery (2015) and Lawyers’ Poker: 52 Lessons That Lawyers Can Learn from Card Players. He is the director of the Fred Bartlit Center for Trial Advocacy. He has been living with ME/CFS since 2006.
