Social Science Ahead of the (Shallow) Curve on Altmetrics Acceptance

May 3, 2019

A new survey of university faculty finds that the idea of altmetrics, using something other than journal citations as the measure of scholarly impact, has made less headway among faculty than might be expected given the hoopla surrounding them. These new measures are most familiar in the social science community (barely) and least familiar in the arts and humanities (dramatically so).

The survey, titled “How Faculty Demonstrate Impact,” was presented at the biennial meeting of the Association of College & Research Libraries. Given that audience, the paper begins by noting that while bibliometrics “has been an evolving part of the academic landscape for decades,” other sorts of scholarly metrics are a more recent phenomenon. And because the library and information science, or LIS, community has long labored with these various measuring tools, there was an underlying assumption that the rest of academe was equally conversant with them.

That was not the case.

“Our study also found that faculty are not nearly as familiar with altmetrics as those in the LIS field might have assumed, given the attention in LIS literature and many libraries’ outreach efforts to faculty,” wrote the authors, led by Caitlin Bakker, a biomedical research services librarian at the University of Minnesota Health Sciences Libraries. “Many more faculty reported being ‘not at all’ or ‘marginally’ familiar with altmetrics than reported being ‘familiar’ or ‘extremely familiar.’

“This finding serves as an important reminder that while altmetrics may no longer be a new concept in our field, it remains an unfamiliar concept to most faculty.”

One hint that the unfamiliarity might have been expected, the authors said, is that “very little” has been written about faculty uptake of altmetrics even as the literature on impact metrics themselves has grown. Their study, they wrote, is the first multi-campus attempt to address how faculty members interact with these measures.

The survey behind the study was open to faculty at four American universities: the University of Minnesota, the Ohio State University, Valparaiso University and the University of Vermont. The authors categorized the 1,202 responses across four broad disciplinary areas: health sciences (n=444), sciences (n=343), social sciences (n=256), and arts and humanities (n=158). They then examined trends in opinion through that four-part disciplinary prism.

What they found in how faculty perceived and trusted traditional impact metrics and altmetrics was that the social, physical and health sciences tended to answer in lockstep, while the humanities were consistently less aware and less trusting of metrics of any stripe. And in general, although not by huge margins, social scientists were more familiar with altmetrics, with “23.4% of respondents being either familiar or extremely familiar with altmetrics, when compared with Sciences (16.5%) and Health Sciences (16.2%).”

This humanities-versus-everyone-else result was echoed in a pair of questions about whether departments encourage or require the use of impact metrics at all. While most departments encourage including some sort of impact measurement in their promotion and tenure process, only about two-fifths require it. The humanities are the exception: only a quarter of respondents there said their department encouraged the use of metrics, and an “overwhelming” number reported no requirement to use them at all.

Lastly, and perhaps not surprisingly, the more established an academic was, the less value they placed on metrics.

Assistant Professors placed the greatest amount of importance on impact metrics, while Full Professors placed the least amount of importance. Certainly, Assistant Professors preparing for the tenure process are seeking out ways to demonstrate the impact of their work and are often under the impression they should include statistical measures of impact as part of their tenure materials. More well-established Full Professors may not feel as much pressure to engage with statistical representations of their work’s impact given that they are no longer assessed for promotion.

The authors also charitably suggested that since full professors have already used metrics to assess others climbing the ladder, they “understand the limitations of statistical impact measures” in their broader assessment of metrics.

Alongside Bakker, the author team for “How Faculty Demonstrate Impact” included Jonathan Bull, scholarly communications librarian, Valparaiso University; Nancy Courtney, research impact librarian, Ohio State University Libraries; Dan DeSanto, instruction librarian, University of Vermont; Allison Langham-Putrow, scholarly communications librarian, University of Minnesota–Twin Cities; Jenny McBurney, research services coordinator and social sciences librarian, University of Minnesota–Twin Cities; and Aaron Nichols, access/media services librarian, University of Vermont.

For a more detailed breakdown of the results, see the full report.


Social Science Space editor Michael Todd is a long-time newspaper editor and reporter whose beats included the U.S. military, primary and secondary education, government, and business. He entered the magazine world in 2006 as the managing editor of Hispanic Business. He then joined the Miller-McCune Center for Research, Media and Public Policy and its magazine Miller-McCune (renamed Pacific Standard in 2012), where he served as web editor and later as senior staff writer covering the environmental and social sciences. During his time with the Miller-McCune Center, he regularly participated in media training courses for scientists in collaboration with the Communication Partnership for Science and the Sea (COMPASS), Stanford’s Aldo Leopold Leadership Institute, and individual research institutions.
