
Aussie Academics Keep Publishing, and May Start Perishing

April 21, 2014

Fog at the University of Sydney: how hard is it to see your impact? (Photo by Jason Tong)

Perhaps we should excuse comments made during the 2013 federal election about “wasteful” and “increasingly ridiculous research” undertaken in Australia.

The real shock was not that shadow ministers could make such baseless statements but that the comments could go largely unchallenged by the public. Although vilifying academics seemed an act of political posturing, the now-Coalition government was merely tapping into an existing view of the university sector within Australian society.

This article by Jason Ensor originally appeared at The Conversation, a Social Science Space partner site, under the title “University metrics keep academics in their ivory towers.”

This view is that academia is out of touch with the taxpayers it services.

Which it is. Not because scholarship can sometimes appear esoteric or arcane, but because scholars often fail to make the effort to argue otherwise to those who foot the bill. We fail to regard the public as a research partner worthy of our attention and our respect.

In defense of academics, the barriers that prevent scholars from adding value where it really matters are not of our making … not quite.

Australia has national research priorities and associated goals set by the government. These range from investigating social well-being to improving cyber security, from lifting manufacturing productivity to understanding cultural and economic change in our region. But how this research is disseminated and “counted” is at odds with public expectations of access and engagement.

Talking to ourselves

The university sector is about to enter the 2015 round of the Excellence in Research Australia (ERA) initiative. Under ERA, the dissemination of research via monographs, book chapters, peer-reviewed journal articles and peer-reviewed conference papers is tightly linked to a points system in which these traditional modes of publication “count.” These publications, called “outputs,” are weighted so that monographs are worth five points each and everything else one point.

These are the metrics by which the bulk of goods and services produced by the academy are weighed. Academic careers grow or wither under this points system. As a result, the printed book and peer-reviewed journal article remain the exemplars of published research and the currency of scholarly accreditation and promotion. This system is bluntly characterized as the “publish or perish” dimension of academic workloads.

This fixation on forms of publication that pre-date the internet helps maintain the widespread perception of scholarship as dry and aloof. The target audience of these outputs is usually other scholars (who increasingly have little or no time to absorb colleagues’ work), not the public.

Awareness is growing of the need to move into new modes of engagement that are more available to modern society. But how do academics engage with new and emergent forms of interaction when the goalposts set by evaluation systems like the ERA value monographs and journal articles most highly?

Granted, the “publish or perish” imperative is giving way to “be visible or vanish.” But even this is perhaps too self-interested, even mildly narcissistic, in suggesting discoverability as the new fashion that will restore academia. To be sure, it has led to the consideration of alternative ways to evaluate scholarship in the public domain. These range from counting actual downloads of a journal article (if it is open access) to “hits” or “page views” of an online exhibition.

Yet in some ways this only modifies the units for measuring impact without really questioning the underlying premise of impact. Be it citations or eyeballs, these are more suited to grant applications and keeping your job than truly opening up a dialogue with the public.

Under this model, traditional forms of publication “count.” Tweeting, blogging, teaching, media appearances, public lectures, community forums and the convening of other non-print-based outcomes – all of which require a lot of commitment to develop, curate and present – rarely do, or require supporting evidence before they count towards a research component.

As with live performances, exhibitions and reports to government bodies, most things digital are also considered “non-traditional research outputs.” These require extensive explanation to justify why they should be “counted.”

Catching up with the community

In an age where searching for tutorials on YouTube and information on Wikipedia is second nature to young inquiring minds, casting digital outputs as “non-traditional” is out of sync with society.

The division of research into traditional and non-traditional is also at odds with community engagement. While most education institutions see their role as servicing and advancing Australian society, the forms of evaluation used to rank Australian university subjects can work against fulfilling this goal.

New forms of recognized outputs and outcomes are required to change the relationship and to renew scholarship as an important part of public discussion. Moreover, scholars need to be able to present research in ways that are meaningful to society and that, at the same time, also count for their institutions. Engaging public audiences and engaging academic audiences need to be core competencies on an equal footing.

To date, these goals have been mutually exclusive. By not using and valuing the forms of communication and knowledge-sharing that Australians engage in every day, the research sector has actively contributed to the growing sense of irrelevance that stalks academia.

As long as our goalposts value talking to each other over and above talking with the community, we remain the primary agents of our own marginalization.


Jason Ensor is the research and technical development manager for digital humanities at the School of Humanities and Communication Arts at the University of Western Sydney. He studied at Murdoch University in Perth and has held positions at The University of Queensland, Curtin University of Technology, The Australian National University, Murdoch University and The University of Western Australia. Most recently, he was a data analyst in research and development at Murdoch and technical officer for the Australian Centre for Indigenous History at The Australian National University (Canberra). In Perth he administered the whole of Murdoch University’s ERA (Excellence in Research Australia) data submission. In Canberra Ensor led the design and development of one of the largest digital history and knowledge management projects in the field of Australian indigenous history.
