
Identifying the Challenges of Social Science’s Newest Technology

November 26, 2019

Choice is overwhelming. This should be no surprise to anyone who has spent a good few hours in a department store looking for the right pair of jeans, or a good few minutes gazing at the vast array of deodorants, mouthwashes, and soaps in any drugstore. Some choices carry less weight, and therefore induce less psychological stress, than others. Some choices, however, matter a lot. What if you're a researcher surveying the landscape of technological tools available for data collection, analysis, or participant recruitment? What if you're choosing among hundreds of tools and programs? Thousands? Is that sense of peril setting in yet? Maybe read some Sartre to delight in the radical freedom of it all. Or lie down and refuse to make a choice. But that won't help you; it will only prolong the struggle. If you have a meaningful project, you will eventually have to make some choices about the tools you use.

Earlier this month, SAGE Publishing (the parent of Social Science Space) released a new white paper exploring the overwhelming technological landscape of research tools for social science researchers and the challenges that landscape creates for them. The white paper's authors are Daniela Duca, a product manager for SAGE, and Katie Metzler, SAGE's associate vice president for product innovation. Click here to read the report.

Underlying the report is an analysis of 418 tools identified as being used by social science researchers. A portion of these tools were then cataloged into three 'clusters': tools for surveying and sourcing participants (of which there were 50), tools for annotating, labeling, and coding text (48), and tools for social media research (104). The report moves from an examination of these tools to the technology ecosystem at large, then examines the challenges of such a crowded ecosystem and considers its future.

Among the important questions the report raises: what challenges arise from such a crowded technological landscape? These 418 tools (and surely more that have gone unnoticed) hold an overwhelming amount of possibility, and there's a lot at stake in choosing among them. How has this abundance of tools, and the never-ending creation of new ones, led to new challenges for researchers?

“The variety of tools, their diverse sources and developers, and their unclear status in the academic ecosystem,” the white paper reads, “means multiple challenges face the people building these tools, as well as the researchers trying to use them.” The report identifies five of the most immediate challenges and their impact on the ecosystem.

Poor Citation Practices

Tools are still relatively hard to search for in academic papers, even when they have unique names. Some tools have specific citation requirements, but not all, which makes it hard to trace the tools researchers have used. Sometimes researchers reference the software tools and packages they've used in the methodology section or in footnotes; sometimes tools and packages go unreferenced entirely.

With a sea of programs out there, this lack of citation standardization is a real problem. It has ramifications for the replicability and reproducibility of studies, for one: how do you reproduce a study when the tools behind it aren't properly cited?
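
Neither the report nor the wider ecosystem prescribes a single fix here, but one small step researchers can take is to record the exact tools and versions behind an analysis so they can be reported and cited precisely. Below is a minimal sketch in Python using only the standard library; the package list and output file name are illustrative assumptions, not anything taken from the report.

```python
# Record the name and installed version of each package used in a study,
# so the methodology section (or a supplementary file) can cite them precisely.
# The package list and output file name are illustrative, not from the report.
import json
import platform
from importlib import metadata

PACKAGES_USED = ["pandas", "numpy", "statsmodels"]  # hypothetical tool list

def snapshot_environment(packages):
    """Return a dict of package -> installed version, plus the Python version."""
    versions = {}
    for name in packages:
        try:
            versions[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            versions[name] = "not installed"
    return {"python": platform.python_version(), "packages": versions}

if __name__ == "__main__":
    with open("software_used.json", "w") as f:
        json.dump(snapshot_environment(PACKAGES_USED), f, indent=2)
```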

Navigation Troubles

With so many tools out there, some obscure and some developed by academics for specific projects and then abandoned, the technological ecosystem is hard to navigate. A number of directories seek to ease this challenge, such as the DiRT Directory, a humanities-focused resource that is still useful for the social sciences. SourceForge, which hosts open-source tools, and LabWorm, a community-curated list of tools, might also be of use.

Sustainability and Open Source

When overwhelmed by the range of tools available and unsure of their respective reputations, many researchers end up developing tools suited to their own specific purposes. The number of tools and packages continues to grow, but they come into being and fall out of fashion very quickly; the shelf life of a piece of software these days is short. The big problem, then, is maintaining tools and building communities around them while remaining financially sustainable. Tool creators want success on both fronts: open source in terms of accessibility, yet viable as a business. Some tools are working toward this balance (the report cites RapidMiner), and the ecosystem is rapidly changing, so this problem may diminish in the years ahead.

Lack of Peer-Review for Tools

Given such a vast ecosystem of tools, combined with the difficulties of navigation, exposure, and reputation, it seems as if there ought to be a peer-review system for tools.

The report suggests adopting a standard model by which a program can be tested or evaluated. Such a model would help researchers compare different tools and packages, save them time, and assure them of a tool's reliability and appropriateness for their projects.

Big Technology

Big tech companies control data: how it is released and who can use it. Any tool that relies on these companies has to keep up to date with their legal terms and application programming interfaces (APIs). This puts stress on tool creators and may prevent researchers from accessing data. Big tech companies are also increasingly taking control of newer tools appearing on the market, via acquisitions that can limit access to data and research.
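
To give a sense of why this churn is a maintenance burden, here is a minimal, hypothetical sketch of how a research tool might pin the platform API version it was built against and detect when that version is deprecated or retired. The endpoint, headers, and status codes are invented for illustration and do not describe any real platform's API.

```python
# Hypothetical example: a tool pinned to version "v2" of a platform API.
# The URL and response details are invented; real platforms differ and change.
import requests

API_VERSION = "v2"  # the version this tool was written and tested against
BASE_URL = f"https://api.example-platform.com/{API_VERSION}"

def fetch_public_posts(query):
    resp = requests.get(f"{BASE_URL}/posts/search", params={"q": query}, timeout=10)
    # Platforms often signal upcoming removals via headers or status codes;
    # a tool that ignores them silently breaks for every researcher using it.
    if resp.status_code == 410:
        raise RuntimeError(f"API version {API_VERSION} has been retired; the tool needs updating.")
    if "Deprecation" in resp.headers:
        print("Warning: this API version is deprecated:", resp.headers["Deprecation"])
    resp.raise_for_status()
    return resp.json()
```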

Big tech companies also attract high-caliber researchers to work for them, which is where research incentives may shift for the worse: towards faster experimentation, for example.

On the other hand, regulations around personal data are being strengthened as scrutiny of the social media ecosystem created by big tech companies increases.

Innovation

The white paper goes on to praise the success of surveying tools and to discuss exciting innovations in the sector, such as NEXT, a survey tool powered by an algorithm that adapts its questions as more people answer them.
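
As a rough illustration of what "adaptive" can mean here (and only an illustration; this sketch does not reproduce NEXT's actual algorithm), a survey engine might prioritize the questions it has learned the least about so far, for instance by picking the item whose running answer distribution is most uncertain:

```python
# Illustrative sketch of adaptive question selection: show each respondent the
# question whose answers so far are most spread out (highest entropy), on the
# idea that more responses there are most informative. Purely hypothetical;
# it does not describe the actual NEXT algorithm cited in the report.
import math
from collections import Counter

def entropy(answers):
    """Shannon entropy of the observed answer distribution."""
    if not answers:
        return float("inf")  # unasked questions get top priority
    counts = Counter(answers)
    total = len(answers)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def next_question(responses):
    """Pick the question id with the most uncertain answer distribution so far."""
    return max(responses, key=lambda qid: entropy(responses[qid]))

# Example: q2 has no answers yet, so it is asked next.
responses = {"q1": ["yes", "no", "yes"], "q2": [], "q3": ["no", "no", "no"]}
print(next_question(responses))  # -> "q2"
```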

Innovation is happening all around, though; it isn't exclusive to surveying. New, exciting tools are being created as social science researchers gain stronger backgrounds in computation. The report speculates about the next twenty years of progress in tools. In a worst-case scenario, tool development is driven solely by profit, shaped around the end goal of private research. In a best-case scenario, robust research is still happening at universities, and tools are developed with the research itself in mind, with accuracy, truth, and utility as the end goals. In either case, there will be further specialization in the development of new tools: as more researchers acquire computational skills, more tools will come of it.

In any case, it seems as if we are at a junction with social science research tools. The landscape could get even wilder, or standardization of practices around tools may tame the wilderness that is the social science technology ecosystem.

To read more about the topic, or to read more speculation about the next 20 years of tools (interfaces that might collect data on our emotions directly – wow!), click here.

Augustus Wachbrit (or, if you’re intimidated by his three-syllable name, Gus) is the Social Science Communications Intern at SAGE Publishing. He assists in the creation, curation, revision, and distribution of various forms of written content primarily for Social Science Space and Method Space. He is studying Philosophy and English at California Lutheran University, where he is a research fellow and department assistant. If you’re likely to find him anywhere, he’ll be studying from a textbook, writing (either academically or creatively), exercising, or defying all odds and doing all these things at once.

