The Transparency Gap: What’s Missing from Qualitative Research Reporting in Information Science?

Rebecca D. Frank & Adam Kriesberg

How do Information Science researchers describe their use of qualitative methods? What do they say about different steps in the research process, such as dates of data collection, the people involved, and whether they obtained ethics board approval for their work? What information gets left out? These questions lie at the heart of ongoing conversations around trust in research and the reusability of research data across the academic landscape. The field of Information Science encompasses many forms of scholarship that examine the world through the lens of information. As members of the Association for Information Science and Technology (ASIS&T) community, we have benefited greatly from the relationships and networks of scholars employing a variety of research methods to answer big questions.

—Academic researchers need readers to trust their work—

In our work, we both use primarily qualitative methods such as interviews, focus groups, and document analysis, which motivated our recent paper “Mapping the Landscape, Measuring the Gap: Qualitative Methods Reporting in Information Science Research”, presented at the 2025 ASIS&T Annual Meeting in Crystal City, VA. We analyzed recent ASIS&T papers in order to understand how qualitative research is reported in Information Science research, and how closely these observed practices align with author guidelines for publication venues across the discipline.

Expectations and standards for how researchers should describe their data collection and analysis methods vary by discipline, method, and data type. Research that employs qualitative data can be particularly difficult to standardize, given the broad scope and potential sensitivity of qualitative research data.

Our team examined long and short papers from the ASIS&T Annual Meeting across five years (2018-2022). After multiple rounds of screening and review, we identified 117 papers that featured exclusively qualitative research methods. For much of the qualitative research we examined, we were unable to determine either the age or the source of the data.

We found that 92 of those papers (78.6%) involved human subjects research, 81 of which used qualitative interviews, but only 26 of those 92 (28.3%) contained information about ethics or institutional review board approval for the research.

This means that qualitative research published at ASIS&T, a leading Information Science conference, largely fails to provide readers with the basic information needed to evaluate research quality, understand the context of data collection and analysis, or verify that ethical standards were upheld. Qualitative research papers in Information Science represent a broad range of research practices and data types, but the lack of consensus about what information should be reported about research methods presents a challenge, and we would argue a problem, for the field of Information Science. This lack of transparency makes it difficult to understand how research was conducted, verify ethical research practices, or evaluate data for reuse.

These findings are consistent with our examination of publishing requirements across a range of Information Science venues. We found that the instructions for authors preparing submissions to the ASIS&T Annual Meeting, as well as to JASIS&T, are vague and do not specify a minimum level of information about methods. Author guidelines at other prominent Information Science venues, including those published by Emerald Publishing Group, Sage Journals, and ScienceDirect, are similarly vague.

Taken together, these findings point to a problem for the field of Information Science. We think that the field is ready for a conversation about how we describe our use of qualitative methods and what information we should expect to see in reports of qualitative research. 

Although this was a pilot study that only analyzed recent publications from the ASIS&T Annual Meeting, it revealed that this research community does not yet have well-established norms around reporting for qualitative methods. Some of the elements we did (or didn’t) observe in these papers may be attributed to structural features of these types of publications such as page limits, while others suggest that qualitative Information Science researchers have yet to settle on a minimum set of information about methods that should accompany published studies in the field. 

Academic researchers need readers to trust their work. Clear reporting standards show a field’s maturity and reveal the often-hidden labor behind research findings. This matters because policymakers, practitioners, and the public rely on this research to make decisions—from designing library services to developing information systems that affect millions of users.

We argue that more stringent publishing guidelines are needed for qualitative research. Specifically, we call for all publishing venues managed by ASIS&T to require, at a minimum, the following information: when the data were collected, who collected them, whether ethics approval was obtained, and whether and where the data can be found. And we believe that this information should be provided to peer reviewers who are evaluating manuscripts for publication. In the paper, we further unpack grammatical choices such as the use of passive voice, which can obscure who conducted which research tasks and how those choices may have shaped the construction of subsequent findings.

We are sensitive to concerns that implementing key expectations and requirements for reporting would burden researchers or privilege particular types of qualitative methods, but we respectfully disagree. Improved transparency will allow us to better understand the transformative and innovative research published in leading Information Science venues such as the ASIS&T Annual Meeting and JASIS&T, and will open up avenues for scholars who study data practices and methods. Being clear about both the what and the who of research using qualitative data also allows us to credit researchers for their work, including the often invisible work of collecting, preparing, and analyzing complex qualitative data.

Cite this article in APA as:  Frank, R. D., & Kriesberg, A. (2026, February 4). The transparency gap: What’s missing from qualitative research reporting in information science? Information Matters. https://informationmatters.org/2026/01/the-transparency-gap-whats-missing-from-qualitative-research-reporting-in-information-science/

Authors

  • Rebecca D. Frank

    Rebecca Frank is an Assistant Professor at the University of Michigan School of Information and a Faculty Affiliate at the Inter-university Consortium for Political and Social Research (ICPSR). She is also affiliated with the Einstein Center Digital Future in Berlin, Germany. Her research examines the social construction of risk in trustworthy digital repository audit and certification. She also conducts research in the areas of open data, digital preservation, digital curation, and data reuse, focusing on social and ethical barriers that limit or prevent the preservation, sharing, and reuse of digital information. Her work has been supported by the Institute for Museum and Library Services (IMLS), the Deutsche Stiftung Friedensforschung (German Foundation for Peace Research), the Einstein Centre Digital Future, the InfraLab Berlin, the National Science Foundation (United States), and the Australian Academy of Science.

  • Adam Kriesberg