Tech Companies are Secretive; Researchers Need to Get Investigative
Daniel Carter, Texas State University, San Marcos, TX
Amelia Acker, University of Texas at Austin, Austin, TX
Dan Sholler, University of California, Santa Barbara, Santa Barbara, CA
This is a translation of our JASIST article: https://asistdl.onlinelibrary.wiley.com/doi/abs/10.1002/asi.24446
In October, Frances Haugen testified before Congress about how Facebook products have harmed democracy, public safety and children. Haugen, a former product manager with Facebook’s civic integrity team, disclosed thousands of internal documents to The Wall Street Journal and the Securities and Exchange Commission, revealing that the company was aware of the harmful impact of its products on users.
In her testimony, Haugen pointed to the ways Facebook obfuscated that harmful impact: “During my time at Facebook, I came to realize a devastating truth: Almost no one outside of Facebook knows what happens inside Facebook.”
A clear takeaway from the documents leaked by Haugen is that, while Facebook employs researchers who study how its products impact users, those researchers' recommendations are often pushed aside in favor of efforts to increase engagement and profits. Further, Facebook researchers' internal findings and recommendations are hidden from the public and from independent, outside researchers.
The difficulty of knowing what happens inside Facebook is especially troubling for academic researchers in fields such as information science who work to understand how technology impacts society and, in many cases, to provide guidance to organizations and governments about how technologies can be designed to reduce harm and promote well-being. Researchers who take a critical approach to these topics ask questions that are especially pressing in the wake of Haugen’s reports on Facebook’s operations: How do statements made by technology companies reflect or fail to reflect their actual workings? How do technologies work to subvert democratic participation?
It’s currently difficult or impossible for researchers to answer these questions partly due to the secrecy exhibited by many large technology companies like Facebook and Google. As Shoshana Zuboff points out in The Age of Surveillance Capitalism, researchers face distinct challenges when attempting to understand the operations of such companies. While in the past researchers were often welcomed into the halls of companies such as GM, she notes that companies such as Google prevent this access and instead choose to release carefully crafted statements and books as a tactic to control their image. Additionally, technology companies often insist on releasing information to journalists “on background,” meaning that the presentation of information is highly negotiated, creating what veteran tech journalist Brian Merchant calls a “toxic arrangement” that “shields tech companies from accountability.”
But it’s also hard to answer these questions because academic researchers have historically worked hard to ensure that everyone involved in a research project is able to control how information is released. Research that involves techniques such as interviewing people or observing their work, for example, conventionally includes explicit statements designed to make sure that participants know that they don’t have to divulge any information they don’t want to, that responses will not be shared with their employer, and that they can stop participating at any time.
The privacy and confidentiality measures that academic researchers take to protect participants are crucial, especially when dealing with vulnerable groups such as prisoners or children. But the contexts in which these measures were developed are markedly different from what is discussed here. Large technology companies are not vulnerable. Instead, when it comes to providing information that would make it possible to evaluate their operations, they have repeatedly exerted their power to control access to data and to shape narratives in ways that align with their public relations needs by, for example, revoking data access from journalists and academic researchers studying ad targeting on Facebook.
Academic researchers who want to study what goes on inside Facebook and other large technology companies have few options. They can study technologies’ effects and discover, perhaps, that a product causes specific kinds of harm, but they’ll likely be unable to understand the organizational decisions behind those harms. They’ll be unable to access documents that, like those released by Haugen, reveal the processes through which harms are recognized and then allowed to continue. Observing these processes is crucial, however, because political violence and the erosion of democratic participation are not only effects of these companies’ technologies; they also have to be understood as consequences of specific organizational cultures.
One possible route for researchers who want to observe what goes on inside large technology companies is to acknowledge the limitations of allowing research participants to control the information released about themselves. Researchers who use audit studies, which often violate platforms’ terms of service or the ethical conventions of academic research, are already exploring the possibility of working around companies’ obstructions by scraping website data and creating fake accounts in order to look for bias in algorithms. Recent legal proceedings suggest that these tactics are indeed legal, or at least legally defensible.
Beyond audit studies, another way for researchers to improve their access to information purposefully obscured by technology companies is to explore the kinds of investigative techniques used by journalists. Investigations by The Markup, for example, have not only shown the bias in algorithms but also obtained documents that reveal how companies have tried to shield themselves from associated legal consequences.
Contacting former employees who might be willing to give up some dirt or looking for whistleblowers with documents to leak are not normal activities for academic researchers. Researchers aren’t taught these skills, and in some ways they go against the assumptions of institutional review boards that evaluate research projects for ethical clearance. But they might be crucial tools for understanding the cultures and organizational practices surrounding the products that are currently shifting society in impactful ways—and if information-science researchers want to maintain a critical perspective on technology development and use, they might have to re-evaluate their conventions and get more investigative.
Cite this article in APA as: Carter, D., Acker, A., & Sholler, D. (2021, October 28). Tech companies are secretive; Researchers need to get investigative. Information Matters. Vol. 1, Issue 10. https://r7q.22f.myftpupload.com/2021/10/tech-companies-are-secretive-researchers-need-to-get-investigative/