
Are We in the Dark Forest? Fairness of AI Use in Learning

Bo Hyun Hong

“My teacher told me to redo the assignment myself, yet gave full credit and praise to the classmate who used ChatGPT! It’s so unfair.”

That’s what my nephew told me a few weeks ago. He was irritated by the teacher’s decision and complained about having to redo his work. I comforted him and ended the call, but the conversation stayed with me. It felt like an early warning sign, a growing gap between students who use AI for learning tasks (like summarizing or brainstorming) and those who choose not to.

Dark Forest Theory

Chinese science fiction writer Cixin Liu introduces the “dark forest” idea in his Three-Body trilogy, which many people know from Netflix’s 3 Body Problem.

The dark forest theory comes from two ideas:

  1. Chain of suspicion: Civilizations across the universe cannot trust one another because they cannot communicate clearly.
  2. Technology explosion: Any civilization could suddenly make a huge technological leap at any time.

Because of these two conditions, no civilization can ever be sure of another’s intentions. So if you come across a new civilization, the safest move might be to strike first, before it destroys you.

“The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds other life—another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod—there’s only one thing he can do: open fire and eliminate them. In this forest, hell is other people. An eternal threat that any life that exposes its own existence will be swiftly wiped out. This is the picture of cosmic civilization. It’s the explanation for the Fermi Paradox.” (Cixin Liu, The Dark Forest)

AI Use and the Dark Forest

Now, let’s return to my nephew’s story. If we think of each civilization in the universe as an individual student, and AI tools as the “technology explosion,” the dark forest theory maps onto the classroom. My nephew realized that work created only by himself was less competitive than work produced through human-AI collaboration. So it’s understandable that he might feel pressure to start using AI tools too, just to keep up.

This isn’t just a personal story. In South Korea, for example, several universities recently discovered widespread cheating with ChatGPT during online midterm exams. The students clearly violated academic policies, but the episode also raises a bigger question: should we simply blame students who turn to AI for their schoolwork? From the perspective of the dark forest, students might defend themselves like this: “To survive in a society with limited opportunities, and because I suspect others are already using AI, I feel I have no choice but to enter the dark forest too, before I am left behind.”

Missing Students (Learners) in the Discussion

As an adjunct instructor, I have talked with several students who choose not to use generative AI for their coursework. Their reasons vary: university policies, concerns about the “black box,” or uncertainty about accuracy. But one common theme keeps appearing: they want to protect their own thinking and voice.

Ironically, this also shows that students already recognize AI’s potential to support their learning; they will choose to use it when they feel they need to. So AI literacy efforts and institutional policies should not revolve only around punishment or restriction. Instead, they should focus on helping students understand what AI can support, when it shouldn’t be used, and how to use it responsibly. To build those policies, colleges need to consider everyone involved: students, instructors, and institutions. Students must not be left out of the conversation, because they are the most affected stakeholders.

Generative AI isn’t going away, and students will keep navigating this new “forest,” whether everyone likes it or not. I hope librarians, educators, and LIS researchers will work on this together. I really do not want education to retreat to paper-only examinations just because we are afraid of AI. Instead of pretending the forest doesn’t exist, we can light it together: through AI literacy, thoughtful policies, and open conversations rather than secret competitions.

Cite this article in APA as: Hong, B. H. (2025, December 18). Are we in the dark forest? Fairness of AI use in learning. Information Matters. https://informationmatters.org/2025/12/are-we-in-the-dark-forest-fairness-of-ai-use-in-learning/

Author

  • Bo Hyun Hong is a PhD Student in Information Studies at the University of Wisconsin-Milwaukee. Her research examines human-computer interaction, interactive system design, and AI-mediated information practices, with a particular focus on how task differences shape students' learning behaviors. Hong also works as an adjunct instructor, teaching undergraduate and graduate courses, emphasizing inclusive, inquiry-driven learning.
