
The Embodied Cognition and Experience Measurement Commons: Transforming Human Information Behavior Research


Tingting Jiang, Zhumo Sun

The increasing maturity of various artificial intelligence (AI) systems, from personalized recommenders and conversational agents to virtual assistants and even social robots with a physical-world presence, has fundamentally reshaped how humans acquire, interpret, and use information, significantly impacting their mental processes and decision-making. While self-report and observational methods have been widely adopted for investigating human-computer interaction, they prove insufficient for capturing the dynamic and multifaceted nature of human-AI interaction (Wekenborg et al., 2025). The complexity of this interaction, combined with the need to cope with the vast, heterogeneous data streams generated in AI-mediated information environments, calls for next-generation research frameworks that can engender fresh insights into human information behavior.

Wuhan University has led the way in designing and developing a transformative research facility, the Embodied Cognition and Experience Measurement Commons (ECEMC). It integrates state-of-the-art digital technologies to redefine experimental paradigms of information behavior research. The physical space of the ECEMC comprises three functional areas, each powered by distinct technological frameworks.

The Immersive Interaction Area is equipped with popular immersive media, including interactive displays, digital projections, and extended reality (VR/AR/MR), to create multi-sensory experiences by blending digital content with the physical environment. The setup embodies the principles of embodied cognition, a hypothesis positing that cognition emerges from the body’s interaction with the environment (Wilson, 2002). By leveraging full-body engagement and real-time adaptive feedback, this area aims to bridge the gap between human intuition and machine intelligence, fostering human-AI interaction that is more natural than interaction mediated by traditional screen-based media, which is often constrained by symbolic abstraction.

The Multimodal Data Collection Area features a cutting-edge setup that enables real-time, high-precision measurement of human behavior and mental processes during immersive interactions. It utilizes a combination of motion capture and face capture systems to track users’ body movements and facial expressions. Speech recognition technology is employed to transcribe verbal communication, while eye-tracking technology captures visual attention. To understand the underlying physiological mechanisms, this area is also equipped with EEG (electroencephalography) and fNIRS (functional near-infrared spectroscopy) to monitor brain activity, ECG (electrocardiography) to track heart activity, EMG (electromyography) to assess muscle engagement, and EDA (electrodermal activity) to detect changes in skin conductance. Such a multimodal approach ensures accurate measurement of the holistic user experience.
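
Although the ECEMC's actual acquisition software is not described here, the sketch below illustrates one common way such heterogeneous streams can be brought onto a shared timeline: aligning slower channels to the fastest one by nearest timestamp. The channel names, sampling rates, and matching tolerance are illustrative assumptions only.

```python
"""Minimal sketch: aligning multimodal recordings on a shared timeline.

Channel names, sampling rates, and the matching tolerance are illustrative
assumptions, not the ECEMC's actual acquisition pipeline.
"""
import numpy as np
import pandas as pd


def simulate_stream(name: str, rate_hz: int, duration_s: float) -> pd.DataFrame:
    """Generate a dummy signal sampled at rate_hz for duration_s seconds."""
    t = np.arange(0, duration_s, 1.0 / rate_hz)
    return pd.DataFrame({"t": t, name: np.random.randn(len(t))})


# Hypothetical streams with very different sampling rates.
eda = simulate_stream("eda_microsiemens", rate_hz=4, duration_s=10)    # skin conductance
gaze = simulate_stream("gaze_x_norm", rate_hz=120, duration_s=10)      # eye tracking
eeg = simulate_stream("eeg_alpha_power", rate_hz=256, duration_s=10)   # one EEG feature

# Align the slower streams onto the densest (EEG) timeline by nearest timestamp,
# discarding matches farther than 0.5 s apart.
aligned = eeg.sort_values("t")
for stream in (gaze, eda):
    aligned = pd.merge_asof(aligned, stream.sort_values("t"), on="t",
                            direction="nearest", tolerance=0.5)

print(aligned.head())
```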

The Collaborative Data Analysis Area is dedicated to the efficient processing, storage management, analysis, and visualization of the data derived from user experience measurement. It provides essential computational, storage, and network resources, featuring a medium-sized server cluster that supports the handling of terabyte-scale user-centered data. Hadoop and Spark form the backbone of data processing capabilities, while MySQL is employed for data storage and management and Tableau for data visualization. The collaborative efforts of multidisciplinary research teams are also facilitated in this area by regular meetings, workshops, and brainstorming sessions where ideas are exchanged and innovative solutions are developed.
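
As a rough illustration of the kind of batch workflow this stack supports, a minimal PySpark sketch follows; the file path, column names, table name, and JDBC connection details are placeholders, not the commons' actual configuration (writing to MySQL via JDBC also requires the MySQL connector on the cluster's classpath).

```python
"""Minimal PySpark sketch of a batch analysis job on the cluster.

Paths, column names, and connection details are placeholders.
"""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ecemc-session-summary").getOrCreate()

# Load raw interaction logs from distributed storage (hypothetical path).
logs = spark.read.parquet("hdfs:///ecemc/raw/session_logs")

# Aggregate per-participant summaries, e.g., mean fixation duration per task.
summary = (
    logs.groupBy("participant_id", "task_id")
        .agg(F.avg("fixation_duration_ms").alias("mean_fixation_ms"),
             F.count("*").alias("n_events"))
)

# Persist the summary to MySQL so it can be visualized in Tableau.
(summary.write.format("jdbc")
    .option("url", "jdbc:mysql://db-host:3306/ecemc")  # placeholder host/schema
    .option("dbtable", "session_summary")
    .option("user", "analyst")
    .option("password", "********")
    .mode("overwrite")
    .save())
```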

The ECEMC is affiliated with the Intelligent Computing Laboratory for Cultural Heritage at Wuhan University. Constructed at a cost of 5 million RMB over a two-year period, the 300-square-meter space is designed to accommodate up to three research teams simultaneously. The ECEMC was initially established in 2023, and its data analysis infrastructure has already powered two domain-specific studies involving large-scale datasets, one developing a job advertisement gender lexicon (Jiang et al., 2023) and another mining cultural tourism insights from online reviews (Jiang et al., 2025). Its physical layout, composed of the three areas described above, was finalized and became fully operational in March 2025. Currently, two active research projects are utilizing its specialized resources to investigate various challenges in human-AI interaction.

The first project focuses on the development of a VR gamified learning system centered around the cultural heritage of Dunhuang. By transporting users into a virtual world of Dunhuang’s rich history and culture through VR, the project aims to enhance learning outcomes and foster a deeper appreciation for cultural heritage. Early experimental results suggest that the immersive environment enhances users’ learning motivation and knowledge retention through interactive exploration of historical sites and artworks, cultivating an emotional connection that neither the classroom nor the textbook can provide.

The second project pioneers the development of an AI-powered virtual assistant system for rehabilitation assisted by lower-limb exoskeleton robots. This project highlights the use of fNIRS and EMG to assess participants’ cognitive load and physical effort, respectively, during rehabilitation exercises. By synthesizing the multimodal physiological data in real time, the virtual assistant system can generate adaptive, personalized feedback and guidance, from gait adjustment cues to motivational prompts, to improve both motor recovery and psychological resilience. The multidisciplinary research team is exploring the potential to transform rehabilitation into a more intuitive journey.
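
The project's actual control algorithm is not described here; the sketch below shows, in deliberately simplified rule-based form, how normalized indices derived from fNIRS and EMG might be mapped to adaptive prompts. The thresholds, feature names, and messages are illustrative assumptions.

```python
"""Simplified sketch: rule-based feedback from two physiological indices.

Thresholds, feature names, and messages are illustrative assumptions,
not the project's actual control logic.
"""
from dataclasses import dataclass


@dataclass
class PhysioSample:
    cognitive_load: float  # e.g., a normalized fNIRS oxygenation index in [0, 1]
    muscle_effort: float   # e.g., a normalized EMG envelope in [0, 1]


def choose_feedback(sample: PhysioSample) -> str:
    """Map the current physiological state to a feedback message."""
    if sample.cognitive_load > 0.8:
        return "Slow down and focus on one step at a time."         # reduce demand
    if sample.muscle_effort > 0.9:
        return "Ease off slightly and let the exoskeleton assist."  # prevent overexertion
    if sample.muscle_effort < 0.3:
        return "Try to push a little harder on the next stride."    # motivational prompt
    return "Good pace, keep your current gait rhythm."              # reinforce


if __name__ == "__main__":
    for s in [PhysioSample(0.9, 0.5), PhysioSample(0.4, 0.2), PhysioSample(0.5, 0.95)]:
        print(choose_feedback(s))
```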

The ECEMC establishes and implements a data-driven methodology encompassing the entire lifecycle of data generation, collection, and analysis for studying human-AI interaction, distinguishing it from conventional digital culture or social computing labs that rely mainly on literature, Web traces, and other secondary data. By combining empirical rigor with practical innovation, the ECEMC seeks to inform the design of human-centered AI systems across various domains, promoting system transparency, user agency, and ethical alignment.

References

Jiang, T., Li, Y., Fu, S., & Chen, Y. (2023). Creating a Chinese gender lexicon for detecting gendered wording in job advertisements. Information Processing & Management, 60(5), 103424.

Jiang, T., Xu, Y., Li, Y., & Xia, Y. (2025). Integration of public libraries and cultural tourism in China: An analysis of library attractiveness components based on tourist review mining. Information Processing & Management, 62(2), 104000.

Wekenborg, M. K., Gilbert, S., & Kather, J. N. (2025). Examining human-AI interaction in real-world healthcare beyond the laboratory. npj Digital Medicine, 8(1), 169.

Wilson, M. (2002). Six views of embodied cognition. Psychonomic Bulletin & Review, 9, 625-636.

Cite this article in APA as: Jiang, T., & Sun, Z. (2025, May 8). The embodied cognition and experience measurement commons: Transforming human information behavior research. Information Matters. https://informationmatters.org/2025/05/the-embodied-cognition-and-experience-measurement-commons-transforming-human-information-behavior-research/

Author

  • Tingting Jiang

    Tingting Jiang is a professor in the School of Information Management at Wuhan University. She obtained her Ph.D. from the University of Pittsburgh. Her research focuses on information behavior and user psychology, human-AI interaction, multisensory experience, and human-centered data.
