Paper deep dive
A Platform-Agnostic Multimodal Digital Human Modelling Framework: Neurophysiological Sensing in Game-Based Interaction
Daniel J. Buxton, Mufti Mahmud, Jordan J. Bird, Thomas Hughes-Roberts, David J. Brown
Abstract
Digital Human Modelling (DHM) is increasingly shaped by advances in AI, wearable biosensing, and interactive digital environments, particularly in research addressing accessibility and inclusion. However, many AI-enabled DHM approaches remain tightly coupled to specific platforms, tasks, or interpretative pipelines, limiting reproducibility, scalability, and ethical reuse. This paper presents a platform-agnostic DHM framework designed to support AI-ready multimodal interaction research by explicitly separating sensing, interaction modelling, and inference readiness. The framework integrates the OpenBCI Galea headset as a unified multimodal sensing layer, providing concurrent EEG, EMG, EOG, PPG, and inertial data streams, alongside a reproducible, game-based interaction environment implemented using SuperTux. Rather than embedding AI models or behavioural inference, physiological signals are represented as structured, temporally aligned observables, enabling downstream AI methods to be applied under appropriate ethical approval. Interaction is modelled using computational task primitives and timestamped event markers, supporting consistent alignment across heterogeneous sensors and platforms. Technical verification via author self-instrumentation confirms data integrity, stream continuity, and synchronisation; no human-subjects evaluation or AI inference is reported. Scalability considerations are discussed with respect to data throughput, latency, and extension to additional sensors or interaction modalities. Illustrative use cases demonstrate how the framework can support AI-enabled DHM and HCI studies, including accessibility-oriented interaction design and adaptive systems research, without requiring architectural modifications. The proposed framework provides an emerging-technology-focused infrastructure for future ethics-approved, inclusive DHM research.
Tags
Links
- Source: https://arxiv.org/abs/2603.10680v1
- Canonical: https://arxiv.org/abs/2603.10680v1
PDF not stored locally. Use the link above to view on the source site.
Full Text
A Platform-Agnostic Multimodal Digital Human Modelling Framework: Neurophysiological Sensing in Game-Based Interaction

Daniel J. Buxton 1[0000-0002-8729-3736], Mufti Mahmud 1,2[0000-0002-2037-8348], Jordan J. Bird 1[0000-0002-9858-1231], Thomas Hughes-Roberts 1[0000-0002-3204-8610], and David J. Brown 1[0000-0002-1677-7485]

1 Nottingham Trent University, Nottingham, NG11 8NS, United Kingdom
dan.buxton, jordan.bird, thomas.hughes-roberts, david.brown@ntu.ac.uk
2 King Fahd University of Petroleum and Minerals, Dhahran 31261, Kingdom of Saudi Arabia
mufti.mahmud@kfupm.edu.sa

Abstract. Digital Human Modelling (DHM) is increasingly shaped by advances in artificial intelligence (AI), wearable biosensing, and interactive digital environments, particularly in research addressing accessibility and inclusion. However, many AI-enabled DHM approaches remain tightly coupled to specific platforms, tasks, or interpretative pipelines, limiting reproducibility, scalability, and ethical reuse. This paper presents a platform-agnostic DHM framework designed to support AI-ready multimodal interaction research by explicitly separating sensing, interaction modelling, and inference readiness. The framework integrates the OpenBCI Galea headset as a unified multimodal sensing layer, providing concurrent Electroencephalogram (EEG), Electromyogram (EMG), Electro-oculogram (EOG), Photoplethysmogram (PPG), and inertial data streams, alongside a reproducible, game-based interaction environment implemented using SuperTux. Rather than embedding AI models or behavioural inference, physiological signals are represented as structured, temporally aligned observables, enabling downstream AI methods to be applied under appropriate ethical approval. Interaction is modelled using computational task primitives and timestamped event markers, supporting consistent alignment across heterogeneous sensors and platforms.
Technical verification via author self-instrumentation confirms data integrity, stream continuity, and synchronisation; no human-subjects evaluation or AI inference is reported. Scalability considerations are discussed with respect to data throughput, latency, and extension to additional sensors or interaction modalities. Illustrative use cases demonstrate how the framework can support AI-enabled DHM and HCI studies, including accessibility-oriented interaction design and adaptive systems research, without requiring architectural modifications. The proposed framework provides an emerging-technology-focused infrastructure for future ethics-approved, inclusive DHM research.

Keywords: Digital Human Modelling · Multimodal Neurophysiological Sensing · Platform-Agnostic Frameworks · Game-Based Interaction · Accessibility and Inclusion.

arXiv:2603.10680v1 [cs.HC] 11 Mar 2026

1 Introduction

Digital Human Modelling (DHM) plays a central role in the design of human–computer systems across domains such as ergonomics, safety, health, and accessibility. Recent advances in wearable sensing and interactive technologies have expanded the range of signals available for modelling human interaction, including neurophysiological, muscular, ocular, and cardiovascular measures. At the same time, there is growing recognition that accessibility and inclusion must be treated as first-class design considerations within DHM, particularly when research aims to support diverse populations and contexts.

Despite these advances, many existing digital modelling and multimodal interaction approaches remain tightly coupled to specific platforms, experimental setups, or task environments. Sensing, interaction, and interpretation are often integrated within bespoke pipelines optimised for a single study or application, limiting reproducibility, portability, and ethical reuse.
This coupling presents challenges for accessibility-oriented research, where interaction tasks and sensing configurations may need to be adapted to accommodate differing motor, sensory, or cognitive needs without re-engineering the entire system.

In parallel, the use of neurophysiological signals in human–computer interaction has raised important ethical considerations. While such signals can provide valuable contextual information about interaction, their interpretation is frequently conflated with inference about internal cognitive or emotional states. For DHM research, particularly in accessibility-sensitive contexts, there is a need for infrastructures that clearly separate data acquisition from interpretation, allowing physiological and interaction data to be treated as descriptive observables rather than diagnostic indicators.

This paper addresses these challenges by presenting a platform-agnostic multimodal DHM framework that decouples neurophysiological sensing, interaction modelling, and inference readiness through a modular abstraction architecture. The framework integrates the OpenBCI Galea headset as a unified sensing layer, providing concurrent neurophysiological and inertial data streams, alongside a reproducible, game-based interaction environment implemented using SuperTux. Interaction is modelled through structured task primitives and timestamped event markers, enabling consistent alignment between sensing and interaction while remaining independent of specific hardware or software platforms.

The contribution of this work is architectural rather than evaluative. Technical verification is limited to the authors’ self-instrumentation to confirm data integrity, stream continuity, and temporal alignment; no human-subjects research is reported, and no behavioural, emotional, or accessibility outcomes are inferred.
By focusing on infrastructure rather than inference, the proposed framework provides a reusable scaffold for future ethics-approved DHM studies, supporting inclusive and accessible research design through platform-independent sensing and interaction modelling.

This paper is organised into the following sections:

- Related Works: reviews prior research in Digital Human Modelling, multimodal physiological sensing, game-based interaction, accessibility, and ethical considerations, positioning the present work within existing DHM and HCI literature while identifying limitations in portability, abstraction, and ethical separation.
- Framework Overview: introduces the design objectives and architectural principles of the proposed platform-agnostic DHM framework, including separation of sensing, interaction modelling, and inference readiness, with emphasis on accessibility-oriented and ethically bounded research design.
- Sensing Integration and Verification: describes the integration of the OpenBCI Galea headset as a multimodal sensing layer, detailing signal abstraction, temporal synchronisation, technical verification via author self-instrumentation, and considerations for scalability and data throughput.
- Interaction Modelling and Applied Implications: presents the game-based interaction environment and interaction primitives, followed by illustrative DHM and HCI use cases and concrete accessibility adaptation examples that demonstrate how the framework may support inclusive research without embedding evaluative or diagnostic assumptions.
- Conclusion: summarises the contribution and limitations of the framework and outlines planned ethics-approved validation steps and future research directions.
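As a rough illustration of the separation of concerns the paper argues for, the three layers can be sketched as independent interfaces. This is a hypothetical sketch, not code from the paper; every name here (`Sample`, `EventMarker`, `SensingSource`, `InteractionEnvironment`) is an assumption introduced for illustration.

```python
from dataclasses import dataclass
from typing import Iterator, Protocol


@dataclass(frozen=True)
class Sample:
    """One timestamped reading from a single sensing modality."""
    modality: str              # e.g. "EEG" or "PPG" (labels are illustrative)
    timestamp: float           # seconds on a clock shared with event markers
    values: tuple[float, ...]  # one value per channel


@dataclass(frozen=True)
class EventMarker:
    """One timestamped interaction event emitted by the task environment."""
    name: str                  # e.g. "level_start" (hypothetical primitive name)
    timestamp: float


class SensingSource(Protocol):
    """Sensing layer: yields raw samples and performs no interpretation."""
    def stream(self) -> Iterator[Sample]: ...


class InteractionEnvironment(Protocol):
    """Interaction layer: yields task event markers, independent of sensing."""
    def events(self) -> Iterator[EventMarker]: ...
```

Because the two interfaces share only timestamps, either side could be swapped (a different headset, a different game) without touching the other, which is the platform agnosticism the framework claims; inference sits outside both interfaces and would be applied, if at all, under separate ethical approval.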
2 Related Works

Human Modelling has a long history within ergonomics, safety, and human-system interaction, where computational representations of human characteristics are used to inform system design rather than to evaluate individual performance [4,5]. Early DHM research established that modelling need not be limited to visual or biomechanical avatars, but can instead operate at the level of interaction structure and task abstraction [6]. Layered DHM architectures separating data acquisition, abstraction, and modelling have subsequently been advocated to support reuse across application domains and experimental contexts [6].

In parallel, research in physiological computing has demonstrated that signals such as Electroencephalography (EEG), Electromyography (EMG), Electro-oculography (EOG), and cardiovascular measures can be incorporated into interactive systems as additional information channels [7]. Importantly, foundational work in this area treats physiological signals as interaction-level observables rather than direct indicators of internal cognitive or emotional state. Multimodal sensing approaches are commonly adopted to improve robustness and contextual coverage in wearable and human-centred systems [2], although much of the literature focuses on downstream classification or inference, raising methodological and ethical considerations.

Recent advances in wearable biosensing have enabled compact platforms that integrate multiple physiological and inertial modalities into a single device. The Galea headset [15], for example, provides concurrent EEG, EMG, EOG, photoplethysmography (PPG), and inertial measurement streams intended for research and interactive applications [3,9]. Existing work using similar sensing technologies typically embeds these signals within task-specific pipelines, limiting portability and reuse across studies.
Games and interactive simulations have also been widely used as structured environments [10] for studying human interaction. Digital games offer deterministic mechanics, repeatable task structures, and well-defined event boundaries, making them suitable as controlled interaction substrates [17]. Prior work has combined gameplay with physiological sensing to model affective or experiential states, often focusing on real-time interpretation or performance evaluation [11,12]. In contrast, more neutral uses of games treat them as task environments that generate structured interaction events without embedding interpretative assumptions, supporting reproducible modelling approaches.

Accessibility and inclusion have increasingly been framed within HCI as systems-level design challenges rather than properties to be assessed post hoc [16]. Inclusive design approaches emphasise flexibility and adaptability at the level of interaction and infrastructure, enabling accommodation of diverse user needs [14,1]. From a DHM perspective, platform-agnostic sensing and interaction pipelines can therefore support inclusive research design by reducing dependence on proprietary tools or rigid experimental protocols.

Finally, the ethical use of physiological data in interactive systems has received growing attention. Concerns regarding over-interpretation, unintended inference, and misuse of biosignals motivate a clear separation between data acquisition and interpretation [13]. Ethical frameworks for human-centred AI similarly emphasise transparency and boundary-setting in sensitive application domains [8]. These considerations motivate DHM frameworks that prioritise abstraction and infrastructure over inference, enabling future ethics-approved studies without premature or unsupported claims.
In comparison to existing DHM and multimodal interaction frameworks, which often integrate sensing, task execution, and interpretation within tightly coupled and application-specific pipelines, the present work focuses explicitly on the infrastructural layer that precedes inference. Rather than proposing new behavioural metrics, adaptive algorithms, or representational models, the contribution lies in separating sensing, interaction modelling, and inference readiness. This distinction enables platform-agnostic deployment and ethical reuse across studies, addressing limitations in portability and reproducibility observed in prior approaches.

3 Framework Overview

This work proposes a platform-agnostic framework for DHM that separates multimodal sensing, interaction modelling, and inference readiness into distinct architectural layers. The objective is to provide reusable research infrastructure that supports ethically bounded, accessibility-oriented DHM studies across diverse application contexts. Rather than introducing new behavioural metrics or interpretative models, the framework focuses on architectural principles that enable reproducible, adaptable, and ethically defensible human–computer interaction research.

3.1 Design Objectives

The framework is guided by four core design objectives. First, platform agnosticism ensures that sensing hardware, interaction environments, and downstream analysis components can be substituted or extended without architectural modification. Second, separation of concerns is enforced by decoupling sensing, interaction modelling, and inference, reducing methodological entanglement and supporting ethical reuse of collected data. Third, accessibility-oriented extensibility is treated as a design constraint, enabling interaction tasks and sensing configurations to be adapted for diverse participant needs without redefining the core pipeline.
Finally, ethical separation of inference ensures that physiological and interaction data are treated as descriptive observables, avoiding premature interpretation or diagnostic claims.

3.2 Architectural Overview

At a high level, the framework comprises a multimodal sensing layer, an abstraction layer responsible for temporal alignment and data structuring, and an interaction modelling layer. Physiological and inertial signals are captured independently of the interaction environment and synchronised using timestamped event markers. Interaction is represented through structured task descriptors rather than performance metrics or behavioural scores. This layered architecture supports reuse across DHM applications while maintaining transparency regarding system scope and limitations (Fig. 1).

Fig. 1. High-level system architecture and deployment of the SuperTux interaction environment and Galea sensing pipeline.

4 Sensing Integration and Verification

The multimodal sensing layer integrates the OpenBCI Galea headset as a unified source of physiological data. Galea provides concurrent EEG, EMG, EOG, PPG, and inertial measurement streams, enabling capture of interaction-adjacent signals within a single wearable platform. The framework treats these signals as parallel data sources, abstracted from any task-specific interpretation. Table 1 shows an overview of the modalities available.

4.1 Signal Abstraction and Synchronisation

All sensing streams are timestamped at acquisition and aligned with interaction events generated by the task environment. Synchronisation is performed at the abstraction layer, allowing physiological data to be temporally associated with interaction primitives without embedding assumptions about behavioural meaning. This design supports consistent alignment across heterogeneous data sources while preserving flexibility in downstream analysis.
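A minimal sketch of the kind of timestamp-based alignment Sect. 4.1 describes, assuming sorted sample and marker timestamps on a shared clock; the function name, window size, and marker times are illustrative assumptions, not the authors' implementation.

```python
import bisect


def align_events(sample_times, event_times, window=0.5):
    """Return (start, end) slice indices into the sample stream for each
    event, covering samples within +/- `window` seconds of that event.
    Both timestamp lists must be sorted and share one clock."""
    spans = []
    for t in event_times:
        start = bisect.bisect_left(sample_times, t - window)   # first sample >= t - window
        end = bisect.bisect_right(sample_times, t + window)    # one past last sample <= t + window
        spans.append((start, end))
    return spans


# Example: a 250 Hz stream (matching the Galea EEG rate) and two
# hypothetical event markers.
fs = 250
sample_times = [i / fs for i in range(1000)]           # 4 s of samples
spans = align_events(sample_times, [1.0, 2.5], window=0.2)
# Each span covers roughly 0.4 s of samples around its marker.
```

Because the alignment operates purely on timestamps, the same routine would serve any modality and sampling rate, which is what allows the abstraction layer to stay free of behavioural assumptions.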
4.2 Technical Validation

Technical verification was conducted exclusively through the authors’ self-instrumentation to confirm system functionality, stream continuity, and temporal alignment. Verification focused on validating end-to-end data capture and synchronisation rather than behavioural analysis. No human-subjects research was performed, and no behavioural, emotional, or accessibility outcomes were analysed.

Table 1. Available Galea Beta headset modalities

| Modality | Location | Sample Rate | Channels | Parameters and Notes |
|---|---|---|---|---|
| EEG | Scalp | 250 Hz | 10 | Dry active electrodes: F1, F2, C3, Cz, C4, P3, Pz, P4, O1, O2 |
| ExG | Forehead | 250 Hz | 0-2 | Passive EEG: Fp1, Fp2 |
| EMG | Facial | 250 Hz | 4-6 | Contains ExG |
| EOG | Facial | 250 Hz | 2 | 4 EMG electrodes |
| PPG | Ear clip | 250 Hz | n/a | Red & IR light, A2 clip placement |
| IMU | Forehead | 250 Hz | 6-axis | Accelerometer with +/- 4 g range; gyroscope with +/- 500 deg/s |
| IMU (MAG) | Forehead | 25 Hz | 3-axis | Magnetometer with +/- 1300 uT |

4.3 Scalability and Performance

The modular separation of sensing and interaction layers supports scalability to larger studies or additional sensors by treating each data stream as an independent, timestamped source. Buffering and decoupling between acquisition, storage, and downstream processing allow increased data throughput without architectural change. While formal latency benchmarking is beyond the scope of the present work, configurable sampling rates and parallel stream handling enable future deployment in larger-scale or longitudinal DHM studies.

5 Interaction Modelling and Applied Implications

5.1 Interaction Modelling Using Game-Based Tasks

Interaction is implemented using the open-source platform game SuperTux, selected for its deterministic mechanics, discrete event structure, and low sensory complexity.
The aim of the game is to reach the end of each level in the shortest amount of time and to gain as many coins as possible, all while avoiding enemy entities that will make the player re-spawn upon contact, in addition to losing some collected coins and power-up abilities. A screenshot of a level in the game can be seen in Fig. 2.

Fig. 2. SuperTux game play

Gameplay actions are abstracted into interaction primitives such as movement sequences, timing events, task progression markers, and error or recovery events. These primitives are independent of both the game engine and sensing hardware, enabling structured modelling of interaction without reliance on game-specific representations.

Interaction descriptors are treated as neutral representations of task engagement rather than indicators of performance quality, cognitive state, or affect. This distinction ensures that interaction modelling remains ethically bounded and compatible with diverse DHM methodologies.

5.2 Illustrative Modelling and HCI Use Cases

Although no human-subjects evaluation is reported, the framework is designed to support a range of DHM and HCI research scenarios. For example, future ethics-approved studies could use the interaction and sensing pipeline to examine adaptive interface timing by analysing how physiological and interaction signals co-occur during repeated task exposure. Similarly, the framework could support comparative studies of interaction strategies under different task constraints or input configurations, without modifying the underlying sensing or synchronisation infrastructure. These use cases are illustrative and do not imply evaluation or effectiveness claims.

5.3 Accessibility and Inclusion Implications

Accessibility and inclusion are addressed as infrastructural design considerations rather than evaluated outcomes.
Interaction tasks can be configured to reduce motor demands by limiting required inputs or adjusting timing constraints, supporting studies involving participants with motor impairments. Sensory load can likewise be modified through visual or auditory simplification, enabling research with participants who experience sensory sensitivities. Such adaptations occur at the interaction layer and do not require changes to the sensing, abstraction, or synchronisation mechanisms, supporting inclusive DHM research design.

6 Conclusion

This work presents a framework-level contribution and reports no human-subjects research. Verification was limited to the authors’ self-instrumentation to confirm technical functionality. No behavioural, emotional, or accessibility outcomes are inferred.

6.1 Future Work

Future work will involve ethics-approved pilot studies to validate the framework in applied DHM contexts. Planned steps include accessibility-focused deployments, comparative task configurations across interaction modalities, and longitudinal studies examining system robustness across repeated sessions. These studies will enable empirical assessment of the framework’s suitability for inclusive DHM research while preserving the ethical separation between sensing, interaction modelling, and inference established in the present work.

The proposed framework provides a reusable, platform-agnostic scaffold for multimodal DHM research that prioritises abstraction, ethical boundary-setting, and accessibility-oriented design. It is intended to support future empirical studies while avoiding premature interpretative claims, aligning with the goals of DHM research within HCII.

Disclosure of Interests. The authors have no competing interests to declare that are relevant to the content of this article.

References

1. Abascal, J., Barbosa, S.D., Nicolle, C., Zaphiris, P.: Rethinking universal accessibility: A broader approach considering the digital gap.
Journal of Accessibility and Design for All 6(2), 179–205 (2016). https://doi.org/10.1007/s10209-015-0416-1
2. Banaee, H., Ahmed, M.U., Loutfi, A.: Data mining for wearable sensors in health monitoring systems: A review of recent trends and challenges. ACM Computing Surveys 45(1), 1–42 (2013). https://doi.org/10.1145/2431211.2431216
3. Bernal, G., Hidalgo, N., Russomanno, C., Maes, P.: Galea: A physiological sensing system for behavioral research in virtual environments. In: 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). p. 66–76 (2022). https://doi.org/10.1109/VR51125.2022.00024
4. Bubb, H., Fritzsche, F.: A scientific perspective of digital human models: Past, present, and future. Ergonomics 52(3), 316–334 (2009). https://doi.org/10.1080/00140130802680755
5. Chaffin, D.B.: Digital human modeling for workplace design. Applied Ergonomics 38(6), 719–730 (2007). https://doi.org/10.1016/j.apergo.2006.10.002
6. Duffy, V.G. (ed.): Digital Human Modeling: Applications in Health, Safety, Ergonomics and Risk Management. Springer (2017). https://doi.org/10.1007/978-3-319-41694-6
7. Fairclough, S.H.: Fundamentals of physiological computing. Interacting with Computers 21(1–2), 133–145 (2009). https://doi.org/10.1016/j.intcom.2008.10.011
8. Floridi, L., et al.: AI4People—An ethical framework for a good AI society. Minds and Machines 28, 689–707 (2018). https://doi.org/10.1007/s11023-018-9482-5
9. Gupta, K., Zhang, Y., Gunasekaran, T.S., Sasikumar, P., Krishna, N., Pai, Y.S., Billinghurst, M.: Vrdography: An empathic VR photography experience. In: 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). p. 1013–1014 (2023). https://doi.org/10.1109/VRW58643.2023.00352
10. Haddick, S., Brown, D.J., Connor, B., Lewis, J., Bates, M., Schofield, S.: Metahumans: Using facial action coding in games to develop social and communication skills for people with autism. In: Antona, M., Stephanidis, C.
(eds.) Universal Access in Human-Computer Interaction. User and Context Diversity. p. 343–355. Springer International Publishing, Cham (2022). https://doi.org/10.1007/978-3-031-93508-4_15
11. Mandryk, R.L., Atkins, M.S.: A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies. International Journal of Human-Computer Studies 65(4), 329–347 (2007). https://doi.org/10.1016/j.ijhcs.2006.11.011
12. Nacke, L.E., Lindley, C.A.: Flow and immersion in first-person shooters. In: Proceedings of the International Conference on Future Play. p. 81–88 (2008). https://doi.org/10.1145/1496984.1496998
13. Nebeker, C., et al.: Ethical and regulatory challenges of physiological sensing. Journal of Law, Medicine & Ethics 47(2), 330–337 (2019). https://doi.org/10.1177/1073110519840493
14. Newell, A.F., Gregor, P.: User sensitive inclusive design. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. p. 434–441 (2000). https://doi.org/10.1145/332040.332481
15. OpenBCI: Galea multimodal biosensing platform: Technical documentation. https://docs.openbci.com (2023), accessed 2026-01
16. Stephanidis, C., et al.: Seven HCI grand challenges. International Journal of Human–Computer Interaction 35(14), 1229–1269 (2019). https://doi.org/10.1080/10447318.2019.1619259
17. Yannakakis, G.N., Togelius, J.: Artificial Intelligence and Games. Springer (2018). https://doi.org/10.1007/978-3-319-63519-4