Abstract
Virtual Reality (VR) is gaining traction in cognitive and decision-making research because of its ability to generate immersive, controlled environments that closely replicate real-world situations. Its integration with neurophysiological tools such as electroencephalography (EEG) and eye-tracking offers a unique opportunity to gain deep insights into consumer behaviour by combining behavioural and neural measures in real time. However, the simultaneous use of VR and neurophysiological measures remains challenging owing to issues of data stream alignment, event timestamping, hardware compatibility, and potential signal interference induced by head-mounted equipment. To date, the absence of standardised protocols has limited the scalability and reproducibility of multimodal VR research, thereby hindering its widespread adoption. This paper presents detailed, step-by-step guidelines for harmonising EEG, eye-tracking, and VR data streams using the Lab Streaming Layer (LSL) in a Unity-based VR environment. A Varjo headset with built-in eye-tracking and a Neuroelectrics Enobio EEG system serve as a working case to illustrate a practical implementation of the guidelines presented. By setting out clear recommendations for hardware configuration, event timestamping, and software implementation, the paper demonstrates how open-source tools can enable high-precision data synchronisation in immersive research settings. The protocol is flexible and transferable to similar setups; it therefore supports cross-study comparability and encourages wider uptake of multimodal VR methodologies, while acknowledging methodological constraints.
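To make the synchronisation idea concrete, the sketch below shows the kind of Unity-side event marker stream that such a protocol relies on: a script opens an LSL outlet and pushes string markers stamped with the shared LSL clock, so that a recorder (e.g. LabRecorder) can align them with the EEG and eye-tracking streams. This is an illustrative sketch rather than the authors' implementation; the class name, stream name, and source identifier are assumptions, and the calls follow the liblsl C# binding commonly bundled with LSL4Unity, whose exact naming may differ by version.

```csharp
// Minimal sketch (assumed names): a Unity MonoBehaviour that exposes an LSL
// marker stream for VR task events such as stimulus onsets or trial starts.
using UnityEngine;
using LSL;

public class VRMarkerOutlet : MonoBehaviour
{
    private liblsl.StreamOutlet outlet;

    void Start()
    {
        // One string channel at an irregular rate: a typical configuration
        // for event markers rather than continuously sampled signals.
        var info = new liblsl.StreamInfo(
            "UnityMarkers",                     // stream name (illustrative)
            "Markers",                          // stream type
            1,                                  // channel count
            liblsl.IRREGULAR_RATE,              // markers have no fixed rate
            liblsl.channel_format_t.cf_string,
            "unity-vr-session-01");             // unique source id (illustrative)
        outlet = new liblsl.StreamOutlet(info);
    }

    // Called from task logic, e.g. when a product becomes visible in the scene.
    public void SendMarker(string label)
    {
        // Stamp the sample with the LSL clock at the moment the event occurs,
        // so the recorder can align it with EEG and eye-tracking samples that
        // are timestamped on the same clock.
        outlet.push_sample(new[] { label }, liblsl.local_clock());
    }
}
```

In a setup of this kind, the EEG amplifier and eye-tracker each publish their own LSL streams, and the recorder resolves clock offsets across streams, so the Unity script only needs to emit accurately timestamped markers rather than perform any alignment itself.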