Development of a Virtual Reality Application for Oculomotor Examination Education Based on Student-Centered Pedagogy
- URL: http://arxiv.org/abs/2405.16392v1
- Date: Sun, 26 May 2024 00:53:19 GMT
- Title: Development of a Virtual Reality Application for Oculomotor Examination Education Based on Student-Centered Pedagogy
- Authors: Austin Finlayson, Rui Wu, Chia-Cheng Lin, Brian Sylcott
- Abstract summary: This work-in-progress paper discusses the use of student-centered pedagogy to teach clinical oculomotor examination via Virtual Reality (VR).
Traditional methods, such as PowerPoint slides and lab activities, are often insufficient for providing hands-on experience due to the high cost of clinical equipment.
A VR-based application was developed using Unity and the HTC Vive Pro headset, offering a cost-effective solution for practical learning.
- Score: 3.876880241607719
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work-in-progress paper discusses the use of student-centered pedagogy to teach clinical oculomotor examination via Virtual Reality (VR). Traditional methods, such as PowerPoint slides and lab activities, are often insufficient for providing hands-on experience due to the high cost of clinical equipment. To address this, a VR-based application was developed using Unity and the HTC Vive Pro headset, offering a cost-effective solution for practical learning. The VR app allows students to engage in oculomotor examinations at their own pace, accommodating diverse backgrounds and learning preferences. This application enables students to collect and analyze data, providing a realistic simulation of clinical practice. The user study results from Doctor of Physical Therapy students indicate a high preference for the flexibility offered by the VR app, suggesting its potential as a valuable educational tool. Additionally, the paper explores the broader implications of using VR in engineering and computing education, highlighting the benefits of immersive, interactive learning environments.
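The abstract notes that students collect and analyze eye-movement data inside the app. As a purely illustrative sketch of the kind of analysis such an oculomotor exam might run, the snippet below flags saccades with a standard velocity-threshold rule; the sample format, threshold value, and function name are assumptions, not details from the paper.

```python
def detect_saccades(timestamps, gaze_angles_deg, velocity_threshold=30.0):
    """Flag samples where angular eye velocity exceeds a threshold (deg/s),
    a common rule of thumb for saccade detection in eye-tracking data."""
    saccade_flags = [False]  # first sample has no preceding velocity
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        velocity = abs(gaze_angles_deg[i] - gaze_angles_deg[i - 1]) / dt
        saccade_flags.append(velocity > velocity_threshold)
    return saccade_flags
```

A headset's eye tracker would supply the timestamps and gaze angles; the flagged samples can then be grouped into saccade events for the student to review.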
Related papers
- Thelxinoë: Recognizing Human Emotions Using Pupillometry and Machine Learning [0.0]
This research contributes significantly to the Thelxinoë framework, aiming to enhance VR experiences by integrating data from multiple sensors for realistic and emotionally resonant touch interactions.
Our findings open new avenues for developing more immersive and interactive VR environments, paving the way for future advancements in virtual touch technology.
arXiv Detail & Related papers (2024-03-27T21:14:17Z)
- VisionaryVR: An Optical Simulation Tool for Evaluating and Optimizing Vision Correction Solutions in Virtual Reality [0.5492530316344587]
The tool incorporates an experiment controller, a generic eye-tracking controller, a defocus simulator, and a generic VR questionnaire loader.
It enables vision scientists to increase their research tools with a robust, realistic, and fast research environment.
arXiv Detail & Related papers (2023-12-01T16:18:55Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Towards Modeling Software Quality of Virtual Reality Applications from Users' Perspectives [44.46088489942242]
We conduct the first large-scale empirical study to model the software quality of VR applications from users' perspectives.
We analyze 1,132,056 user reviews of 14,150 VR applications across seven app stores using a semi-automatic review-mining approach.
Our analysis reveals that VR-specific quality attributes are of utmost importance to users and are closely tied to the most distinctive properties of VR applications.
arXiv Detail & Related papers (2023-08-13T14:42:47Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings to push forward research towards more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Learning Effect of Lay People in Gesture-Based Locomotion in Virtual Reality [81.5101473684021]
Some of the most promising methods are gesture-based and do not require additional handheld hardware.
Recent work focused mostly on user preference and performance of the different locomotion techniques.
This work investigates whether and how quickly users can adapt to a hand-gesture-based locomotion system in VR.
arXiv Detail & Related papers (2022-06-16T10:44:16Z)
- A Systematic Review on Interactive Virtual Reality Laboratory [1.3999481573773072]
This study aims to comprehend the work done in quality education from a distance using VR.
Adopting virtual reality in education can help students learn more effectively.
This highlights the importance of a significant expansion of VR use in learning.
arXiv Detail & Related papers (2022-03-26T07:16:01Z)
- Wireless Edge-Empowered Metaverse: A Learning-Based Incentive Mechanism for Virtual Reality [102.4151387131726]
We propose a learning-based Incentive Mechanism framework for VR services in the Metaverse.
First, we propose the quality of perception as the metric for VR users in the virtual world.
Second, for quick trading of VR services between VR users (i.e., buyers) and VR SPs (i.e., sellers), we design a double Dutch auction mechanism.
Third, for auction communication reduction, we design a deep reinforcement learning-based auctioneer to accelerate this auction process.
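The abstract above names a double Dutch auction as the trading mechanism between VR users and service providers. As a toy sketch of that mechanism's clock mechanics only (the paper's actual design, metrics, and DRL auctioneer are not reproduced here), two price clocks move toward each other, admitting buyers and sellers until the clocks cross; all names and values below are illustrative assumptions.

```python
def double_dutch_auction(buyer_values, seller_costs, tick=1.0):
    """Toy double Dutch auction: a descending clock admits buyers willing
    to pay at least the clock price, while an ascending clock admits
    sellers willing to sell at or below its price. Trading stops when the
    clocks cross; matched pairs trade at the crossing price."""
    down = max(buyer_values)   # descending buyer clock starts high
    up = min(seller_costs)     # ascending seller clock starts low
    buyers, sellers = [], []
    remaining_buyers = sorted(buyer_values, reverse=True)
    remaining_sellers = sorted(seller_costs)
    while down >= up:
        # admit buyers whose valuation meets the descending clock price
        while remaining_buyers and remaining_buyers[0] >= down:
            buyers.append(remaining_buyers.pop(0))
        # admit sellers whose cost is covered by the ascending clock price
        while remaining_sellers and remaining_sellers[0] <= up:
            sellers.append(remaining_sellers.pop(0))
        down -= tick
        up += tick
    price = (down + up) / 2.0          # price where the clocks crossed
    trades = min(len(buyers), len(sellers))
    return trades, price
```

In the paper's setting, a deep reinforcement learning agent plays the auctioneer, choosing how the clocks advance to shorten this loop; the fixed `tick` here stands in for that learned policy.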
arXiv Detail & Related papers (2021-11-07T13:02:52Z)
- SurRoL: An Open-source Reinforcement Learning Centered and dVRK Compatible Platform for Surgical Robot Learning [78.76052604441519]
SurRoL is an RL-centered simulation platform for surgical robot learning that is compatible with the da Vinci Research Kit (dVRK).
Ten learning-based surgical tasks, common in real autonomous surgical execution, are built into the platform.
We evaluate SurRoL with RL algorithms in simulation, provide in-depth analysis, deploy the trained policies on the real dVRK, and show that SurRoL achieves better transferability to the real world.
arXiv Detail & Related papers (2021-08-30T07:43:47Z)
- Virtual Reality based Digital Twin System for remote laboratories and online practical learning [0.08431877864777444]
There is a need for remote learning and virtual learning applications such as virtual reality (VR) and tablet-based solutions.
A case study describing the creation of a virtual learning application for an electrical laboratory tutorial is presented.
arXiv Detail & Related papers (2021-06-17T09:38:24Z)
- Guidelines for the Development of Immersive Virtual Reality Software for Cognitive Neuroscience and Neuropsychology: The Development of Virtual Reality Everyday Assessment Lab (VR-EAL) [0.0]
This study offers guidelines for the development of VR software in cognitive neuroscience and neuropsychology.
Twenty-five participants aged between 20 and 45 years with 12-16 years of full-time education evaluated various versions of VR-EAL.
The final version of VR-EAL achieved high scores in every sub-score of the VRNQ and exceeded its parsimonious cut-offs.
arXiv Detail & Related papers (2021-01-20T14:55:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.