Utilizing hand tracking technologies in serious games to teach American Sign Language
Abstract
Serious games gamify processes that are typically considered wearisome, such as education or training; that is, they apply elements of games to boost the user's interest and participation. Recent advancements in virtual reality have made hand tracking technologies widely accessible, rapidly opening new avenues in both serious and non-serious games for the general public. American Sign Language (ASL) is the most common form of sign language in the U.S., used there by approximately half a million people [1], but it can be difficult to learn through online resources such as mobile apps, which cannot track the user's hands. With the number of people relying on ASL only growing, an accessible way to gain a basic comprehension of the language is needed. This project uses the Unity game engine and OpenXR packages to track the user's hands on compatible hardware such as the Meta Quest 2, 3, or Pro. The VR game prototype recognizes when the user makes certain ASL signs, and the surrounding environment then reacts accordingly. Using this mechanic, the user can solve puzzles by figuring out what different signs mean and what effects they have. The player learns new vocabulary by imitating nearby signing characters, and the end goal is for the association between each motion and its resulting in-game action to strengthen the player's memory and help the player internalize the meanings of ASL signs. No studies have yet been performed to test the application's effectiveness.
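The abstract does not describe the recognition method in detail, but one common way to decide whether tracked hand joints form a given sign is to compare the current pose against a recorded template. The following is a minimal, engine-agnostic sketch of that idea in Python; the joint count, the `normalize` helper, and the distance threshold are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

# Assumption: the hand-tracking runtime reports a fixed-order array of
# 3D joint positions (wrist, knuckles, fingertips, ...). 26 joints is
# one common hand-joint count in OpenXR-style hand tracking.
NUM_JOINTS = 26


def normalize(joints: np.ndarray) -> np.ndarray:
    """Make a pose translation- and scale-invariant by centering it on
    the first joint (the wrist) and dividing by the hand's overall size."""
    centered = joints - joints[0]
    scale = np.linalg.norm(centered, axis=1).max()
    return centered / (scale if scale > 0 else 1.0)


def matches_sign(joints: np.ndarray, template: np.ndarray,
                 threshold: float = 0.15) -> bool:
    """Return True when the mean per-joint distance between the tracked
    pose and a recorded template pose falls below the threshold."""
    a, b = normalize(joints), normalize(template)
    return float(np.mean(np.linalg.norm(a - b, axis=1))) < threshold
```

In a game loop, `matches_sign` would be polled each frame against the templates of the signs relevant to the current puzzle, triggering the environment's reaction when one matches.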
License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.