Learning Agent-Based Modeling Through Virtual Reality-Based Simulations

Authors

  • Burhan Yasakci John Champe High School, Aldie, VA
  • Hamdi Kavak Center for Social Complexity, Department of Computational and Data Sciences, George Mason University, Fairfax, VA
  • William Kennedy Center for Social Complexity, Department of Computational and Data Sciences, George Mason University, Fairfax, VA

Abstract

Agent-Based Modeling (ABM) has a multitude of application areas due to its perspective on modeling systems as collections of interacting, autonomous agents. For instance, pedestrian movement in a metro station can be modeled with a digital representation of the station in which each person is represented as an individual agent. These applications are not limited to theoretical research; they also serve real-world business needs. However, because of the level of knowledge required to understand and create these models, getting into ABM has been difficult for students with no prior coding experience, and the models are usually visualized only in 2D. To address this barrier, we created 3D VR visualizations of three popular agent-based models from different sources. All models were built in the Unity game engine, using Unity's XR toolkit to set up core functionality such as locomotion and user interaction. Alongside this, we wrote many C# scripts to handle the interactions and attributes of the agents. These scripts range in functionality: some support the user experience, while others are necessary for the simulation to run. All in all, these VR-based models provide a visual and interactive representation of the example models, making it easier to grasp how the concept of agency works. Future work will measure the efficacy of VR-based models against typical 2D models with respect to ABM education.
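
As a rough illustration of how such agent scripts can look in Unity, the sketch below shows a minimal C# component for a pedestrian agent that walks toward a goal each frame. The class name, fields, and movement rule are illustrative assumptions for this write-up, not the actual scripts used in the models.

    using UnityEngine;

    // Hypothetical sketch: a minimal pedestrian agent for a station scene.
    public class PedestrianAgent : MonoBehaviour
    {
        public Transform goal;      // destination, e.g., a platform exit
        public float speed = 1.4f;  // walking speed in meters per second

        void Update()
        {
            if (goal == null) return;

            // Step the agent a small distance toward its goal every frame,
            // scaled by frame time so speed is independent of frame rate.
            transform.position = Vector3.MoveTowards(
                transform.position, goal.position, speed * Time.deltaTime);
        }
    }

Attached to an agent prefab, a component like this runs independently for every instance in the scene, which is the sense in which each pedestrian acts as an autonomous agent.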

Published

2024-10-13

Issue

Section

College of Science: Department of Computational and Data Sciences