Overview:

Navigation Experiment Setup

Early prototype of the navigation partial-task training simulation. The main screen displays the out-the-window view; the screen on the right displays the maps normally used for navigation. Participants control a simplified helicopter aerodynamic model with a joystick while cameras track and record eye movement and other user characteristics.
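
For readers curious what a "simplified helicopter aerodynamic model" might look like in code, here is a minimal sketch of joystick-style axes driving a toy state update. The state variables, axis mappings, and gains are illustrative assumptions, not the model used in the prototype.

```python
# Minimal sketch of a simplified helicopter model driven by joystick-style
# controls. State variables, axis mapping, and gains are illustrative
# assumptions; the prototype's actual aerodynamic model is not shown here.
import math
from dataclasses import dataclass

@dataclass
class HeliState:
    x: float = 0.0        # north position (m)
    y: float = 0.0        # east position (m)
    z: float = 100.0      # altitude (m)
    heading: float = 0.0  # radians
    speed: float = 30.0   # forward speed (m/s)

def step(s: HeliState, cyclic_pitch: float, cyclic_roll: float,
         collective: float, pedal: float, dt: float = 0.02) -> HeliState:
    """Advance the state one frame; inputs are joystick axes in [-1, 1]."""
    s.speed = max(0.0, s.speed + 10.0 * cyclic_pitch * dt)  # pitch -> speed
    s.heading += (1.0 * cyclic_roll + 0.5 * pedal) * dt     # roll/pedal -> turn
    s.z += 5.0 * collective * dt                            # collective -> climb
    s.x += s.speed * math.cos(s.heading) * dt
    s.y += s.speed * math.sin(s.heading) * dt
    return s

# Example: fly straight and level for one simulated second at 50 Hz.
state = HeliState()
for _ in range(50):
    step(state, cyclic_pitch=0.0, cyclic_roll=0.0, collective=0.0, pedal=0.0)
print(round(state.x, 1), round(state.y, 1), round(state.z, 1))
```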

When we build, deploy, use, and evaluate computer-based simulations for training, we tend to focus on how the user will interact with the simulation. The intent of this research is to view the simulation from a different perspective: how does the simulation sense and respond to the user? Can we build more reliable training simulations by improving the simulation’s ability to pick up vital cues about the user and respond appropriately?
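
To make the sense-and-respond idea concrete, here is a minimal sketch of a per-frame check in which the simulation inspects user-state cues and picks an intervention. The cue names and thresholds are hypothetical placeholders for illustration, not measures from this research.

```python
# Illustrative sense-respond step: the simulation reads user-state cues each
# frame and decides whether to intervene. All signals and thresholds here
# are hypothetical placeholders, not values from the actual study.
from dataclasses import dataclass

@dataclass
class UserCues:
    gaze_on_map_ratio: float  # fraction of recent gaze samples on the map display
    control_reversals: int    # joystick direction reversals in the last window

def respond(cues: UserCues) -> str:
    """Pick a simulation response from sensed user cues."""
    if cues.gaze_on_map_ratio < 0.05:
        return "prompt: cross-check the navigation map"
    if cues.control_reversals > 8:
        return "simplify: reduce wind turbulence"
    return "no intervention"

print(respond(UserCues(gaze_on_map_ratio=0.02, control_reversals=3)))
```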

For some tasks that are particularly difficult to learn and master, it may be helpful to simplify the task, either by presenting an easier version or by letting the user practice selected parts of the whole task. How do we get a better understanding of the user’s advancing skill so we can provide feedback or increase difficulty at the right times? What user-based cues should a simulation consider? How do we build these sensing capabilities into a simulation and make sure they are effective?
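
As one way of picturing "increasing difficulty at the right times," a simulation could maintain a running skill estimate and step the task level when that estimate crosses a threshold. The exponentially weighted score and thresholds below are assumptions made for the sake of the sketch.

```python
# Hypothetical difficulty scheduler: blend each trial's score (0..1) into a
# running skill estimate and step the task level when the estimate crosses
# (assumed) thresholds.
def update_skill(skill: float, trial_score: float, alpha: float = 0.3) -> float:
    """Exponentially weighted blend of the latest trial score into the estimate."""
    return (1 - alpha) * skill + alpha * trial_score

def next_difficulty(level: int, skill: float,
                    up: float = 0.8, down: float = 0.4) -> int:
    """Raise difficulty when skill is high, lower it when skill drops."""
    if skill > up:
        return level + 1
    if skill < down:
        return max(1, level - 1)
    return level

skill, level = 0.5, 1
for score in [0.7, 0.9, 0.95, 0.9]:   # simulated trial outcomes
    skill = update_skill(skill, score, alpha=0.5)
    level = next_difficulty(level, skill)
print(f"skill={skill:.2f}, level={level}")
```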

Our current efforts extend ongoing research in wayfinding and navigation in virtual environments; specifically, we are studying eye movement during helicopter overland navigation. Please read on to find out more…
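
A common first step when working with gaze recordings is grouping raw samples into fixations. The sketch below implements a dispersion-threshold (I-DT-style) detector; the dispersion and duration thresholds are assumed values, and this is not necessarily the pipeline used in our study.

```python
# Sketch of dispersion-threshold (I-DT-style) fixation detection over raw
# gaze samples. Dispersion and minimum-duration thresholds are assumed
# values for illustration.
def _dispersion(window):
    """Sum of the x-range and y-range of a window of (x, y) points."""
    xs, ys = zip(*window)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_samples=6):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns (start_index, end_index, centroid) per detected fixation."""
    fixations, start = [], 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            xs, ys = zip(*samples[start:end])
            fixations.append((start, end - 1,
                              (sum(xs) / len(xs), sum(ys) / len(ys))))
            start = end
        else:
            start += 1
    return fixations

# Example: six tightly clustered samples followed by a large saccade.
gaze = [(100, 100), (101, 99), (100, 101), (99, 100), (100, 100), (101, 101),
        (300, 250)]
print(detect_fixations(gaze))   # one fixation covering samples 0..5
```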
