NMSO Navigation Studies

Progress report: Training and Simulation for Complex Cognitive Tasks (R4U4C)

PIs: Ji Hyun Yang, PhD, and Quinn Kennedy, PhD
Team members: Michael Day, Jesse Huston, LT Brad Cowden, MAJ Shane Grass,
LCDR Chris Kirby, LT Chris Neboshynsky

We have 4 thesis students who are working either directly on this study or on theses that take this training simulation to the next level:

  • LT Brad Cowden, USN: Modeling of helicopter pilot misperception during overland navigation
  • MAJ Shane Grass, USA: Utilizing eye tracking to aid student pilot’s situational awareness
  • LCDR Chris Kirby, USN: An Analysis of Helicopter Pilot Visual Scan Techniques When Conducting Low Level Flight
  • LT Chris Neboshynsky, USN: Expertise on Cognitive Workloads and Performance during Navigation and Target Detection

Specific milestones are listed below.

  • Data collection completed for three human-in-the-loop experiments:
    • Modeling of helicopter pilot misperception during overland navigation
      • LT Brad Cowden designed the study and ran the experiment at NPS.
      • 15 subjects flew four routes (two manual and two auto routes). During navigation, subjects were asked to pinpoint their location on the map and rate their confidence in their response; a sketch of how such perception data could be quantified follows Figure 1.

Figure 1. Example data from two pilots: actual trajectory (yellow line), the pilot's perceived locations (red stars), the pilot's confidence levels between 0 and 1 (numbers in rectangles), and actual locations (blue dots). Left figure: the pilot's perceptions differ from the actual locations, but confidence levels are high. Right figure: the pilot's perceptions and actual locations agree, but confidence levels tend to be lower than in the left figure.
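
As a concrete illustration of how perception error and confidence ratings could be compared, the sketch below (in Python) computes the distance between each perceived and actual position alongside the confidence rating. The coordinates, record layout, and values are hypothetical assumptions for illustration; this is not the study's data or analysis code.

    import math

    # Hypothetical pilot reports: (perceived_x, perceived_y, actual_x, actual_y, confidence).
    # Positions are in arbitrary map units; confidence is the pilot's 0-1 rating.
    reports = [
        (1200.0, 3400.0, 1350.0, 3550.0, 0.9),
        (2100.0, 4100.0, 2120.0, 4080.0, 0.6),
        (2950.0, 4700.0, 3300.0, 4500.0, 0.8),
    ]

    def perception_error(px, py, ax, ay):
        """Euclidean distance between perceived and actual position."""
        return math.hypot(px - ax, py - ay)

    for px, py, ax, ay, conf in reports:
        err = perception_error(px, py, ax, ay)
        # A large error paired with a high confidence rating would flag the
        # overconfidence pattern seen in the left panel of Figure 1.
        print(f"error = {err:7.1f} map units, confidence = {conf:.1f}")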

  • An Analysis of Helicopter Pilot Visual Scan Techniques When Conducting Low Level Flight
    • LT Chris Kirby and Mr. Jesse Huston ran the experiment at Naval Air Station (NAS) North Island.
    • 21 pilots assigned to squadrons at NAS North Island participated in the study.
    • Pilots flew a low-level overland route at high speeds in a Tactical Operational Flight Trainer (TOFT) while faceLAB tracked their head and eye movements; a sketch of how such gaze data could be summarized follows Figure 2.

Figure 2. Example debrief video, faceLAB calibration screen, and MH-60S TOFT.
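
For context on how scan-technique data of this kind can be summarized, the sketch below computes the fraction of gaze samples falling in each area of interest. The sample format and area-of-interest labels are assumptions made for illustration; they are not faceLAB's output format or the study's analysis code.

    from collections import defaultdict

    # Hypothetical gaze samples: (timestamp in seconds, area-of-interest label).
    # Real head/eye-tracking output would first have to be mapped onto such labels.
    gaze_samples = [
        (0.00, "out_the_window"),
        (0.25, "out_the_window"),
        (0.50, "map"),
        (0.75, "instruments"),
        (1.00, "out_the_window"),
    ]

    def dwell_fractions(samples):
        """Fraction of samples falling in each area of interest."""
        counts = defaultdict(int)
        for _, aoi in samples:
            counts[aoi] += 1
        total = len(samples)
        return {aoi: n / total for aoi, n in counts.items()}

    print(dwell_fractions(gaze_samples))
    # e.g. {'out_the_window': 0.6, 'map': 0.2, 'instruments': 0.2}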

  • Training Expert Navigation and Target Detection
    • Mr. Jesse Huston ran the experiment at NPS.
    • 15 subjects flew in a simulated environment in three scenarios: one focusing on navigation, one on target detection, and one on simultaneous navigation and target detection. faceLAB tracked their head and eye movements, and their button responses for target detection were recorded; a sketch of how such responses could be summarized follows Figure 3.
    • LT Chris Neboshynsky will analyze the data focusing on expertise and cognitive workloads.
    • We have made numerous technological improvements (mainly done by Mr. Michael Day) to the study to enhance the training potential of this simulation.  These improvements include:
      • Going from a 65-degree field-of-view out-the-window display to a 180-degree field-of-view out-the-window display.
      • Going from an auto-rotation and fixed-translation map display to a touch-screen display that allows the pilot to control the map's orientation and translation.
      • Going from a short joystick that required constant pressure to maintain attitude to a full-size cyclic control stick.
      • Moving the pilot seat into a helicopter cabin.
      • Improving the resolution of the terrain.

Figure 3. Left: Flight and Eye Scan Visualization Tool; Right: touch-screen display.
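
As an illustration of how recorded button responses for target detection could be summarized, the sketch below computes a hit rate and mean reaction time from hypothetical event times. The timestamps and the five-second response window are assumptions for illustration, not values or code from the study.

    # Hypothetical event logs: target appearance times and button-press times (seconds).
    target_times = [12.0, 47.5, 83.0, 121.0]
    button_presses = [13.2, 49.0, 125.5]

    RESPONSE_WINDOW = 5.0  # a press within this many seconds counts as a detection

    hits, reaction_times = 0, []
    for t in target_times:
        # First press falling inside the response window for this target, if any.
        responses = [p for p in button_presses if t <= p <= t + RESPONSE_WINDOW]
        if responses:
            hits += 1
            reaction_times.append(responses[0] - t)

    hit_rate = hits / len(target_times)
    mean_rt = sum(reaction_times) / len(reaction_times) if reaction_times else float("nan")
    print(f"hit rate = {hit_rate:.2f}, mean reaction time = {mean_rt:.2f} s")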

  • Other milestones completed:
    • Literature review completed.
    • Four IRB packages have been submitted and approved.
    • MOVES Research Summit presentation
      • We presented our current work at the MOVES 2011 RES NMSO session, at which the NMSO representative was briefed on our progress.
