Visual Search in VR

Visual search involves a wide spectrum of cognitive abilities, ranging from basic perceptual and attentional processing to object recognition, long- and short-term memory, navigation, planning, problem-solving, and decision-making. Much research in this area has centered on the relationship between bottom-up, sensory-driven and top-down, goal-directed guidance cues in the allocation of attention. Although existing work has extensively studied psychophysical variables and the plausibility of neuro-computational models, no research group has yet documented how humans solve challenging search problems in complex, “real-world” environments.

This question, however, has become increasingly urgent given the growing feasibility of smart technologies that can reduce cognitive load and increase human operator efficiency across the wide range of contexts that require visual search. Here, we aim to model ecologically valid visual search processes that involve navigation of a 3D virtual environment, uncertainty, and psychological stress.

Publication: ONGOING

This project is supported by the National Science Foundation and the Army Research Laboratory.

3D Visual Search Demo

Ying Wu  February 1, 2021

For a recent ARL Capstone event, we created a video demo of our multi-modal visual search paradigm.  Watch it here.

Adding the Last Variables

Christian Lay-Geng  July 23, 2020

By this point, my VR partner, Weichen, had completed his work streaming raw data from the EEG headset channels. He then developed a 3D gizmo using VIVE trackers to track the player’s limbs (photo above). Initially we tracked only the player’s eye movements and head gaze, but now we can also capture decision-making through their body movements.
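As a rough illustration of what synchronized recording of these signals can look like, here is a minimal Python sketch that assumes the EEG channels and the tracker poses are both published as Lab Streaming Layer (LSL) streams readable with pylsl. That transport, and the stream type names "EEG" and "Mocap", are assumptions for the example; the post does not describe the actual pipeline.

```python
# Minimal sketch, assuming (not stated in the post) that the EEG headset and
# the VIVE tracker poses are exposed as Lab Streaming Layer (LSL) streams.
# The stream type names "EEG" and "Mocap" are hypothetical placeholders.
from pylsl import StreamInlet, resolve_byprop


def open_inlet(stream_type, timeout=5.0):
    """Find the first LSL stream of the given type and open an inlet to it."""
    streams = resolve_byprop("type", stream_type, timeout=timeout)
    if not streams:
        raise RuntimeError(f"No LSL stream of type {stream_type!r} found")
    return StreamInlet(streams[0])


def record(n_samples=500):
    """Pull time-stamped EEG samples and the most recent tracker pose."""
    eeg = open_inlet("EEG")       # raw EEG channels
    motion = open_inlet("Mocap")  # hypothetical tracker-pose stream
    samples = []
    for _ in range(n_samples):
        eeg_sample, eeg_ts = eeg.pull_sample()                   # blocking
        pose_sample, _ = motion.pull_sample(timeout=0.0)         # non-blocking
        samples.append({"t": eeg_ts, "eeg": eeg_sample, "pose": pose_sample})
    return samples


if __name__ == "__main__":
    data = record(100)
    print(f"Collected {len(data)} time-stamped samples")
```

LSL timestamps give both modalities a common clock, which is what makes it possible to relate body movements and EEG activity to events in the VR task afterwards.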

Initial Piloting

Christian Lay-Geng  June 22, 2020

After coding the basic structure of the game, I felt that the distractor bullets were too different from the targets in shape and tip color, making it easy to spot the targets from a good distance away. I opted to remove the green-tipped bullet and replace it with an uncolored bullet shaped like the target bullet.
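To make the reasoning behind this change concrete, here is a toy Python sketch, not the project’s actual code, in which the bullet set is described as data so that target-distractor similarity can be tuned. All names and attributes are hypothetical.

```python
# Toy sketch (hypothetical names, not the project's code): describing the
# bullet set as data makes the piloting change easy to express and compare.
from dataclasses import dataclass


@dataclass(frozen=True)
class Bullet:
    name: str
    shape: str      # silhouette used by the 3D model
    tip_color: str  # "none" means an unpainted tip


TARGET = Bullet("target", shape="target_profile", tip_color="none")

# Before piloting: a green-tipped distractor with its own shape stood out.
OLD_DISTRACTOR = Bullet("green_tip", shape="round_nose", tip_color="green")

# After piloting: the distractor borrows the target's shape and loses the
# colored tip, so it can only be rejected on closer inspection.
NEW_DISTRACTOR = Bullet("look_alike", shape="target_profile", tip_color="none")


def feature_overlap(a: Bullet, b: Bullet) -> int:
    """Count how many surface features a distractor shares with the target."""
    return int(a.shape == b.shape) + int(a.tip_color == b.tip_color)


print(feature_overlap(TARGET, OLD_DISTRACTOR))  # 0 -> easy pop-out search
print(feature_overlap(TARGET, NEW_DISTRACTOR))  # 2 -> harder, closer search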

Designing the Bullet Room

Christian Lay-Geng  March 13, 2020

The goal of the Visual Search Project was to create a game that tests a player’s mindset when differentiating between targets and non-targets. We wanted to include stress as a variable, which may motivate players to try new search strategies. I was tasked with developing the 3D room in VR, while the other members used their tools to measure the player’s movements in real time.