Abstract: Studies of the blind and blindfolded sighted find
that subjects can use non-visual (e.g., vestibular and
proprioceptive) information to find their way "home" after walking
two legs of a triangle. While roughly accurate, there are
systematic errors in responses. We examined the roles of both
visual and non-visual information in this task. The examination of
visual information included whether the location of visual flow
(e.g., texture on the walls, floor, both, or neither) affected
performance differentially. In the first experiment,
subjects used a joystick to guide themselves in sparse virtual
environments while wearing a head-mounted display (HMD).
Performance differences were observed depending on the location of
texture in the environment: reduced flow information for
translation increased response variability, reduced flow
information for rotation decreased turning accuracy, and the
absence of texture produced the poorest performance. However, responses were
similar regardless of triangle shape. In the second experiment, the
visual manipulations were the same, but participants physically
walked in a large virtual space while wearing the HMD. In this
experiment, accuracy improved and variability decreased overall
relative to Experiment 1, but some display differences and
systematic biases were still observed. These results indicate that
optic flow information supports fairly accurate path integration, but
the addition of non-visual information may allow for improved
performance in visually impoverished environments.