William H. Warren

Chancellor's Professor
Bill_Warren@brown.edu
(401) 863-3980
Office Location: Metcalf 257
Research Focus: Perception and action, visual control of locomotion, spatial navigation

Bill Warren’s research focuses on the visual control of action – in particular, human locomotion and navigation. He seeks to explain how this behavior is adaptively regulated by multi-sensory information, within a dynamical systems framework. Using virtual reality techniques, his research team investigates problems such as the visual control of steering, obstacle avoidance, wayfinding, pedestrian interactions, and the collective behavior of crowds. Experiments in the Virtual Environment Navigation Lab (VENLab) enable his group to manipulate what participants see as they walk through a virtual landscape, and to measure and model their behavior. The aim of this research is to understand how adaptive behavior emerges from the dynamic interaction between an organism and its environment. He believes the answers will not be found only in the brain, but will strongly depend on the physical and informational regularities that the brain exploits. This work contributes to basic knowledge that is needed to understand visual-motor disorders in humans, and to develop mobile robots that can operate in novel environments.

William Warren is Chancellor’s Professor of Cognitive, Linguistic, and Psychological Sciences. He earned his undergraduate degree at Hampshire College (1976) and his Ph.D. in Experimental Psychology from the University of Connecticut (1982), did post-doctoral work at the University of Edinburgh, and has been a professor at Brown ever since. He served as Chair of the Department of Cognitive and Linguistic Sciences from 2002 to 2010. Warren is the recipient of a Fulbright Research Fellowship, an NIH Research Career Development Award, and Brown's Elizabeth Leduc Teaching Award for Excellence in the Life Sciences.

Research Interests:

I take an ecological approach to problems of perception and action, which aims to determine how much of the organization in behavior can be explained "for free" on the basis of sensory information, physical constraints, and the dynamics of behavior in natural environments. My current research investigates the visual control of locomotion and navigation, combining experiments in virtual reality with dynamical systems modeling and agent-based simulation.

One project studies the visual guidance of locomotor behaviors such as steering, obstacle avoidance, target interception, and route selection. We develop dynamical models of each behavior, which can be combined to predict paths through complex environments. At the same time, we map out the multi-sensory information used to guide such behavior, including optic flow and vestibular and proprioceptive signals. Currently we are studying pedestrian interactions such as following and walking side by side, in order to model the self-organization of human crowds, and we are beginning to work on the action-selection problem. This project is building a “pedestrian model” with applications to mobile robotics, computer animation, architectural design, and urban planning. We are also applying these techniques to the assessment and rehabilitation of functional mobility in patients with low vision or lower-limb injuries.
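
To give a concrete flavor of this style of dynamical model, the sketch below implements a simple goal-attraction / obstacle-repulsion heading model in Python. The functional form and all parameter values are illustrative placeholders for the example, not the lab’s published model: the agent’s heading is attracted toward the goal direction, repelled from obstacle directions, and integrated together with forward motion at constant speed.

```python
import numpy as np

# Minimal sketch of a goal-attraction / obstacle-repulsion heading model in the
# spirit of behavioral-dynamics steering models. The functional form and the
# parameter values below are illustrative placeholders, not the published model.

def wrap(angle):
    """Wrap an angle to (-pi, pi]."""
    return (angle + np.pi) % (2 * np.pi) - np.pi

def heading_accel(phi, dphi, pos, goal, obstacles,
                  b=3.25, k_g=7.5, c1=0.4, c2=0.4,
                  k_o=198.0, c3=6.5, c4=0.8):
    """Angular acceleration of heading phi for an agent at position pos."""
    d_g = np.linalg.norm(goal - pos)
    psi_g = np.arctan2(goal[1] - pos[1], goal[0] - pos[0])
    # Damping plus attraction toward the goal direction (stronger as the goal nears).
    acc = -b * dphi - k_g * wrap(phi - psi_g) * (np.exp(-c1 * d_g) + c2)
    for obs in obstacles:
        d_o = np.linalg.norm(obs - pos)
        psi_o = np.arctan2(obs[1] - pos[1], obs[0] - pos[0])
        # Repulsion from each obstacle, decaying with angular and metric distance.
        ang = wrap(phi - psi_o)
        acc += k_o * ang * np.exp(-c3 * abs(ang)) * np.exp(-c4 * d_o)
    return acc

def simulate(start, goal, obstacles, speed=1.2, dt=0.01, steps=2000):
    """Euler-integrate the heading dynamics while the agent walks at constant speed."""
    pos = np.array(start, dtype=float)
    goal = np.array(goal, dtype=float)
    obstacles = [np.array(o, dtype=float) for o in obstacles]
    phi = np.arctan2(goal[1] - pos[1], goal[0] - pos[0])
    dphi = 0.0
    path = [pos.copy()]
    for _ in range(steps):
        dphi += heading_accel(phi, dphi, pos, goal, obstacles) * dt
        phi += dphi * dt
        pos = pos + speed * dt * np.array([np.cos(phi), np.sin(phi)])
        path.append(pos.copy())
        if np.linalg.norm(goal - pos) < 0.2:
            break
    return np.array(path)

# Walk toward a goal 8 m ahead while avoiding an obstacle slightly off the straight line.
path = simulate(start=(0.0, 0.0), goal=(8.0, 0.0), obstacles=[(4.0, 0.3)])
print(f"{len(path)} steps, final position {path[-1]}")
```

On this approach, additional behaviors such as interception or following would add further attraction and repulsion terms of the same kind, which is what allows the component models to be combined to predict routes through complex environments.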

A second project extends these principles to longer-range navigation. We are investigating the information used in path integration, how path integration and visual landmarks are combined in navigation, the processes underlying spatial learning, the geometry of spatial knowledge, and how that knowledge is used to guide wayfinding. For instance, we have created two non-Euclidean virtual worlds – the Wormhole Maze and the Escher Museum – in order to probe the structure of spatial knowledge. Our working hypothesis is that successful navigation is better explained by a combination of adaptive strategies than by a metric Euclidean “cognitive map.” This work has applications to autonomous robots, the design of navigation aids and interfaces, architectural design, and urban planning.
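
As a concrete illustration of the path-integration side of this work, here is a small sketch of dead reckoning from noisy heading and distance estimates, fused with a landmark-based position fix by reliability weighting. The noise levels, the weighting rule, and the route are assumptions made for the example, not the lab’s models.

```python
import numpy as np

# Sketch of path integration (dead reckoning) and a reliability-weighted
# combination with a landmark-based position fix. The noise levels, weights,
# and route are assumptions for this example, not the lab's models.

rng = np.random.default_rng(0)

def path_integrate(headings, distances, heading_noise=0.05, dist_noise=0.05):
    """Accumulate noisy estimates of each leg's heading and distance."""
    pos = np.zeros(2)
    for h, d in zip(headings, distances):
        h_hat = h + rng.normal(0.0, heading_noise)       # noisy sensed heading (rad)
        d_hat = d * (1.0 + rng.normal(0.0, dist_noise))  # noisy sensed distance (m)
        pos += d_hat * np.array([np.cos(h_hat), np.sin(h_hat)])
    return pos

def fuse(pi_fix, pi_var, landmark_fix, lm_var):
    """Weight each position fix by the other's variance, so the more reliable cue counts more."""
    w = lm_var / (pi_var + lm_var)   # weight given to path integration
    return w * pi_fix + (1.0 - w) * landmark_fix

# Outbound route: four 1-m legs ending at (0, 2) relative to the start.
headings = [0.0, np.pi / 2, np.pi / 2, np.pi]
distances = [1.0, 1.0, 1.0, 1.0]
true_end = np.array([0.0, 2.0])

pi_fix = path_integrate(headings, distances)
landmark_fix = true_end + rng.normal(0.0, 0.1, size=2)  # landmark fix with its own error
combined = fuse(pi_fix, pi_var=0.3, landmark_fix=landmark_fix, lm_var=0.1)
print("path integration:", pi_fix)
print("landmark fix:    ", landmark_fix)
print("combined:        ", combined)
```

Because dead-reckoned estimates drift as errors accumulate over a route, combining them with landmark fixes, or falling back on other adaptive strategies, becomes increasingly important over longer paths.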

A third stream investigates the nonlinear dynamics of perceptual-motor behavior. This research asks whether people take advantage of the natural “stabilities” of a task to organize their behavior. We study tasks such as infants bouncing in a “jolly jumper” or adults bouncing a virtual ball on a racquet, in order to determine how humans exploit physical and informational constraints to simplify the control problem.
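
As a toy illustration of what exploiting a stability can mean here, the snippet below quotes a standard result from simplified analyses of the bouncing-ball map: period-1 bouncing is passively stable, meaning small errors die out without active correction, when the racquet’s acceleration at impact falls in a particular negative range. Treat the formula and the restitution values as textbook assumptions for the example, not findings from the lab.

```python
# Toy computation of the passive-stability window for ball bouncing, using the
# standard simplified bouncing-ball-map result: period-1 bouncing is stable when
# the racquet's acceleration at impact lies between -2g(1 + a^2)/(1 + a)^2 and 0,
# where a is the ball's coefficient of restitution. Quoted here as an assumption.

g = 9.81  # gravitational acceleration (m/s^2)

def stable_accel_window(alpha):
    """Return the (lower, upper) racquet accelerations at impact giving passive stability."""
    lower = -2.0 * g * (1.0 + alpha**2) / (1.0 + alpha)**2
    return lower, 0.0

for alpha in (0.3, 0.5, 0.7):  # illustrative coefficients of restitution
    lo, hi = stable_accel_window(alpha)
    print(f"alpha = {alpha}: stable for racquet impact acceleration in ({lo:.2f}, {hi:.2f}) m/s^2")
```

Hitting the ball on the decelerating portion of the racquet’s cycle thus lets the physics absorb small perturbations, which is exactly the kind of constraint a performer can exploit to simplify control.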

The ultimate aim of this research is a theory of perception and action that accounts for the organization of behavior in terms of its self-organizing dynamics (see Warren, Psychological Review, 2006).