Joo-Hyun Song

Assistant Professor
Joo-Hyun_Song@brown.edu
(401) 863-7666
Office Location: 
Metcalf 234
Research Focus: 
Visually-guided action, decision making, visual attention

Joo-Hyun Song is interested in understanding the behavioral and neural mechanisms involved in integrating higher-order cognitive processes with visually-guided actions in real-life situations. She has addressed this topic through a combination of methodologies, including behavioral investigations, fMRI, and neurophysiological experiments. She received a B.A. in Psychology from Seoul National University (Seoul, Korea) in 1998 and a Ph.D. in Psychology from Harvard University in 2006. She came to Brown in 2010 following postdoctoral training in systems neuroscience and neuroimaging at the Smith-Kettlewell Eye Research Institute in San Francisco.

Research Interests:

Visually-Guided Actions and Target Selection 

In the real world, most visual scenes are complex and crowded. Instead of a single isolated object, there are often several different objects competing for attention and directed action. Thus, a complete understanding of the production of goal-directed actions must also incorporate the higher-level processes involved in selecting a target stimulus from among distractors. Using a combination of methodologies, including behavioral investigations, fMRI, and neurophysiological experiments, my research addresses how targets are selected among competing distractors for visually-guided actions, what neural mechanisms underlie that selection, and how different types of actions, such as hand and eye movements, are coordinated toward a common goal.

Real-Time Readout of Internal Perceptual and Cognitive Decision Processes 

Compared to discrete responses such as button presses, the analysis of continuous overt behaviors has the advantage of allowing internal target selection processes to be mapped onto visible three-dimensional space over time. I am interested in how the movement trajectories of visually-guided reaches reveal the temporal evolution of target selection and thereby provide a real-time readout of internal perceptual and cognitive decision processes.

Visual Spatial and Working Memory 

To efficiently guide attention and action to objects of interest in complex environments, it is critical to build up and maintain stable spatial and object memory representations. Using both psychophysics and functional brain imaging, I examine the explicit and implicit mechanisms involved in guiding attention to objects of interest and in forming robust visual representations.