Perceptual Science Group @ MIT



The Perceptual Science Group of the Department of Brain and Cognitive Sciences at MIT does research in human vision, machine vision, human-computer interaction, and touch sensing for robotics. Both the Adelson Lab and the Rosenholtz Lab are part of the Computer Science and Artificial Intelligence Lab (CSAIL), located in the Stata Center.

Peripheral vision, inference, and visual awareness: An extended abstract is now available based on Ruth Rosenholtz's invited talk at the VSS 2017 symposium “The Role of Ensemble Statistics in the Visual Periphery.” The abstract, “What modern vision science reveals about the awareness puzzle,” argues that summary-statistic encoding plus decision limits underlie both the richness of visual perception and its quirky failures.

Attention and limited capacity: Ruth Rosenholtz has a new paper on what studying peripheral vision has taught us about attention. This leads to a new conceptualization of limited capacity in vision and of the mechanisms the visual system uses to deal with it: “Capacity limits and how the visual system copes with them.”

Modelling visual crowding: Shaiyan and Ruth's work testing a unified account of visual crowding has been accepted to the Journal of Vision.

Paper accepted to IROS 2014: Rui and Wenzhen's work on adapting the GelSight sensor for robotic touch has been accepted to IROS 2014. This work was done in collaboration with the Platt group at NEU, and it was covered by MIT News.

Taking a new look at subway map design: The Rosenholtz lab's Texture Tiling Model was used to evaluate subway maps for the MBTA Map Redesign Contest. Check out the FastCompany Design article and the CSAIL news article. The news was also picked up by a couple of other media sources: Smithsonian Magazine and The Dish. Here's an older article about our research from Science Daily.

Tactile sensing for manipulation
If robots are to perform everyday tasks in the real world, they will need sophisticated tactile sensing. The tactile data must be integrated into multi-sensory representations that support exploration, manipulation, and other tasks.

(Workshop held July 15, 2017.)

Giving robots a sense of touch
GelSight technology lets robots gauge objects’ hardness and manipulate small tools.

Fingertip sensor gives robot unprecedented dexterity
Armed with the GelSight sensor, a robot can grasp a freely hanging USB cable and plug it into a USB port.

GelSight: Portable, super-high-resolution 3-D imaging
A simple new imaging system could help manufacturers inspect their products, forensics experts identify weapons, and doctors identify cancers.

Artificial intelligence produces realistic sounds that fool humans
Video-trained system from MIT’s Computer Science and Artificial Intelligence Lab could help robots understand how objects interact with the world.