Robot Sensory-Motor Coordination


This work studied how robots can, over time, integrate multimodal sensory information with motion as a basis for grounding themselves in reality, and it provides a framework for understanding the world. We studied how sensory-motor state data can self-organize into vector-space structures that categorize the world in terms of the robot's sensory-motor coordination (SMC). We enacted tasks of increasing complexity that imitate how infants interact with objects of interest in the world. Audio, vision, and tactile sensing were used as sensory modalities.
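The original publications describe the exact method; as one illustrative sketch of how concatenated sensory-motor state vectors can self-organize into a categorizing vector-space structure, a small self-organizing map (SOM) works. Everything below is assumed for illustration: the feature dimensions, grid size, learning schedule, and the use of a SOM itself are not taken from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensory-motor state vectors: each row concatenates audio,
# vision, tactile, and motor features (all dimensions invented here).
AUDIO, VISION, TACTILE, MOTOR = 4, 8, 3, 5
DIM = AUDIO + VISION + TACTILE + MOTOR
states = rng.normal(size=(500, DIM))

# A self-organizing map: a grid of prototype vectors that gradually
# arranges itself to categorize the state space.
GRID = 6  # 6x6 map
weights = rng.normal(size=(GRID, GRID, DIM))
coords = np.stack(
    np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij"), axis=-1
)

def train(weights, data, epochs=10, lr0=0.5, sigma0=2.0):
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in data:
            t = step / n_steps
            lr = lr0 * (1.0 - t)            # decaying learning rate
            sigma = sigma0 * (1.0 - t) + 0.5  # shrinking neighborhood
            # Best-matching unit: the prototype closest to this state.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Neighboring units on the grid move toward the input too,
            # which is what produces the topological categorization.
            grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-grid_d2 / (2.0 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
            step += 1
    return weights

weights = train(weights, states)

def categorize(x):
    """Map a sensory-motor state to its grid cell (emergent category)."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

After training, `categorize` assigns each new sensory-motor state a cell on the map, so states that the robot experiences as similar end up in nearby cells.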


Related work: