creators_name: Fitzpatrick, Paul
creators_name: Arsenio, Artur
editors_name: Berthouze, Luc
editors_name: Kozima, Hideki
editors_name: Prince, Christopher G.
editors_name: Sandini, Giulio
editors_name: Stojanov, Georgi
editors_name: Metta, Giorgio
editors_name: Balkenius, Christian
type: confpaper
datestamp: 2005-04-14
lastmod: 2011-03-11 08:55:50
metadata_visibility: show
title: Feel the beat: using cross-modal rhythm to integrate perception of objects, others, and self
ispublished: pub
subjects: comp-sci-mach-vis
subjects: comp-sci-mach-learn
subjects: comp-sci-robot
full_text_status: public
keywords: periodic motion, rhythm-based segmentation, cross-modal association, robotic proprioception
abstract: For a robot to be capable of development, it must be able to explore its environment and learn from its experiences. It must find (or create) opportunities to experience the unfamiliar in ways that reveal properties valid beyond the immediate context. In this paper, we develop a novel method for using the rhythm of everyday actions as a basis for identifying the characteristic appearance and sounds associated with objects, people, and the robot itself. Our approach is to identify and segment groups of signals in individual modalities (sight, hearing, and proprioception) based on their rhythmic variation, then to identify and bind causally-related groups of signals across different modalities. Because proprioception is included as a modality, this cross-modal binding method applies to the robot itself, and we report a series of experiments in which the robot learns about the characteristics of its own body.
date: 2004
date_type: published
volume: 117
publisher: Lund University Cognitive Studies
pagerange: 59-66
refereed: TRUE
citation: Fitzpatrick, Paul and Arsenio, Artur (2004) Feel the beat: using cross-modal rhythm to integrate perception of objects, others, and self. [Conference Paper]
document_url: http://cogprints.org/4062/1/fitzpatrick.pdf