dc.creator | Rojas, Juan Luis | |
dc.date.accessioned | 2020-08-22T00:19:58Z | |
dc.date.available | 2005-04-16 | |
dc.date.issued | 2004-04-16 | |
dc.identifier.uri | https://etd.library.vanderbilt.edu/etd-04012004-135630 | |
dc.identifier.uri | http://hdl.handle.net/1803/11877 | |
dc.description.abstract | This thesis integrates the motion of a humanoid robot with its auditory and visual sensory information to achieve reflex actions that mimic those of people. Such reflexes, in the form of reach-grasp behaviors, can enable the robot to learn through experience its own state and that of the world. A humanoid robot equipped with auditory capabilities, stereo vision, and artificial pneumatic arms and hands was used to demonstrate tightly coupled sensory-motor behaviors in five demonstrations of increasing complexity, showing that the reflexive sensory-motor behaviors combine to perform increasingly complex tasks. The humanoid robot executed these tasks effectively and established the groundwork for further development of hardware and software systems, sensory-motor vector-space representations, and coupling with higher-level cognition. | |
dc.format.mimetype | application/pdf | |
dc.subject | Robotics | |
dc.subject | behaviors | |
dc.subject | sensory | |
dc.subject | motor | |
dc.subject | coordination | |
dc.subject | intelligence | |
dc.title | Sensory integration with articulated motion on a humanoid robot | |
dc.type | thesis | |
dc.contributor.committeeMember | Alan R. Peters II | |
dc.type.material | text | |
thesis.degree.name | ME | |
thesis.degree.level | thesis | |
thesis.degree.discipline | Electrical Engineering | |
thesis.degree.grantor | Vanderbilt University | |
local.embargo.terms | 2005-04-16 | |
local.embargo.lift | 2005-04-16 | |