
A Vision-Based Perceptual Learning System for Autonomous Mobile Robot

dc.creatorWang, Xiaochun
dc.date.accessioned2020-08-22T20:34:51Z
dc.date.available2008-07-26
dc.date.issued2007-07-26
dc.identifier.urihttps://etd.library.vanderbilt.edu/etd-07252007-174446
dc.identifier.urihttp://hdl.handle.net/1803/13566
dc.description.abstractAutonomous robots are intelligent machines capable of performing tasks in the real world without explicit human control for extended periods of time. A high degree of autonomy is particularly desirable in fields where robots can replace human workers, such as state-of-the-practice video surveillance systems and space exploration. However, because they lack a human's sophisticated sensing and control system, autonomous robots face two broad open problems: the perceptual discrepancy problem (there is no guarantee that the robot's sensing system can recognize or detect objects defined by a human designer) and the autonomous control problem (how robots can operate in unstructured environments without continuous human guidance). As a result, autonomous robot systems must have their own ways to acquire percepts and control by learning. In this work, a computer vision system is used for visual percept acquisition, and a working memory toolkit is used for autonomous robot control. Natural images contain statistical regularities that can set objects apart from each other and from random noise. For an object to be recognized in a given image, it is often necessary to segment the image into nonoverlapping but meaningful regions whose union is the entire image. Therefore, a biologically based percept acquisition system is developed to build an efficient low-level abstraction of real-world data into percepts. Perception in animals is strongly related to the type of behavior they perform, and learning plays a major part in this process. To address how robots can learn to autonomously control their behavior based on the percepts they have acquired, the computer vision system is integrated with a software package called the Working Memory Toolkit (WMtk) for decision making and learning. The WMtk was developed by Joshua L. Phillips and David C. Noelle based on a neural computational model of the primate working memory system.
The success of the whole system is demonstrated by its application to a navigation task.
dc.format.mimetypeapplication/pdf
dc.subjectMobile robots -- Design and construction
dc.subjectAutonomous robots
dc.subjectRobot vision
dc.subjectContent-based image retrieval
dc.subjectData clustering
dc.subjectPattern recognition
dc.subjectImage processing
dc.subjectComputer vision
dc.subjectMachine learning
dc.titleA Vision-Based Perceptual Learning System for Autonomous Mobile Robot
dc.typedissertation
dc.contributor.committeeMemberDavid C. Noelle
dc.contributor.committeeMemberNilanjan Sarkar
dc.contributor.committeeMemberDouglas Hardin
dc.contributor.committeeMemberKazuhiko Kawamura
dc.type.materialtext
thesis.degree.namePHD
thesis.degree.leveldissertation
thesis.degree.disciplineElectrical Engineering
thesis.degree.grantorVanderbilt University
local.embargo.terms2008-07-26
local.embargo.lift2008-07-26
dc.contributor.committeeChairMitch Wilkes
