
Real-time Gesture Imitation in a Soft-arm Control Robot

dc.creator: Thornton, Sean R
dc.date.accessioned: 2020-08-21T21:32:51Z
dc.date.available: 2009-04-08
dc.date.issued: 2009-04-08
dc.identifier.uri: https://etd.library.vanderbilt.edu/etd-03252009-143826
dc.identifier.uri: http://hdl.handle.net/1803/11276
dc.description.abstract: In this thesis a system is developed whereby ISAC, a soft-arm control humanoid robot, can observe, track, and imitate hand motions made by a human being. This is accomplished by using the OpenCV libraries for Haar object detection, together with pre-trained Haar classifiers, to detect the human’s face and hand; by applying stereo vision geometry to locate the face and hand and to map those coordinates onto the workspace of ISAC; and by transmitting those coordinates via UDP to the arm controller, which interpolates and activates the corresponding arm motions. Thus, ISAC can imitate motions in real time. These motions are also stored in a database on the arm control computer for later use.
dc.format.mimetype: application/pdf
dc.subject: CR
dc.subject: Robots -- Control systems
dc.subject: Gesture
dc.subject: Imitation
dc.subject: Machine learning
dc.subject: Robot vision
dc.subject: OpenCV
dc.title: Real-time Gesture Imitation in a Soft-arm Control Robot
dc.type: thesis
dc.type.material: text
thesis.degree.name: MS
thesis.degree.level: thesis
thesis.degree.discipline: Electrical Engineering
thesis.degree.grantor: Vanderbilt University
local.embargo.terms: 2009-04-08
local.embargo.lift: 2009-04-08
dc.contributor.committeeChair: Dr. Richard Alan Peters II
dc.contributor.committeeChair: Dr. D. Mitchell Wilkes
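The detection-and-transmission pipeline described in the abstract (Haar-cascade detection of the face and hand, followed by UDP transmission of the coordinates to the arm controller) can be illustrated with a minimal Python/OpenCV sketch. This is not the thesis code: the cascade file names, the arm-controller address and port, and the four-float packet layout are assumptions, and the stereo vision geometry that maps image coordinates into ISAC's workspace is omitted for brevity.

    # Minimal sketch: detect a face and a hand with pre-trained Haar cascades
    # in one camera stream and send the detection centers to the arm control
    # computer over UDP. The real system triangulates stereo image pairs to
    # obtain 3D workspace coordinates before transmitting them.
    import socket
    import struct

    import cv2

    # OpenCV ships a frontal-face cascade; a hand cascade (assumed file name
    # below) would have to be supplied or trained separately.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    hand_cascade = cv2.CascadeClassifier("haarcascade_hand.xml")

    ARM_CONTROLLER = ("192.168.0.10", 5005)  # placeholder address and port
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    cap = cv2.VideoCapture(0)  # one camera of the stereo head
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        hands = hand_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        if len(faces) and len(hands):
            fx, fy, fw, fh = faces[0]
            hx, hy, hw, hh = hands[0]
            # Pack the face and hand centers as four floats (assumed layout);
            # the arm controller would map these into joint motions.
            packet = struct.pack("!4f",
                                 fx + fw / 2, fy + fh / 2,
                                 hx + hw / 2, hy + hh / 2)
            sock.sendto(packet, ARM_CONTROLLER)

        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()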

