Gesture Based Flight Control

Issue Date

2012-10-02

Language

en

Document type

Thesis

Abstract

A popular research subject is the control of Unmanned Aerial Vehicles (UAVs), where humans direct UAVs to complete a task. Humans use gesture-based interaction on a daily basis to communicate their intentions, so it is essential for UAVs to understand these gestures. In this thesis a gesture-based interaction platform is built to control a small UAV. Gesture input is captured with a Microsoft Kinect device and processed by the computer vision algorithms of the Kinect SDK, which generates a model of the body with the locations of several joints, the "skeleton model". With this model, dynamic movements of body parts are captured. These time-varying signals are transformed into an observation sequence for a Hidden Markov Model (HMM) using a new "displacement model". An HMM and a neural network (NN) are trained on the same time-varying signals and used to recognise a selection of domain-specific gestures. Trained on ten gestures, the HMM achieves a recognition rate of 82.73 percent on the test set, the NN achieves 89.27 percent, and a NN trained on the observation sequences achieves 88.07 percent. When conflicting gestures are omitted and the number of gestures is reduced to six, the recognition rates on the test set are 91.50 percent for the HMM, 91.50 percent for the NN trained on observation sequences, and 91.09 percent for the NN. Training an HMM on one person and testing it on another person yields a recognition rate of 36.65 percent.
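
The abstract does not spell out how the displacement model or the classifiers are implemented. As a rough illustration of the pipeline it describes, the following sketch quantises the frame-to-frame movement of one skeleton joint into discrete observation symbols and scores a new sequence against one discrete HMM per gesture. The choice of joint, the eight-direction codebook, the "no movement" threshold, the five hidden states, and the use of the hmmlearn library are all assumptions made for this sketch, not the thesis's actual implementation.

import numpy as np
from hmmlearn import hmm

def displacement_symbols(joint_xy, n_directions=8, min_step=0.01):
    # Quantise the frame-to-frame displacement of one tracked joint (for
    # example the right hand of the Kinect skeleton model) into discrete
    # symbols: n_directions direction bins plus one "no movement" symbol.
    deltas = np.diff(np.asarray(joint_xy, dtype=float), axis=0)
    symbols = []
    for dx, dy in deltas:
        if np.hypot(dx, dy) < min_step:
            symbols.append(n_directions)               # below threshold: "still"
        else:
            angle = np.arctan2(dy, dx) % (2 * np.pi)   # direction of movement
            symbols.append(int(angle // (2 * np.pi / n_directions)))
    return np.array(symbols).reshape(-1, 1)            # hmmlearn expects 2-D integer input

def train_gesture_hmms(training_data, n_states=5):
    # Fit one discrete HMM per gesture; training_data maps a gesture name to
    # a list of symbol sequences produced by displacement_symbols(). The
    # number of distinct symbols is inferred from the training data.
    models = {}
    for gesture, sequences in training_data.items():
        X = np.concatenate(sequences)
        lengths = [len(s) for s in sequences]
        model = hmm.CategoricalHMM(n_components=n_states, n_iter=50, random_state=0)
        model.fit(X, lengths)
        models[gesture] = model
    return models

def classify(models, sequence):
    # Label a new recording with the gesture whose HMM assigns it the
    # highest log-likelihood.
    return max(models, key=lambda g: models[g].score(sequence))

A hypothetical call such as classify(models, displacement_symbols(right_hand_track)) would then label a new recording with the most likely gesture; the recognition rates quoted above come from the thesis's own displacement model and HMM/NN implementations, not from this sketch.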

Faculty

Faculteit der Sociale Wetenschappen