ZatLab gesture recognition framework: machine learning results

André Baltazar*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding · Chapter · peer-review

Abstract

The main problem this work addresses is the real-time recognition of gestures, particularly in the complex domain of artistic performance. By recognizing a performer's gestures, one can map them to diverse controls, from lighting control to the creation of visuals, sound control, or even music creation, thus allowing performers real-time manipulation of creative events. The work presented here takes on this challenge with a multidisciplinary approach, based on some of the known principles of how humans recognize gestures, together with computer science methods to successfully complete the task. This paper follows on from previous publications and presents in detail the Gesture Recognition Module of the ZatLab Framework and the results obtained by its Machine Learning (ML) algorithms. It provides a brief review of previous work in the area, followed by a description of the framework design and the results of the recognition algorithms.
Original language: English
Title of host publication: Smart technologies
Subtitle of host publication: Breakthroughs in research and practice
Publisher: IGI Global Publishing
Chapter: 14
Pages: 318-332
Number of pages: 15
ISBN (Electronic): 9781522525905
ISBN (Print): 1522525890, 9781522525899
DOIs
Publication status: Published - 19 Jun 2017
