Wearable technology and gesture recognition for live performance augmentation

Smith, Sathya (2016) Wearable technology and gesture recognition for live performance augmentation. [USQ Project]

Text (Main Project): Smith_S_Maxwell.pdf (2MB)

Abstract

The use of physical gestures in interactions between humans and computer systems is a rapidly progressing research field that is increasingly present in smartphone and computer applications. This dissertation outlines the engineering design processes involved in creating a simple, novel gesture recognition system geared towards use in live entertainment performances. The system aims to increase the fluidity of human-machine interaction in the entertainment industry by providing an alternative input method for controlling other performance-related systems such as mixers, monitors, digital audio workstations and stage lighting.

Electronic methods of wearable body movement tracking, gesture recognition and wireless interfacing are explored in order to determine a suitable design for the system to achieve a practical result. The resulting system consists of a wearable hardware group and a terminal hardware group, each with associated software. The wearable design contains an MPU6050 motion processing unit, an Arduino Uno development board and a HopeRF HM-TR wireless data link transceiver. The terminal group is responsible for receiving MIDI commands and consists of an Arduino-compatible 'LeoStick' board coupled with a second transceiver. Modern additive manufacturing methods were also investigated for creating the hardware enclosures, allowing for rapid prototyping.
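The dissertation itself contains the firmware detail; the wearable-side data path it describes (sampling the MPU6050 over I2C and forwarding readings through the HM-TR, which acts as a transparent serial link) can be sketched roughly as below. This is a minimal illustration only: the register addresses are standard MPU6050 values, while the baud rate, packet format and sample rate are assumptions rather than details taken from the project.

```cpp
// Minimal sketch of the assumed wearable data path: read raw accelerometer
// data from the MPU6050 over I2C and forward it over the HM-TR transceiver,
// which behaves as a transparent serial link.
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;      // default MPU6050 I2C address

void setup() {
  Wire.begin();
  Serial.begin(9600);               // serial rate to the HM-TR (assumed value)

  // Wake the MPU6050 (it powers up in sleep mode)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                 // PWR_MGMT_1 register
  Wire.write(0x00);                 // clear the sleep bit
  Wire.endTransmission();
}

void loop() {
  // Request the six accelerometer bytes starting at ACCEL_XOUT_H (0x3B)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);

  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  // Send a simple comma-separated packet over the radio link (format assumed)
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.println(az);

  delay(20);                        // ~50 Hz sampling (assumed)
}
```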

The gesture recognition system is able to accurately provide movement data to a processor, which uses a running-average-based gesture recognition algorithm to extract movement features and respond to the presence of a pre-determined gesture by generating a MIDI command that is then sent to a computer terminal for use with external applications.
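The abstract names a running-average algorithm but does not reproduce it; one plausible reading is that incoming acceleration samples are smoothed by a moving average and a gesture is flagged when the smoothed value crosses a threshold, at which point a MIDI message is emitted. The sketch below illustrates that idea; the window size, thresholds and MIDI controller number are illustrative assumptions, and readAccelMagnitude() is a hypothetical stand-in for the motion data produced by the wearable.

```cpp
// Illustrative running-average gesture detection in the spirit of the
// approach described in the abstract: smooth the acceleration magnitude
// and emit a MIDI Control Change when it crosses a threshold.
const int WINDOW = 16;              // samples in the running average (assumed)
long window[WINDOW];
int  head = 0;
long runningSum = 0;
bool gestureActive = false;

// Placeholder for the MPU6050 reading shown in the earlier sketch;
// in the real system this would be the magnitude of the latest sample.
long readAccelMagnitude() {
  return 0;
}

// Push a new sample into the circular buffer and return the current mean
long updateAverage(long sample) {
  runningSum -= window[head];
  window[head] = sample;
  runningSum += sample;
  head = (head + 1) % WINDOW;
  return runningSum / WINDOW;
}

// Send a 3-byte MIDI Control Change on channel 1 (status byte 0xB0)
void sendMidiCC(uint8_t controller, uint8_t value) {
  Serial.write(0xB0);
  Serial.write(controller);
  Serial.write(value);
}

void setup() {
  Serial.begin(31250);              // standard MIDI baud rate
}

void loop() {
  long avg = updateAverage(readAccelMagnitude());

  // Rising-edge detection: fire once when the average crosses the threshold
  if (avg > 20000 && !gestureActive) {   // threshold is an assumed value
    sendMidiCC(20, 127);                 // controller 20 is an arbitrary choice
    gestureActive = true;
  } else if (avg < 15000) {              // lower threshold avoids re-triggering
    gestureActive = false;
  }

  delay(20);
}
```

Using two thresholds (a trigger level and a lower release level) is a common way to keep a single gesture from generating a burst of repeated MIDI messages; whether the project used this exact scheme is not stated in the abstract.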

Although the system did not fully meet its design objectives, it serves as an important stepping stone towards practical, entertainment-oriented gesture recognition devices that are more accessible to the general public.


Item Type: USQ Project
Item Status: Live Archive
Additional Information: Bachelor of Engineering (Honours) project, Electrical & Electronic Engineering major
Faculty/School / Institute/Centre: Historic - Faculty of Health, Engineering and Sciences - School of Mechanical and Electrical Engineering (1 Jul 2013 - 31 Dec 2021)
Supervisors: Maxwell, Andrew
Date Deposited: 23 Jul 2017 23:41
Last Modified: 23 Jul 2017 23:41
Uncontrolled Keywords: wearable technology; gesture recognition; live performance augmentation; MIDI commands; HopeRF HM-TR
Fields of Research (2008): 09 Engineering > 0906 Electrical and Electronic Engineering > 090602 Control Systems, Robotics and Automation
Fields of Research (2020): 40 ENGINEERING > 4007 Control engineering, mechatronics and robotics > 400799 Control engineering, mechatronics and robotics not elsewhere classified
URI: https://sear.unisq.edu.au/id/eprint/31484
