This book discusses the principles, methodologies, and challenges of robotic musicianship through an in-depth review of the work conducted at the Georgia Tech Center for Music Technology (GTCMT), where the concept was first developed. Robotic musicianship is a relatively new research field that focuses on the design and development of intelligent music-making machines. The motivation behind the field is to develop robots that not only generate music but also collaborate with humans by listening and responding in an expressive and creative manner. This combination of human and machine creativity has the potential to surprise and inspire us to play, listen, compose, and think about music in new ways.

The book examines the robotic platforms designed by the GTCMT Robotic Musicianship Group, including the improvisational robotic percussionists Haile and Shimon, the personal robotic companion Shimi, and a number of wearable robots, such as the Robotic Drumming Prosthesis, the Third Drumming Arm, and the Skywalker Piano Hand. It presents numerous research studies based on these platforms in the context of five main principles: Listen Like a Human, Play Like a Machine, Be Social, Watch and Learn, and Wear It.
Authors: Gil Weinberg; Mason Bretan; Guy Hoffman; Scott Driscoll
Publisher: Springer Nature
Language: English
Pages: 256
Tags: computer music, music, dsp
Foreword
Preface
Contents
1 Introduction
1.1 Abstract
1.2 Why Robotic Musicianship
1.3 Sound Production and Design—Survey
1.3.1 Traditional Instruments
1.3.2 Augmented and Novel Instruments
1.4 Musical Intelligence
1.4.1 Sensing and Perception
1.4.2 Music Generation
1.5 Embodiment
1.6 Integrating Robotic Musicianship into New Interfaces
1.6.1 Musical Companion Robots
1.6.2 Wearable Robotic Musicians
1.7 Discussion
References
2 Platforms—Georgia Tech's Robotic Musicians
2.1 Abstract
2.2 Haile—A Robotic Percussionist
2.2.1 Motivation
2.2.2 Design
2.3 Shimon—A Robotic Marimba Player
2.3.1 Striker Design
2.3.2 Mallet Motor Control
2.3.3 Slider Motor Control
2.3.4 Shimon's Socially Expressive Head
2.4 Shimi—A Music-Driven Robotic Dancing Companion
2.4.1 Robotic Musical Companionship
2.4.2 Design
2.4.3 Software Architecture
2.4.4 Core Capabilities
2.5 The Robotic Drumming Prosthetic
2.5.1 Motivation
2.5.2 Related Work
2.5.3 Platform
2.5.4 Generative Physical Model for Stroke Generation
2.5.5 Conclusions
References
3 “Listen Like a Human”—Human-Informed Music Perception Models
3.1 Abstract
3.2 Rhythmic Analysis of Live Drumming
3.2.1 Onset Detection
3.2.2 Beat Detection
3.2.3 Rhythmic Stability and Similarity
3.2.4 User Study
3.3 Tonal Music Analysis Using Symbolic Rules
3.3.1 Implementation
3.3.2 Evaluation
3.4 Music Analysis Using Deep Neural Networks
3.4.1 Deep Musical Autoencoder
3.4.2 Music Reconstruction Through Selection
3.5 Real-Time Audio Analysis of Prerecorded Music
3.5.1 Introduction
3.5.2 Previous Work
3.5.3 System Design
3.5.4 Live Audio Analysis
3.5.5 Gesture Design
3.5.6 Network Design
3.5.7 User Study
3.5.8 Summary
References
4 “Play Like a Machine”—Generative Musical Models for Robots
4.1 Abstract
4.2 Genetic Algorithms
4.2.1 Related Work
4.2.2 Method
4.3 Markov Processes (“Playing with the Masters”)
4.3.1 Related Work
4.3.2 Implementation
4.3.3 Summary
4.4 Path Planning Driven Music Generation
4.4.1 Search and Path Planning
4.4.2 Musical Path Planning
4.4.3 Planning
4.4.4 Evaluation
4.4.5 Discussion
4.5 Rule-Based Jazz Improvisation
4.5.1 Parametrized Representations of Higher-Level Musical Semantics
4.5.2 Joint Optimization
4.5.3 Musical Results
4.5.4 Discussion
4.6 Neural Network Based Improvisation
4.6.1 Introduction
4.6.2 Semantic Relevance
4.6.3 Concatenation Cost
4.6.4 Ranking Units
4.6.5 Evaluating the Model
4.6.6 Discussion
4.6.7 Subjective Evaluation
4.6.8 Results
4.6.9 An Embodied Unit Selection Process
4.7 Conclusion
References
5 “Be Social”—Embodied Human-Robot Musical Interactions
5.1 Abstract
5.2 Embodied Interaction with Haile
5.2.1 Interaction Modes
5.2.2 Leader-Follower Interaction
5.2.3 Evaluation
5.2.4 Data Analysis
5.2.5 Results
5.2.6 Conclusion
5.3 Synchronization with Shimon's Music-Making Gestures
5.3.1 Hypotheses
5.3.2 Experimental Design
5.3.3 Manipulation I: Precision
5.3.4 Manipulation II: Embodiment
5.3.5 Results
5.3.6 Discussion
5.3.7 Audience Appreciation
5.4 Emotion Conveyance Through Gestures with Shimi
5.4.1 Related Work
5.4.2 A System for Generating Emotional Behaviors
5.4.3 Shimi Interactive Applications
5.5 Conclusion
References
6 “Watch and Learn”—Computer Vision for Musical Gesture Analysis
6.1 Abstract
6.2 Robotic Musical Anticipation Based on Visual Cues
6.2.1 Introduction
6.2.2 Related Work
6.2.3 Motivation and Approach
6.2.4 Method
6.2.5 Algorithm
6.2.6 Results and Discussion
6.2.7 Conclusions
6.3 Query by Movement
6.3.1 Motivation and Related Work
6.3.2 Approach
6.3.3 User Study
6.3.4 Implementation
6.3.5 Evaluation
6.3.6 Results
6.3.7 Discussion
6.3.8 Future Work and Conclusions
6.4 A Robotic Movie Composer
6.4.1 Visual Analysis
6.4.2 Music Generation
6.4.3 Informal Feedback
6.4.4 Discussion
References
7 “Wear It”—Wearable Robotic Musicians
7.1 Abstract
7.2 The Robotic Drumming Prosthetic Arm
7.2.1 Background
7.2.2 Related Work
7.2.3 Motivation
7.2.4 Design
7.2.5 Evaluation
7.2.6 Results
7.2.7 The Second Stick
7.2.8 Conclusion
7.3 The Third Drumming Arm
7.3.1 Introduction
7.3.2 Related Work
7.3.3 Motivation
7.3.4 Design
7.3.5 Dynamics Modeling
7.3.6 Input-Shaper
7.3.7 User Survey
7.3.8 Conclusion
7.4 The Skywalker Piano Hand
7.4.1 Related Work
7.4.2 Goals
7.4.3 Ultrasound Configuration Experiments
7.4.4 Machine Learning Design
7.5 Conclusion
References
Index