Shaping and Exploring Interactive Motion-Sound Mappings Using Online Clustering Techniques

Bibliographic Details
Title: Shaping and Exploring Interactive Motion-Sound Mappings Using Online Clustering Techniques
Authors: Scurto, Hugo; Bevilacqua, Frédéric; Françoise, Jules
Contributors: Interaction Son Musique Mouvement Paris, Sciences et Technologies de la Musique et du Son (STMS), Institut de Recherche et Coordination Acoustique/Musique (IRCAM), Université Pierre et Marie Curie - Paris 6 (UPMC), Centre National de la Recherche Scientifique (CNRS); School of Interactive Arts and Technologies (SIAT), Simon Fraser University (SFU.ca)
Source: Proceedings of the 17th International Conference on New Interfaces for Musical Expression (NIME 2017), May 2017, Copenhagen, Denmark; https://hal.archives-ouvertes.fr/hal-01577806
Publisher: HAL CCSD
Publication Year: 2017
Collection: Archive ouverte HAL (Hyper Article en Ligne, CCSD - Centre pour la Communication Scientifique Directe)
Subject Terms: Machine Learning, Embodied Interaction, Motion, Sound, Expressiveness, [INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC], [INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI], [INFO.INFO-MM]Computer Science [cs]/Multimedia [cs.MM], [INFO.INFO-SD]Computer Science [cs]/Sound [cs.SD]
Geographic Subject: Copenhagen, Denmark
Description: International audience; Machine learning tools for designing motion-sound relationships often rely on a two-phase iterative process, where users must alternate between designing gestures and performing mappings. We present a first prototype of a user-adaptable tool that aims at merging these design and performance steps into one fully interactive experience. It is based on an online learning implementation of a Gaussian Mixture Model supporting real-time adaptation to user movement and generation of sound parameters. To allow both fine-tuned modification tasks and open-ended improvisational practices, we designed two interaction modes that let users either shape or guide interactive motion-sound mappings. Considering an improvisational use case, we propose two example musical applications to illustrate how our tool might support various forms of corporeal engagement with sound, and inspire further perspectives for machine learning-mediated embodied musical expression.
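
The abstract describes an online-learning Gaussian Mixture Model that adapts to user movement in real time and generates sound parameters, merging the usual design and performance phases. As a rough illustration only, and not the authors' implementation (which this record does not include), the following Python sketch pairs a per-frame stochastic-EM update of a joint [motion, sound] mixture with Gaussian mixture regression for sound generation. The class name OnlineGMM, the learning rate lr, and the toy tanh target mapping are all invented for this example.

import numpy as np

def gauss_pdf(x, mean, cov):
    """Density of a multivariate Gaussian, lightly regularized for stability."""
    d = len(mean)
    cov = cov + 1e-6 * np.eye(d)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return np.exp(-0.5 * (diff @ np.linalg.inv(cov) @ diff
                          + logdet + d * np.log(2.0 * np.pi)))

class OnlineGMM:
    """Joint [motion, sound] mixture, adapted one frame at a time."""
    def __init__(self, n_components, motion_dim, sound_dim, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        d = motion_dim + sound_dim
        self.m = motion_dim
        self.lr = lr
        self.weights = np.full(n_components, 1.0 / n_components)
        self.means = rng.normal(scale=0.5, size=(n_components, d))
        self.covs = np.stack([np.eye(d)] * n_components)

    def _resp(self, x, dims):
        """Posterior p(k | x) using only the listed dimensions of x."""
        lik = np.array([w * gauss_pdf(x, mu[dims], cov[np.ix_(dims, dims)])
                        for w, mu, cov in zip(self.weights, self.means, self.covs)])
        s = lik.sum()
        return lik / s if s > 0 else np.full(len(lik), 1.0 / len(lik))

    def update(self, frame):
        """One stochastic-EM step toward the new joint [motion, sound] frame."""
        r = self._resp(frame, np.arange(self.means.shape[1]))
        for k in range(len(self.weights)):
            step = self.lr * r[k]
            diff = frame - self.means[k]
            self.means[k] += step * diff
            # Convex blend keeps each covariance positive semi-definite.
            self.covs[k] = (1 - step) * self.covs[k] + step * np.outer(diff, diff)
        self.weights = (1 - self.lr) * self.weights + self.lr * r

    def generate_sound(self, motion):
        """Gaussian mixture regression: sound parameters conditioned on motion."""
        m = self.m
        r = self._resp(motion, np.arange(m))
        out = np.zeros(self.means.shape[1] - m)
        for k, (mu, cov) in enumerate(zip(self.means, self.covs)):
            gain = cov[m:, :m] @ np.linalg.pinv(cov[:m, :m])
            out += r[k] * (mu[m:] + gain @ (motion - mu[:m]))
        return out

# Toy demo: adapt the mapping while "performing", then query it.
gmm = OnlineGMM(n_components=3, motion_dim=2, sound_dim=2)
rng = np.random.default_rng(1)
for _ in range(500):
    motion = rng.normal(size=2)
    sound = np.tanh(motion)  # stand-in for target sound parameters
    gmm.update(np.concatenate([motion, sound]))
print(gmm.generate_sound(np.array([0.5, -0.5])))  # locally linear GMR estimate

The per-frame update is what would make such a mapping "shapeable" while performing: each incoming joint frame nudges component weights, means, and covariances by a small learning-rate step, and the convex covariance blend keeps every component valid, so the regression stays well behaved as the model changes under the user's movement.
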
Document Type: conference object
Language: English
Relation: hal-01577806; https://hal.archives-ouvertes.fr/hal-01577806; https://hal.archives-ouvertes.fr/hal-01577806/document; https://hal.archives-ouvertes.fr/hal-01577806/file/NIME2017_paper_298_cameraready.pdf
Availability: https://hal.archives-ouvertes.fr/hal-01577806
https://hal.archives-ouvertes.fr/hal-01577806/document
https://hal.archives-ouvertes.fr/hal-01577806/file/NIME2017_paper_298_cameraready.pdf
Rights: info:eu-repo/semantics/OpenAccess
Accession Number: edsbas.9EE820B3
Database: BASE