Interactive Dance Bibliography

Ahola, T., Tahiroglu, K., Ahmaniemi, T., Belloni, F., & Ranki, V. (2011). Raja-A Multidisciplinary Artistic Performance Proceedings of the International Conference on New Interfaces for Musical Expression, NIME, Oslo, Norway.


Akerly, J. (2015). Embodied flow in experiential media systems: a study of the dancer’s lived experience in a responsive audio system Proceedings of the 2nd International Workshop on Movement and Computing, Vancouver, British Columbia, Canada. https://doi.org/10.1145/2790994.2790997

Alaoui, S. F. (2019). Making an Interactive Dance Piece: Tensions in Integrating Technology in Art Proceedings of the 2019 Designing Interactive Systems Conference, San Diego, CA, USA.

Alaoui, S. F., Bevilacqua, F., & Jacquemin, C. (2015). Interactive visuals as metaphors for dance movement qualities. ACM Transactions on Interactive Intelligent Systems (TiiS), 5(3), 1-24.

Alaoui, S. F., Bevilacqua, F., Pascual, B. B., & Jacquemin, C. (2013). Dance interaction with physical model visuals based on movement qualities. International Journal of Arts and Technology, 6(4), 357-387.

Alaoui, S. F., Caramiaux, B., & Serrano, M. (2011). From dance to touch: movement qualities for interaction design CHI'11 Extended Abstracts on Human Factors in Computing Systems, Vancouver, Canada.

Alaoui, S. F., Caramiaux, B., Serrano, M., & Bevilacqua, F. (2012). Movement qualities as interaction modality Proceedings of the Designing Interactive Systems Conference, Newcastle Upon Tyne, United Kingdom.

Alaoui, S. F., Françoise, J., Schiphorst, T., Studd, K., & Bevilacqua, F. (2017). Seeing, sensing and recognizing Laban movement qualities Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems,

Alaoui, S. F., Jacquemin, C., & Bevilacqua, F. (2013). Chiseling bodies: an augmented dance performance CHI '13 Extended Abstracts on Human Factors in Computing Systems, Paris, France.

Alaoui, S. F., & Matos, J.-M. (2021). RCO: Investigating Social and Technological Constraints through Interactive Dance Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan. https://doi.org/10.1145/3411764.3445513

Alborno, P., Cera, A., Piana, S., Mancini, M., Niewiadomski, R., Canepa, C., Volpe, G., & Camurri, A. (2016). Interactive sonification of movement qualities–a case study on fluidity Proceedings of ISon, 5th Interactive Sonification Workshop, Bielefeld University, Germany.

Andersson López, L. (2020). SENSITIV – Mapping Design of Movement Data to Sound Parameters when Creating a Sonic Interaction Design Tool for Interactive Dance [Master’s thesis, KTH - Kungliga Tekniska Högskolan]. Stockholm.

Aylward, R., Lovell, S. D., & Paradiso, J. A. (2006, April 3-5). A compact, wireless, wearable sensor network for interactive dance ensembles International Workshop on Wearable and Implantable Body Sensor Networks (BSN'06), Cambridge, MA, USA.

Aylward, R., & Paradiso, J. A. (2006). Sensemble: a wireless, compact, multi-user sensor system for interactive dance New Interfaces for Musical Expression, NIME, Paris, France.

Baalman, M. A. J. (2022). Composing Interactions - An Artist's Guide to Building Expressive Interactive Systems. V2_.

Bahn, C., Hahn, T., & Trueman, D. (2001). Physicality and Feedback: A Focus on the Body in the Performance of Electronic Music International Computer Music Conference, ICMC 2001, Havana, Cuba. http://hdl.handle.net/2027/spo.bbp2372.2001.058

Bakogiannis, K., Andreopoulou, A., & Georgaki, A. (2021). The development of a dance-musification model with the use of machine learning techniques under COVID-19 restrictions. In Audio Mostly 2021 (pp. 81–88). Association for Computing Machinery. https://doi.org/10.1145/3478384.3478407

Bannerman, A. (2005). Connecting Spaces–Motion-capture, Dance, Sound (formerly Dancing Sound–Sounding Dance) Sound Moves - An International Conference on Music and Dance, Roehampton University, London, England.

Barbosa, J., Calegario, F., Tragtenberg, J., Cabral, G., Ramalho, G., & Wanderley, M. M. (2015). Designing DMIs for popular music in the Brazilian northeast: lessons learned Proceedings of the International Conference on New Interfaces for Musical Expression, NIME’15, Baton Rouge, LA.

Barker, M., & Munster, A. (2018, July). Moving data: artistic tendencies in visualising human and non-human movement Proceedings of the Conference on Electronic Visualisation and the Arts, London.

Beira, J. F. (2016). 3D (embodied) projection mapping and sensing bodies: a study in interactive dance performance [Doctoral dissertation, The University of Texas at Austin]. Austin.

Beller, G., & Aperghis, G. (2011). Gestural control of real-time concatenative synthesis in luna park P3S (Performative Speech and Singing Synthesis), Vancouver, Canada.

Bergsland, A. (2022). Designing interactive sonifications for the exploration of dance phrase edges Sound and Music Computing, SMC2022, Saint-Étienne, France.

Bergsland, A. (2022). Dance phrase onsets and endings in an interactive dance study Proceedings of the International Conference on Audio Mostly, St. Pölten, Austria.

Bergsland, A., Saue, S., & Stokke, P. (2019). VIBRA-Technical and Artistic Issues in an Interactive Dance Project 16th Sound and Music Computing, SMC'19, Malaga, Spain.

Bergsland, A., & Wechsler, R. (2015). Composing Interactive Dance Pieces for the MotionComposer, a device for Persons with Disabilities New Interfaces for Musical Expression, NIME2015, Baton Rouge, Louisiana.

Bergsland, A., & Wechsler, R. (2017). Issues and Strategies of Rhythmicality for MotionComposer Proceedings of the 4th International Conference on Movement Computing (MOCO '17), London.

Berman, A., & James, V. (2015). Kinetic Dialogues: Enhancing creativity in dance Proceedings of the 2nd International Workshop on Movement and Computing (MOCO’15), Vancouver, Canada.

Bermudez, B., DeLahunta, S., Hoogenboom, M., Ziegler, C., Bevilacqua, F., Alaoui, S. F., & Gutierrez, B. M. (2011). The Double Skin/Double Mind Interactive Installation. Journal for Artistic Research, 0. https://www.researchcatalogue.net/view/8247/8248/0/0

Bernini, D., De Michelis, G., Plumari, M., Tisato, F., & Cremaschi, M. (2012). Towards Augmented Choreography ArtsIT 2011, Esbjerg, Denmark.

Bevilacqua, F., Naugle, L., & Dobrian, C. (2001). Music control from 3D motion capture of dance Proceedings of New Interfaces of Musical Expression, a CHI 2001 workshop, Seattle.

Bevilacqua, F., Naugle, L., & Valverde, I. (2001). Virtual dance and music environment using motion capture IEEE-Multimedia Technology And Applications Conference, Irvine CA.

Bevilacqua, F., Ridenour, J., & Cuccia, D. (2002). 3d motion capture data: Motion analysis and mapping to music The Sixth Distributed Memory Computing Conference, Santa Barbara, CA.

Bevilacqua, F., Schnell, N., Fdili Alaoui, S., Klein, G., & Noeth, S. (2011). Gesture capture: Paradigms in interactive music/dance systems. In G. Klein & S. Noeth (Eds.), Emerging Bodies: The Performance of Worldmaking in Dance and Choreography (Vol. 183, pp. 183-193). Transaction Publishers.

Biggs, S., Hawksley, S., & Paine, G. (2016). Bodytext: Somatic Data as Agency in Interactive Dance. In C. Fernandes (Ed.), Multimodality and Performance (pp. 179-186). Cambridge Scholars Publishing. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1339017&site=ehost-live

Biggs, S. J., Hawksley, S., & Paine, G. (2013). Bodytext essay 19th International Symposium Electronic Art, Sydney, Australia.

Birringer, J. (2002). Dance and media technologies. PAJ: A Journal of Performance and Art, 24(1), 84-93.

Birringer, J. (2004). Interactive dance, the body and the Internet. Journal of Visual Art Practice, 3(3). https://doi.org/10.1386/jvap.3.3.165/0

Birringer, J. (2004). Dance and Interactivity. Dance Research Journal, 35/36, 88-112. www.jstor.org/stable/30045071

Birringer, J. (2005). Interactivity: 'user testing' for participatory artworks. International Journal of Performance Arts & Digital Media, 1(2).

Birringer, J. (2006). Saira Virous: Game Choreography in Multiplayer Online Performance Spaces. In S. Broadhurst & J. Machon (Eds.), Performance and Technology. Practices of Virtual Embodiment and Interactivity (pp. 43-59). Palgrave Macmillan.

Birringer, J. (2007). Performance and science. PAJ: A Journal of Performance and Art, 29(1), 21-35.

Birringer, J. (2008). After Choreography. Performance Research, 13(1), 118-122. https://doi.org/10.1080/13528160802465649

Birringer, J. (2010). Moveable worlds/Digital scenographies. International Journal of Performance Arts and Digital Media, 6(1), 89-107. https://doi.org/10.1386/padm.6.1.89_1

Birringer, J. (2022). Somatechnics and Difference. Teatro: Revista de Estudios Culturales/A Journal of Cultural Studies, 34(1), 8.

Birringer, J., & Danjoux, M. (2009). Wearable performance. Digital Creativity, 20(1-2), 95-113. https://doi.org/10.1080/14626260902868095

Birringer, J., & Danjoux, M. (2013). The sound of movement wearables: performing ukiyo. Leonardo, 46(3), 232-240. https://doi.org/10.1162/LEON_a_00562

Birringer, J. H. (2008). Performance, technology, & science. Paj Publication.

Bisig, D., & Palacio, P. (2012). STOCOS. Dance in a Synergistic Environment 15th Generative Art Conference GA2012, Lucca, Italy.

Bisig, D., & Palacio, P. (2014). Phantom Limb Hybrid Embodiments for Dance Proceedings of the Generative Art Conference, Rome, Italy.

Bisig, D., & Palacio, P. (2016). Neural narratives - Dance with virtual body extensions [Conference Paper]. 3rd International Symposium on Movement and Computing, MOCO 2016, Thessaloniki, Greece.

Bisig, D., & Palacio, P. (2020, September 15–17). Sounding feet Proceedings of the 15th International Conference on Audio Mostly, Graz, Austria.

Bisig, D., Palacio, P., & Romero, M. (2016). Piano & Dancer 19th Generative Art Conference GA2016, Florence, Italy.

Bisig, D., Palacio, P., & Romero, M. (2016). The Neural Narratives Project - Multimodal virtual body extensions for Metabody. METABODY Journal of Metacultural Critique, 2.

Bisig, D., Palacio, P., Romero, M., & Pérez, A. (2018). Sounding Feet. Sonifying Foot Pressure for Dance Proceedings of the International Conference on Movement and Computing, MOCO’18, Genoa, Italy.

Bisig, D., & Unemi, T. (2009). Swarms on stage - swarm simulations for dance performance Proceedings of the Generative Art Conference, Milan, Italy.

Bluff, A. J. (2017). Interactive art, immersive technology and live performance [PhD Thesis, University of Technology Sydney]. Sydney.

Bomba, M. S., & Dahlstedt, P. (2019). Somacoustics: Interactive Body-as-Instrument New Interfaces of Musical Expression, NIME2019, Porto Alegre, Brazil.

Bromwich, M. A. (1995). A Single Performer Controlled Interface for Electronic Dance/Music Theatre Proceedings of the International Computer Music Conference, ICMC, Banff, Canada.

Brown, C. (2006). Learning to Dance with Angelfish: Choreographic Encounters Between Virtuality and Reality. In S. Broadhurst & J. Machon (Eds.), Performance and Technology. Practices of Virtual Embodiment and Interactivity (pp. 85-99). Palgrave Macmillan.

Brown, C. (2019). Machine Tango: An Interactive Tango Dance Performance Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Tempe, Arizona, USA. https://doi.org/10.1145/3294109.3301263

Brown, C., & Paine, G. (2015). Interactive Tango Milonga: designing internal experience Proceedings of the 2nd International Workshop on Movement and Computing, MOCO'15, Vancouver.

Brown, C., & Paine, G. (2019). A Case Study in Collaborative Learning via Participatory Music Interactive Systems: Interactive Tango Milonga. In S. Holland, T. Mudd, K. Wilkie-McKenna, A. McPherson, & M. M. Wanderley (Eds.), New Directions in Music and Human-Computer Interaction (pp. 285-306). Springer International Publishing. https://doi.org/10.1007/978-3-319-92069-6_18

Calvo-Merino, B., Glaser, D. E., Grezes, J., Passingham, R. E., & Haggard, P. (2005). Action Observation and Acquired Motor Skills: An fMRI Study with Expert Dancers. Cerebral Cortex, 15(8), 1243-1249. http://cercor.oxfordjournals.org/cgi/content/abstract/15/8/1243

Camurri, A. (1995). Interactive Dance/Music Systems, San Francisco.

Camurri, A., Canepa, C., Coletta, P., Ferrari, N., Mazzarino, B., & Volpe, G. (2008). Social active listening and making of expressive music: the interactive piece the bow is bent and drawn Proceedings of the 3rd international conference on Digital Interactive Media in Entertainment and Arts, Athens, Greece. https://doi.org/10.1145/1413634.1413701

Camurri, A., Chiarvetto, R., Coglio, A., Di Stefano, M., Liconte, C., Massari, A., Massucco, C., Murta, D., Nervi, S., & Palmieri, G. (1997). Toward Kansei Information Processing in music/dance interactive multimodal environments Proc. Italian Assoc. for Musical Informatics (AIMI) Intl. Workshop Kansei: The Technology of Emotion, Genoa, Italy.

Camurri, A., & Ferrentino, P. (1999). Interactive environments for music and multimedia [journal article]. Multimedia Systems, 7(1), 32-47. https://doi.org/10.1007/s005300050109

Camurri, A., Hashimoto, S., Ricchetti, M., Ricci, A., Suzuki, K., Trocca, R., & Volpe, G. (2000). Eyesweb: Toward gesture and affect recognition in interactive dance and music systems. Computer Music Journal, 24(1), 57-69.

Camurri, A., Hashimoto, S., Ricchetti, M., Suzuki, K., Trocca, R., & Volpe, G. (1999). KANSEI analysis of movement in dance/music interactive systems 2nd International Symposium on HUmanoid and RObotics (HURO99), Tokyo.

Camurri, A., Mazzarino, B., Ricchetti, M., Timmers, R., & Volpe, G. (2004). Multimodal Analysis of Expressive Gesture in Music and Dance Performances. In A. Camurri & G. Volpe (Eds.), Gesture-Based Communication in Human-Computer Interaction (Vol. 2915, pp. 20-39). Springer. https://doi.org/10.1007/978-3-540-24598-8_3

Camurri, A., & Moeslund, T. B. (2010). Visual Gesture Recognition. In R. I. Godøy & M. Leman (Eds.), Musical Gestures. Sound, Movement, and Meaning (pp. 238-263). Routledge.

Camurri, A., Volpe, G., De Poli, G., & Leman, M. (2005). Communicating expressiveness and affect in multimodal interactive systems. IEEE MultiMedia, 12(1), 43-53. https://doi.org/10.1109/MMUL.2005.2

Camurri, A., Volpe, G., Piana, S., Mancini, M., Niewiadomski, R., Ferrari, N., & Canepa, C. (2016). The dancer in the eye: towards a multi-layered computational framework of qualities in movement Proceedings of the 3rd International Symposium on Movement and Computing, Thessaloniki, Greece.

Candau, Y., Françoise, J., Alaoui, S. F., & Schiphorst, T. (2017). Cultivating kinaesthetic awareness through interaction: Perspectives from somatic practices and embodied cognition Proceedings of MOCO’17, London.

Çarçani, K., Hansen, V. W., & Maartmann-Moe, H. (2018). Exploring Technology Use in Dance Performances Human-Computer Interaction. Interaction in Context. HCI 2018,

Clay, A., Couture, N., Decarsin, E., Desainte-Catherine, M., Vulliard, P.-H., & Larralde, J. (2012). Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation Proceedings of the International Conference on New Interfaces for Musical Expression, NIME, Ann Arbor, MI.

Clay, A., Couture, N., & Nigay, L. (2007). Emotion capture based on body postures and movements. arXiv preprint arXiv:0710.0847.

Clay, A., Couture, N., & Nigay, L. (2009). Towards an architecture model for emotion recognition in interactive systems: application to a ballet dance show ASME-AFM World Conference on Innovative Virtual Reality, WINVR2009, Chalon-sur-Saône, France.

Clay, A., Couture, N., & Nigay, L. (2009, September 10-12). Engineering affective computing: A unifying software architecture 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Amsterdam.

Clay, A., Couture, N., Nigay, L., De La Riviere, J.-B., Martin, J.-C., Courgeon, M., Desainte-Catherine, M., Orvain, E., Girondel, V., & Domengero, G. (2012). Interactions and systems for augmenting a live dance performance 2012 IEEE International Symposium on Mixed and Augmented Reality-Arts, Media, and Humanities (ISMAR-AMH), Atlanta, GA.

Clay, A., Delord, E., Couture, N., & Domenger, G. (2009). Augmenting a ballet dance show using the dancer’s emotion: Conducting joint research in dance and computer science International Conference on Arts and Technology, ArtsIT, Yi-Lan, Taiwan.

Clay, A., Domenger, G., Conan, J., Domenger, A., & Couture, N. (2014). Integrating augmented reality to enhance expression, interaction & collaboration in live performances: A ballet dance case study 2014 IEEE International Symposium on Mixed and Augmented Reality-Media, Art, Social Science, Humanities and Design (ISMAR-MASH'D), Munich, Germany.

Clay, A., Lombardo, J.-C., Couture, N., & Conan, J. (2014). Bi-manual 3D painting: an interaction paradigm for augmented reality live performance. In Advanced Research and Trends in New Technologies, Software, Human-Computer Interaction, and Communicability (pp. 423-430). IGI Global.

Coniglio, M. (2006). Materials vs Content in Digitally Mediated Performance. In S. Broadhurst & J. Machon (Eds.), Performance and Technology. Practices of Virtual Embodiment and Interactivity (pp. 78-84). Palgrave Macmillan.

Côté-Allard, U., St-Onge, D., Giguère, P., Laviolette, F., & Gosselin, B. (2017, August 28-September 1). Towards the use of consumer-grade electromyographic armbands for interactive, artistic robotics performances 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal.

Dahlstedt, P., & Dahlstedt, A. S. (2019). OtoKin: Mapping for Sound Space Exploration through Dance Improvisation New Interfaces for Musical Expression, Porto Alegre, Brazil.

De Spain, K. (2000). Dance and technology: A pas de deux for post-humans. Dance Research Journal, 32(1), 2-17.

deLahunta, S. (2002). Software for dancers: coding forms. Journal of Performance Research, 7(2), 97-102.

deLahunta, S. (2002). Periodic Convergences: Dance and Computers. In S. Dinkla & M. Leeker (Eds.), Dance and Technology. Moving towards Media Productions (pp. 66-87). Alexander Verlag.

deLahunta, S. (2002). ISADORA "almost out of beta": tracing the development of a new software tool for artists. Part I: in dialogue with Mark Coniglio. Retrieved April 7 from http://www.sdela.dds.nl/sfd/isadora.html

deLahunta, S. (2010). Shifting Interfaces: Art Research at the Intersections of Live Performance and Technology [PhD, University of Plymouth].

deLahunta, S., & Bevilacqua, F. (2007). Sharing descriptions of movement. International Journal of Performance Arts and Digital Media, 3(1), 3-16. https://doi.org/10.1386/padm.3.1.3_1

Di Donato, B., Bullock, J., & Tanaka, A. (2018). Myo Mapper: a Myo armband to OSC mapper Proceedings of the International Conference on New Interfaces for Musical Expression, Blacksburg, Virginia.

Dinkla, S. (2002). Towards a Rhetoric and Didactics of Digital Dance. In S. Dinkla & M. Leeker (Eds.), Dance and Technology. Moving towards Media Productions (pp. 15-29). Alexander Verlag.

Dixon, S. (2007). Digital Performance: A History of New Media in Theater, Dance, Performance Art, and Installation. The MIT Press.

Dovgan, E., Cigon, A., Sinkovec, M., & Klopcic, U. (2008, September 10-12). A system for interactive virtual dance performance 2008 50th International Symposium ELMAR, Zadar, Croatia.

Downie, M. N. (2005). Choreographing the extended agent: performance graphics for dance theater [Doctoral dissertation, Massachusetts Institute of Technology].

El-Nasr, M. S., & Vasilakos, T. (2006). Digitalbeing: An ambient intelligent dance space 2006 IEEE International Conference on Fuzzy Systems, Vancouver, BC, Canada

Erdem, C., Schia, K. H., & Jensenius, A. R. (2019). Vrengt: A Shared Body–Machine Instrument for Music–Dance Performance Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil.

Erkut, C., & Dahl, S. (2017). Embodied Interaction through Movement in a Course Work [Conference Paper]. 4th International Conference on Movement Computing, MOCO 2017, London. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85037716360&doi=10.1145%2f3077981.3078026&partnerID=40&md5=e1110fccacc2008bf6f003225344806c

Evert, K. (2002). Dance and Technology at the Turn of the Last and Present Centuries. In S. Dinkla & M. Leeker (Eds.), Dance and Technology. Moving towards Media Productions (pp. 30-65). Alexander Verlag.

Farley, K. (2002). Digital dance theatre: The marriage of computers, choreography and techno/human reactivity. Body, Space and Technology Journal, 3(1), 39-46.

Feldmeier, M., & Paradiso, J. A. (2007). An Interactive Music Environment for Large Groups with Giveaway Wireless Motion Sensors. Computer Music Journal, 31(1), 50-67.

Francksen, K. (2007). Isadora as a means of composing. A detailed discussion of the potential implications of utilising new technologies within pedagogy. Body, Space and Technology Journal, 7(2).

Françoise, J., Fdili Alaoui, S., Schiphorst, T., & Bevilacqua, F. (2014). Vocalizing dance movement for interactive sonification of laban effort factors Proceedings of the 2014 conference on Designing interactive systems, Vancouver, Canada.

Frid, E., Bresin, R., Alborno, P., & Elblaus, L. (2016). Interactive Sonification of Spontaneous Movement of Children—Cross-Modal Mapping and the Perception of Body Movement Qualities through Sound. Frontiers in Neuroscience, 10, 521. https://doi.org/10.3389/fnins.2016.00521

Frid, E., Elblaus, L., & Bresin, R. (2016, December 16). Sonification of fluidity—an exploration of perceptual connotations of a particular movement feature Proceedings of the 5th interactive sonification workshop (ISon2016), Bielefeld University, Germany.

Frid, E., Elblaus, L., & Bresin, R. (2018). Interactive sonification of a fluid dance movement: an exploratory study. Journal on Multimodal User Interfaces, 1-9. https://doi.org/10.1007/s12193-018-0278-y

Fuhrmann, A. L., Kretz, J., & Burwik, P. (2013). Multi sensor tracking for live sound transformation New Interfaces for Musical Expression, NIME’13, Daejeon, South Korea.

Gardner, P., Sturgeon, H., Jones, L., & Surlin, S. (2016). Body Editing: Dance Biofeedback Experiments in Apperception International Conference on Human-Computer Interaction. Interaction Platforms and Techniques. HCI 2016,

Giomi, A. (2017). La pensée sonore du corps: Pour une approche écologique à la médiation technologique, au mouvement et à l'interaction sonore [The sonic thinking of the body: Towards an ecological approach to technological mediation, movement and sound interaction] [PhD dissertation, l’Université Côte d’Azur].

Giomi, A. (2020, July 15–17). Somatic Sonification in Dance Performances. From the Artistic to the Perceptual and Back Proceedings of the 7th International Conference on Movement and Computing, MOCO’20, Jersey City/Virtual, NJ, USA.

Giomi, A., & Fratagnoli, F. (2018). Listening Touch: A Case Study about Multimodal Awareness in Movement Analysis with Interactive Sound Feedback Proceedings of the 5th International Conference on Movement and Computing, Genoa, Italy.

Giomi, A., & Leonard, J. (2020). Towards an Interactive Model-Based Sonification of Hand Gesture for Dance Performance Proceedings of the International Conference on New Interfaces for Musical Expression, NIME’20, Birmingham, UK.

Golz, P., & Shaw, A. (2014). Augmenting live performance dance through mobile technology BCS-HCI'14 Proceedings of the 28th International BCS Human Computer Interaction Conference on HCI 2014-Sand, Sea and Sky-Holiday HCI, Southport, UK.

Gonzalez, B., Carroll, E., & Latulipe, C. (2012). Dance-inspired technology, technology-inspired dance Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design, Copenhagen, Denmark.

Griggio, C. F., & Romero, M. (2015). Canvas Dance: An Interactive Dance Visualization for Large-Group Interaction Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, Seoul, Republic of Korea.

Großhauser, T., Bläsing, B., Spieth, C., & Hermann, T. (2012). Wearable sensor-based real-time sonification of motion and foot pressure in dance teaching and training. Journal of the Audio Engineering Society, 60(7/8), 580-589.

Guedes, C. (2005). Mapping movement to musical rhythm: A study in interactive dance [PhD thesis, NYU]. New York.

Guedes, C. (2005). The m-objects: A Small Library for Musical Rhythm Generation and Musical Tempo Control from Dance Movement in Real Time International Computer Music Conference, ICMC2005, Barcelona, Spain.

Guedes, C. (2006). Extracting Musically-Relevant Rhythmic Information from Dance Movement by Applying Pitch-Tracking Techniques to a Video Signal Sound and Music Computing Conference 2006, Marseille, France.

Guedes, C. (2007). Translating Dance Movement into Musical Rhythm in Real Time: New Possibilities for Computer-Mediated Collaboration in Interactive Dance Performance The International Computer Music Conference, ICMC, Copenhagen, Denmark.

Guedes, C., & Woolford, K. (2007). Controlling Aural and Visual Particle Systems through Human Movement SMC'07 The 4th Sound and Music Computing Conference, Lefkada, Greece.

Gündüz, Z. (2010). Interactive dance: The merger of media technologies and the dancing body. In D. Riha (Ed.), Humanity in cybernetic environments (pp. 71-82). Inter-Disciplinary Press. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.611.8096&rep=rep1&type=pdf#page=85

Gündüz, Z. (2012). Digital dance:(dis) entangling human and technology [Doctoral dissertation, University of Amsterdam]. Amsterdam.

Haag, J. C. (2009). Inertial motion capture and live performance (with a focus on dance). Dance Dialogues: Conversations Across Cultures, Artforms and Practices: Refereed Proceedings of the World Dance Alliance Global Summit 2008, 1-12.

Hahn, T., & Bahn, C. (2002). Pikapika - the collaborative composition of an interactive sonic character. Organised Sound, 7(3), 229-238.

Hattwick, I., Malloch, J., & Wanderley, M. M. (2014). Forming Shapes to Bodies: Design for Manufacturing in the Prosthetic Instruments Proceedings of the International Conference on New Interfaces for Musical Expression, NIME’14, London, UK.

Hattwick, I., & Wanderley, M. M. (2015). Interactive lighting in the pearl: considerations and implementation New Interfaces for Musical Expression, NIME’15, Baton Rouge, LA.

Hattwick, I., & Wanderley, M. M. (2017). Design of hardware systems for professional artistic applications New Interfaces for Musical Expression, NIME’17, Aalborg.

Hawksley, S., & Biggs, S. (2006). Memory maps in interactive dance environments. International Journal of Performance Arts and Digital Media, 2(2), 123-137. https://doi.org/10.1386/padm.2.2.123_1

Horwitz, A. (2014, July 5). Talking to Troika Ranch. Ephemeral Objects - Arts Criticism for the Post-Material World. https://www.ephemeralobjects.org/2014/07/05/talking-to-troika-ranch/

Houser, J. F. (2014). Reflections: For interactive electronics, dancer, and variable instruments [Dissertation in Fine Arts, Texas Tech University]. http://hdl.handle.net/2346/58674

Hsu, A., & Kemper, S. (2015). Kinesonic approaches to mapping movement and music with the remote electroacoustic kinesthetic sensing (RAKS) system Proceedings of the 2nd International Workshop on Movement and Computing, MOCO'15, Vancouver.

Hsu, A., & Kemper, S. (2018, February 15-17). Why Should Our Bodies End at the Skin? Enacting Cyborg Performance The Sixteenth Biennial Symposium on Arts and Technology, Connecticut College.

Hsu, A., & Kemper, S. (2019). Enacting Sonic-Cyborg Performance through the Hybrid Body in Teka-Mori and Why Should Our Bodies End at the Skin? Leonardo Music Journal, 29, 83-87.

Hsu, A., & Kemper, S. (2019). The Hybrid Body and Sonic-Cyborg Performance in Why Should Our Bodies End at the Skin? Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Tempe, Arizona, USA. https://doi.org/10.1145/3294109.3301255

Hsu, A., & Kemper, S. T. (2015). Kinesonic Composition as Choreographed Sound: Composing Gesture in Sensor-Based Music Proceedings of the International Conference on Computer Music, ICMC, University of North Texas.

Hsu, A. Y., & Kemper, S. T. (2010). Shadows no. 4: belly dance and interactive electroacoustic musical performance CHI ’10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, Georgia, USA. https://doi.org/10.1145/1753846.1753929

Hsueh, S., Alaoui, S. F., & Mackay, W. E. (2019). Understanding Kinaesthetic Creativity in Dance Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland Uk. https://doi.org/10.1145/3290605.3300741

Iwadate, Y., Inoue, M., Suzuki, R., Hikawa, N., Makino, M., & Kanemoto, Y. (2000). MIC Interactive Dance System-an emotional interaction system KES'2000. Fourth International Conference on Knowledge-Based Intelligent Engineering Systems and Allied Technologies. Proceedings, Brighton, UK.

Jaimovich, J. (2016). Emovere: Designing sound interactions for biosignals and dancers Proceedings of the International Conference on New Interfaces for Musical Expression, Brisbane, Australia.

Jaimovich, J., & Morand, F. (2019). Shaping the biology of emotion: Emovere, an interactive performance. International Journal of Performance Arts and Digital Media, 15(1), 35-52. https://doi.org/10.1080/14794713.2018.1563354

James, J., Ingalls, T., Qian, G., Olsen, L., Whiteley, D., Wong, S., & Rikakis, T. (2006). Movement-based interactive dance performance Proceedings of the 14th ACM international conference on Multimedia,

Jap, L. (2019). Mapping detected periodic dance movements to control tempo in the music playback of Electronic Dance Music [Master’s Thesis, KTH - Kungliga Tekniska Högskolan]. Stockholm.

Jap, L., & Holzapfel, A. (2019, May 28-31). Real-time Mapping of Periodic Dance Movements to Control Tempo in Electronic Dance Music 16th Sound & Music Computing Conference, SMC2019, Malaga, Spain.

Jarvis, I., & Nort, D. V. (2018). Posthuman Gesture Proceedings of the 5th International Conference on Movement and Computing, Genoa, Italy.

Jensenius, A. R. (2015). Microinteraction in music/dance performance The International Conference on New Interfaces for Musical Expression, Baton Rouge, LA.

Jensenius, A. R., & Bjerkestrand, K. A. V. (2012). Exploring Micromovements with Motion Capture and Sonification Berlin, Heidelberg.

Jessop, E. (2015). Capturing the Body Live: A Framework for Technological Recognition and Extension of Physical Expression in Performance. Leonardo, 48(1), 32-38. https://doi.org/10.1162/LEON_a_00935

Jewett, J. (2005, November 5-6). REST/LESS: Performing Interactivity in Dance, Music and Text Sound Moves - An International Conference on Music and Dance, Roehampton University, London, UK.

Johnston, A. (2013). Fluid simulation as full body audio-visual instrument New Interfaces for Musical Expression, NIME’13, Daejeon & Seoul, Korea.

Johnston, A. (2015). Conversational Interaction In Interactive Dance Works. Leonardo, 48(3), 296-297.

Johnston, A. (2015). Conceptualising interaction in live performance: reflections on ‘Encoded' Proceedings of the 2nd International Workshop on Movement and Computing, Vancouver.

Jones, S. (2004). Philippa Cullen: Dancing the Music. Leonardo Music Journal, 14, 65-73. https://doi.org/10.1162/0961121043067307

Jung, D., Jensen, M. H., Laing, S., & Mayall, J. (2012). .cyclic.: an interactive performance combining dance, graphics, music and kinect-technology Proceedings of the 13th International Conference of the NZ Chapter of the ACM's Special Interest Group on Human-Computer Interaction, Dunedin, New Zealand.

Jung, D., Laing, S., Jensen, M. H., Hunkin, P., Löf, A., & Tims, N. (2011). Requirements on dance-driven 3-D camera interaction: a collaboration between dance, graphic design and computer science Proceedings of the 12th Annual Conference of the New Zealand Chapter of the ACM Special Interest Group on Computer-Human Interaction, Hamilton, New Zealand.

Jung, H. Y., Kim, D., & Kim, H. (2015). Study on Interactive Dance Performance based on Wearable Sensor Technology. Advanced Science and Technology Letters, 96, 14-18. https://doi.org/10.14257/astl.2015.96.04

Jung, J. I. (2018). Bridging Abstract Sound and Dance Ideas with Technology: Interactive Dance Composition as Practice-Based Research International Conference on Live Interfaces, Porto.

Jung, J. I. (2019). Choreographic Sound Composition: Towards a Poetics of Restriction [Doctoral thesis, University of Huddersfield]. Huddersfield.

Källblad, A., Friberg, A., Svensson, K., & Sjöstedt Edelholm, E. (2008). Hoppsa Universum–An interactive dance installation for children Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Genoa, Italy.

Katan, S. (2016). Using Interactive Machine Learning to Sonify Visually Impaired Dancers' Movement Proceedings of the 3rd International Symposium on Movement and Computing, MOCO2016, Paris.

Katan-Schmid, E. (2020). Playing with Virtual Realities: Navigating Immersion within Diverse Environments (Artist-Led Perspective). Body, Space & Technology, 19(1). https://doi.org/10.16995/bst.341

Kepner, L. S. (1997). Dance and digital media: Troika Ranch and the art of technology. Digital Creativity, 8(1), 11-19.

Kim, H., & Landay, J. A. (2018). Aeroquake: Drone augmented dance [Conference Paper]. DIS 2018 - Proceedings of the 2018 Designing Interactive Systems Conference, Hong Kong. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85054038341&doi=10.1145%2f3196709.3196798&partnerID=40&md5=33bf78702a1d76a8b1e34799673bd8bb

Kim, Y., Jung, D., Park, S., Chi, J., Kim, T., & Lee, S. (2008, September 22-24). The Shadow Dancer: A New Dance Interface with Interactive Shoes 2008 International Conference on Cyberworlds, Hangzhou, China.

Klich, R., & Scheer, E. (2012). Multimedia Performance. Palgrave Macmillan.

Knoth, B. M., & Beattie, E. (2018). Movement signals and narrative noise: the development and performance of Antennae (v.2). International Journal of Performance Arts and Digital Media, 14(1), 84-106. https://doi.org/10.1080/14794713.2018.1453918

Kozel, S. (2008). Closer : Performance, Technologies, Phenomenology [Book]. The MIT Press. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=220687&site=ehost-live

Krzyzaniak, M., Akerly, J., Mosher, M., Yildirim, M., & Paine, G. (2014). Separation: Short Range Repulsion Proceedings of the International Conference on New Interfaces for Musical Expression, NIME’2014, Goldsmiths, University of London, UK.

Kumlin, T., & Lindell, R. (2017). Biosignal augmented embodied performance [Conference Paper]. 12th International Audio Mostly Conference, London. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85038382634&doi=10.1145%2f3123514.3123547&partnerID=40&md5=e1bbadb94037120d1c5cd7795bd9b2f9

Lamounier, N., Naveda, L., & Bicalho, A. (2019, June 3-6). The design of technological interfaces for interactions between music, dance and garment movements Proceedings of the International Conference on New Interfaces for Musical Expression, NIME, Porto Alegre, Brazil.

Landry, S. (2019). Interactive Sonification Strategies for the Motion and Emotion of Dance Performances [PhD, Michigan Technological University].

Landry, S., & Jeon, M. (2017, June 20-23). Participatory design research methodologies: A case study in dancer sonification The 23rd International Conference on Auditory Display (ICAD 2017), Pennsylvania State University.

Landry, S., Ryan, J. D., & Jeon, M. (2014). Design issues and considerations for dance-based sonification Proceedings of the International Conference on Auditory Display, ICAD, New York, NY, USA.

Lanzalone, S. (2000). Hidden grids: paths of expressive gesture between instruments, music and dance. Organised Sound, 5(1), 17-26.

Latulipe, C., Gonzalez, B., Word, M., Huskey, S., & Wilson, D. (2019). Moderate Recursion: A Digital Artifact of Interactive Dance Cham.

Latulipe, C., & Huskey, S. (2008). Dance.Draw: exquisite interaction Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction - Volume 2, Liverpool, United Kingdom.

Latulipe, C., Wilson, D., Huskey, S., Gonzalez, B., & Word, M. (2011). Temporal integration of interactive technology in dance: creative process impacts Proceedings of the 8th ACM conference on Creativity and cognition, Atlanta, Georgia, USA.

Latulipe, C., Wilson, D., Huskey, S., Word, M., Carroll, A., Carroll, E., Gonzalez, B., Singh, V., Wirth, M., & Lottridge, D. (2010). Exploring the design space in technology-augmented dance CHI '10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, Georgia, USA.

Lee, E., Enke, U., Borchers, J., & de Jong, L. (2007). Towards rhythmic analysis of human motion using acceleration-onset times New interfaces for musical expression, NIME07, New York, New York.

Lee, J.-s., & Yeo, W. S. (2012). Real-time Modification of Music with Dancer's Respiration Pattern Proceedings of the International Conference on New Interfaces for Musical Expression, NIME’12, Ann Arbor, MI.

Loke, L., & Robertson, T. (2013). Moving and making strange: An embodied approach to movement-based interaction design. ACM Trans. Comput.-Hum. Interact., 20(1), 1-25. https://doi.org/10.1145/2442106.2442113

Ludovico, L. A., El Raheb, K., & Ioannidis, Y. (2013, October 15-18). An XML-based Web Interface to Present and Analyze the Music Aspect of Dance International Symposium on Computer Music Multidisciplinary Research, Marseille, France.

Lynch, A., Majeed, B., O'flynn, B., Barton, J., Murphy, F., Delaney, K., & O'Mathuna, S. (2005). A wireless inertial measurement system (WIMS) for an interactive dance environment. Journal of Physics: Conference Series, 15(1), 95-100. https://doi.org/10.1088/1742-6596/15/1/016

MacCallum, J., & Naccarato, T. (2015). The impossibility of control: real-time negotiations with the heart. Electronic Visualisation and the Arts (EVA 2015), 184-191.

Mandilian, L., Diefenbach, P., & Kim, Y. (2010). Information overload: a collaborative dance performance. IEEE MultiMedia(1), 8-13.

Masu, R., Correia, N. N., Jurgens, S., Druzetic, I., & Primett, W. (2019, October 23-25). How do Dancers Want to Use Interactive Technology? ARTECH19 - 9th International Conference on Digital and Interactive Arts, Braga, Portugal.

Masu, R., Correia, N. N., Jurgens, S., Feitsch, J., & Romão, T. (2020). Designing interactive sonic artefacts for dance performance: an ecological approach AM'20: Audio Mostly, Graz, Austria.

Masu, R., Pajala-Assefa, H., Correia, N. N., & Romão, T. (2021). Full-Body Interaction in a Remote Context: Adapting a Dance Piece to a Browser-Based Installation 10th International Conference on Digital and Interactive Arts, Aveiro, Portugal, Portugal. https://doi.org/10.1145/3483529.3483747

McCallum, L., & Fiebrink, R. (2019, 4 May). Supporting Feature Engineering in End-User Machine Learning CHI 2019 Workshop on Emerging Perspectives in Human-Centered Machine Learning, Glasgow, United Kingdom.

McNeilly, J. (2014). A phenomenology of chunky move's glow: Moves toward a digital dramaturgy. Australasian Drama Studies(65), 53-76.

Meador, W. S., Rogers, T. J., O'Neal, K., Kurt, E., & Cunningham, C. (2004). Mixing dance realities: collaborative development of live-motion capture in a performing arts environment. Computers in Entertainment (CIE), 2(2), 1-15. https://doi.org/10.1145/1008213.1008233

Mentis, H. M., & Johansson, C. (2013). Seeing movement qualities Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France.

Miller, L. E. (2001). Cage, Cunningham, and Collaborators: The Odyssey of "Variations V". The Musical Quarterly, 85(3), 545-567. http://www.jstor.org/stable/3600996

Moraitis, E. (2019). Symbiotic Synergies: Adaptive Framework for Polydisciplinary Collaboration in Performance Practice [Doctoral thesis, University of Salford]. Salford, UK. https://usir.salford.ac.uk/id/eprint/51263/1/EmmanuilMoraitis-SymbioticSynergies.pdf

Morales-Manzanares, R., Morales, E. F., Dannenberg, R., & Berger, J. (2001). SICIB: An interactive music composition system using body movements. Computer Music Journal, 25(2), 25-36.

Moriaty, M. (2020). Symbiosis: a biological taxonomy for modes of interaction in dance-music collaborations Proceedings of the International Conference on New Interfaces for Musical Expression, NIME-20, Birmingham, UK.

Moriaty, M. (2020). Interspecific Interactions: interaction modes between sound and movement in collaborative performance. In R. Earnshaw, S. Liggett, P. Excell, & D. Thalmann (Eds.), Technology, Design and the Arts-Opportunities and Challenges (pp. 121-138). Springer Open.

Moriaty, M. (2021). Identifying modes of interaction in dance-music collaborations Electroacoustic Music Studies Network, EMS, Leicester, UK.

Moura, J. M., Sousa, J., Branco, P., & Marcos, A. (2008). You Move You Interact: a full-body dance in-between reality and virtuality Proceedings of Artech 2008–4th International Conference on Digital Arts, Porto, Portugal.

Mullis, E. (2013). Dance, Interactive Technology, and the Device Paradigm. Dance Research Journal, 45(3), 111-123. https://doi.org/10.1017/S0149767712000290

Murray-Browne, T. (2012). Interactive music: Balancing creative freedom with musical development [Doctoral dissertation, Queen Mary University of London]. London.

Murray-Browne, T., Mainstone, D., Bryan-Kinns, N., & Plumbley, M. D. (2010). The Serendiptichord: A wearable instrument for contemporary dance performance 128th Audio Engineering Society Convention, London.

Murray-Browne, T., Mainstone, D., Bryan-Kinns, N., & Plumbley, M. D. (2011). The medium is the message: Composing instruments and performing mappings Proceedings of the International Conference on New Interfaces for Musical Expression, NIME, Oslo, Norway.

Murray-Browne, T., Mainstone, D., Bryan-Kinns, N., & Plumbley, M. D. (2013). The Serendiptichord: Reflections on the Collaborative Design Process between Artist and Researcher. Leonardo, 46(1), 86-87. https://doi.org/10.1162/LEON_a_00494

Murray-Browne, T., & Tigas, P. (2021). Emergent Interfaces: Vague, Complex, Bespoke and Embodied Interaction between Humans and Computers. Applied Sciences, 11(18), 8531. https://www.mdpi.com/2076-3417/11/18/8531

Murray-Browne, T., & Tigas, P. (2021). Latent Mappings: Generating Open-Ended Expressive Mappings Using Variational Autoencoders Proceedings of the International Conference on New Interfaces for Musical Expression NYU Shanghai, China.

Naccarato, T. (2019). Re/contextualization: On the critical appropriation of technologies as artistic practice [Doctoral thesis, Coventry University].

Naccarato, T. J., & Maccallum, J. (2017). Critical appropriations of biosensors in artistic practice Proceedings of the 4th International Conference on Movement Computing, London, United Kingdom.

Nagata, N., Okumoto, K., Iwai, D., Toro, F., & Inokuchi, S. (2005). Analysis and Synthesis of Latin Dance Using Motion Capture Data. In K. Aizawa, Y. Nakamura, & S. i. Satoh (Eds.), Advances in Multimedia Information Processing - PCM 2004 (Vol. 3333, pp. 39-44). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-30543-9_6

Naveda, L., & Santana, I. (2014). “Topos” toolkit for Pure Data: exploring the spatial features of dance gestures for interactive musical applications International Computer Music Conference, Athens.

Ng, K. (2002). Sensing and mapping for interactive performance. Organised Sound, 7(2), 191-200.

Niemetz, A., Brown, C., & Gander, P. (2011). Performing Sleep/Wake Cycles: An Arts-Science Dialogue through Embodied Technologies. Body, Space & Technology, 10(1). https://doi.org/10.16995/bst.91

Niewiadomski, R., Mancini, M., Cera, A., Piana, S., Canepa, C., & Camurri, A. (2019). Does embodied training improve the recognition of mid-level expressive movement qualities sonification? Journal on Multimodal User Interfaces, 13(3), 191-203.

Nixdorf, J., & Gerhard, D. (2006). Real-Time Sound Source Spatialization as used in Challenging Bodies: Implementation and Performance NIME, Paris, France.

O’Flynn, B., Torre, G., Fernstrom, M., Winkler, T., Lynch, A., Barton, J., Angove, P., & O’Mathuna, S. C. (2007). Celeritas — A Wearable Sensor System for Interactive Digital Dance Theatre 4th International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2007), Aachen University, Germany.

Otondo, F. (2018). Using mobile sound to explore spatial relationships between dance and music performance. Digital Creativity, 29(2-3), 115-128.

Otondo, F., & Torres, R. (2016). Sound vest for dance performance Proceedings of the 2016 International Computer Music Conference,

Paine, G. (2009). Pools, Pixies and Potentials International Symposium on Electronic Arts,

Palacio, P., & Bisig, D. (2013). Algorithmic and aesthetic interrelations in the dance piece Stocos Proceedings of the International Symposium “Dance and Music: the Art of the Encounter”, Lyon, France.

Palacio, P., & Bisig, D. (2014). Neural Narratives1: Phantom Limb. Connecting cognitive neurosciences, sound synthesis, generative video and dance Congreso Internacional Espacios Sonoros y Audiovisuales 2013, UAM, Madrid, Spain.

Palacio, P., & Bisig, D. (2017). Piano & Dancer - Interaction between a dancer and an acoustic instrument [Conference Paper]. 4th International Conference on Movement Computing, London. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85037687847&doi=10.1145%2f3077981.3078052&partnerID=40&md5=f35cbd7b67b95d2e8a0ee6824a3ec92d

Paradiso, J., Hu, E., & Hsiao, K. Y. (1998). Instrumented footwear for interactive dance Proc. of the XII Colloquium on Musical Informatics, Gorizia, Italy.

Paradiso, J., & Sparacino, F. (1997). Optical tracking for music and dance performance Fourth Conference on Optical 3D Measurement Techniques, ETH, Zurich.

Paradiso, J. A. (1999). The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance. Journal of New Music Research, 28(2), 130-149. https://doi.org/10.1076/jnmr.28.2.130.3119

Paradiso, J. A. (2006). Some novel applications for wireless inertial sensors Proc NSTI Nanotech, Boston, MA.

Paradiso, J. A., Hsiao, K.-Y., Benbasat, A. Y., & Teegarden, Z. (2000). Design and implementation of expressive footwear. IBM Systems Journal, 39(3&4), 511-529.

Paradiso, J. A., Hsiao, K.-Y., & Hu, E. (1999). Interactive Music for Instrumented Dancing Shoes International Conference of Computer Music, ICMC'99, Bejing China.

Paradiso, J. A., Hsiao, K.-y., Strickon, J., Lifton, J., & Adler, A. (2000). Sensor systems for interactive surfaces. IBM Systems Journal, 39(3&4), 892-914.

Paradiso, J. A., & Hu, E. (1997, October 13-14). Expressive footwear for computer-augmented dance performance Digest of Papers. First International Symposium on Wearable Computers, Cambridge, MA.

Paradiso, J. A., Morris, S. J., Benbasat, A. Y., & Asmussen, E. (2004). Interactive therapy with instrumented footwear CHI '04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria.

Park, C., Chou, P. H., & Sun, Y. (2006, March 13-17). A wearable wireless sensor platform for interactive dance performances Fourth Annual IEEE International Conference on Pervasive Computing and Communications, PERCOM'06,

Park, S. H., Kim, T. W., Jung, D., Chi, J. M., Kim, Y. J., & Lee, S. (2008). The Shadow Dancer: An interactive performance system with a foot interface 18th International Conference on Artificial Reality and Telexistence, ICAT, Yokohama, Japan.

Pinkston, R. F. (1994). A touch sensitive dance floor/MIDI controller. The Journal of the Acoustical Society of America, 96(5), 3302-3302. https://doi.org/10.1121/1.410820

Plant, N., Hilton, C., Gillies, M., Fiebrink, R., Perry, P., González Díaz, C., Gibson, R., Martelli, B., & Zbyszynski, M. (2021, February 14–17). Interactive Machine Learning for Embodied Interaction Design: A tool and methodology Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, TEI’21, Salzburg, Austria.

Pohl, H., & Hadjakos, A. (2010). Dance pattern recognition using dynamic time warping Sound and Music Computing Conference, Barcelona.

Popat, S. (2001). Interactive dance-making: online creative collaborations. Digital Creativity, 12(4), 205-214.

Portanova, S. (2013). Moving Without a Body: Digital Philosophy and Choreographic Thoughts [Book]. The MIT Press. http://search.ebscohost.com/login.aspx?direct=true&db=e000xww&AN=562439&site=ehost-live

Povall, R. (1998). Technology is with us. Dance Research Journal, 30(1), 1-4.

Qian, G., Guo, F., Ingalls, T., Olson, L., James, J., & Rikakis, T. (2004). A gesture-driven multimodal interactive dance system ICME'04. 2004 IEEE International Conference on Multimedia and Expo,

Qian, G., James, J., Ingalls, T., Rikakis, T., Rajko, S., Wang, Y., Whiteley, D., & Guo, F. (2006). Human Movement Analysis for Interactive Dance. In H. Sundaram, M. Naphade, J. Smith, & Y. Rui (Eds.), Image and Video Retrieval (Vol. 4071, pp. 499-502). Springer Berlin Heidelberg. https://doi.org/10.1007/11788034_53

Quay, Y. d., Skogstad, S., & Jensenius, A. (2011). Dance Jockey: Performing Electronic Music by Dancing. Leonardo Music Journal, 21, 11-12. https://doi.org/10.1162/LMJ_a_00052

Raheb, K. E., Tsampounaris, G., Katifori, A., & Ioannidis, Y. (2018). Choreomorphy: A whole-body interaction experience for dance improvisation and visual experimentation Proceedings of the 2018 International Conference on Advanced Visual Interfaces, AVI’18, Castiglione della Pescaia, Italy.

Rizzo, A., El Raheb, K., Whatley, S., Cisneros, R. M., Zanoni, M., Camurri, A., Viro, V., Matos, J.-M., Piana, S., & Buccoli, M. (2018). WhoLoDancE: Whole-body Interaction Learning for Dance Education Proceedings of the Workshop on Cultural Informatics co-located with the EUROMED International Conference on Digital Heritage 2018 (EUROMED 2018), Nicosia, Cyprus.

Robles, C. (2011). The Use of Bio-interfaces in Interactive Multimedia Works: Two Examples. Body, Space & Technology, 10(1). https://doi.org/10.16995/bst.95

Rovan, J. B., Wechsler, R., & Weiss, F. (2001). Seine hohle Form: Artistic Collaboration in an Interactive Dance and Music Performance Environment. Crossings: eJournal of Art and Technology, 1(2). http://crossings.tcd.ie/issues/1.2/Rovan/

Rovan, J. B., Wechsler, R., & Weiss, F. (2001). Seine hohle Form, a project report. Body, Space and Technology Journal, 2(1). http://people.brunel.ac.uk/bst/vol0201/index.html

Salter, C. (2010). Entangled: technology and the transformation of performance. MIT Press.

Salter, C. (2011). Timbral architectures, aurality’s force: sound and music. In S. Spier (Ed.), William Forsythe and the Practice of Choreography (pp. 66-84). Routledge.

Salter, C. L., Baalman, M. J., & Moody-Grigsby, D. (2008). Between Mapping, Sonification and Composition: Responsive Audio Environments in Live Performance. In R. Kronland-Martinet, S. Ystad, & K. Jensen (Eds.), Computer Music Modeling and Retrieval. Sense of Sounds (Vol. 4969, pp. 246-262). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-85035-9_17

Schacher, J. C. (2010). Motion to Gesture to Sound: Mapping for interactive dance New Interfaces for Musical Expression, NIME 2010, Sydney, Australia.

Schacher, J. C. (2016). Moving Music: Exploring Movement-to-Sound Relationships Proceedings of the 3rd International Symposium on Movement and Computing, Thessaloniki, Greece.

Schacher, J. C. (2017). Sound Presence. Performing with Bodies and Technology [PhD Thesis, University of Antwerp]. Antwerp, Belgium.

Schacher, J. C., & Bisig, D. (2014). Watch This! Expressive Movement in Electronic Music Performance Proceedings of the 2014 International Workshop on Movement and Computing,

Schacher, J. C., Bisig, D., & Kocher, P. (2014). The map and the flock: Emergence in mapping with swarm algorithms. Computer Music Journal, 38(3), 49-63.

Schacher, J. C., & Stoecklin, A. (2011). Traces - Body, Motion and Sound Proceedings of the International Conference on New Interfaces for Musical Expression, Oslo, Norway.

Schiphorst, T., Sheppard, R., Loke, L., & Lin, C.-C. (2013). Beautiful dance moves: mapping movement, technology & computation Proceedings of the 9th ACM Conference on Creativity & Cognition, Sydney, Australia.

Scholz, R. E. P., & Ramalho, G. L. (2019). Lowering the Usability Entry Barrier to Interactive Poetics Experimentation in Dance. Interacting with Computers, 31(1), 59-82. https://doi.org/10.1093/iwc/iwz004

Shaw, N. Z. (2011). Synchronous objects, choreographic objects, and the translation of dancing ideas. In G. Klein & S. Noeth (Eds.), Emerging Bodies: The Performance of Worldmaking in Dance and Choreography (pp. 207-224). Transcript Verlag.

Siegel, W. (1999). Two Compositions for Interactive Dance International Computer Music Conference (ICMC), Beijing.

Siegel, W. (2009). Dancing the Music: Interactive Dance and Music. In R. T. Dean (Ed.), The Oxford Handbook of Computer Music (pp. 191-213). Oxford University Press.

Siegel, W., & Jacobsen, J. (1998). The Challenges of Interactive Dance: An Overview and Case Study. Computer Music Journal, 22(4), 29-43.

Smith, M., & Roche, J. (2015). Perceiving the Interactive Body in Dance: Enhancing kinesthetic empathy through art objects. Body, Space & Technology, 14. https://doi.org/10.16995/bst.35

Sparacino, F., Davenport, G., & Pentland, A. (2000). Media in performance: Interactive spaces for dance, theater, circus, and museum exhibits. IBM Systems Journal, 39(3&4), 479-510.

Sparacino, F., Wren, C., Davenport, G., & Pentland, A. (1999). Augmented performance in dance and theater International Dance and Technology, IDAT99, Tempe, AZ.

Suzuki, R., Iwadate, Y., Inoue, M., & Woo, W. (2000, 8-11 Oct.). MIDAS: MIC Interactive DAnce System 2000 IEEE international conference on systems, man and cybernetics. 'cybernetics evolving to systems, humans, organizations, and their complex interactions', Nashville, TN.

Svenns, T. (2020). SENSITIV: Designing for Interactive Dance and the Experience of Control [Master’s thesis, KTH]. Stockholm, Sweden. https://www.diva-portal.org/smash/get/diva2:1466897/FULLTEXT01.pdf

Swendsen, P., & Topper, D. (2006). Strategizing Real-Time Music/Dance Interactions Proceedings of the 1st Nordic Music Technology Conference (NoMute 2006), Trondheim, Norway.

Tilmanne, J., d'Alessandro, N., Barborka, P., Bayansar, F., Bernardo, F., Fiebrink, R., Heloir, A., Hemery, E., Laraba, S., & Moinet, A. (2016). Prototyping a New Audio-Visual Instrument Based on Extraction of High-Level Features on Full-Body Motion 11th International Summer Workshop on Multimodal Interfaces (eNTERFACE'15), Mons, Belgium.

Todoroff, T. (2011). Wireless digital/analog sensors for music and dance performances Proceedings of the International Conference on New Interfaces for Musical Expression, NIME'11, Oslo, Norway.

Toenjes, J. (2007). Composing for Interactive Dance: Paradigms for Perception. Perspectives of New Music, 45(2), 28-50. http://www.jstor.org/stable/25164655

Toenjes, J. (2009, June 3-6). Natural Materials on Stage: Custom Controllers for Aesthetic Effect Proceedings of the International Conference on New Interfaces for Musical Expression, NIME, Pittsburgh, PA.

Toenjes, J., Beck, K., Reimer, M. A., & Mott, E. (2016). Dancing with mobile devices: The lait application system in performance and educational settings. Journal of Dance Education, 16(3), 81-89.

Torpey, P. A., & Jessop, E. N. (2009). Disembodied performance CHI '09 Extended Abstracts on Human Factors in Computing Systems, Boston, MA, USA.

Torre, G., Fernstrom, M., & Cahill, M. (2007). An accelerometer and gyroscope based sensor system for dance performance.

Tragtenberg, J., Calegario, F., Cabral, G., & Ramalho, G. (2019). Towards the Concept of “Digital Dance and Music Instrument” Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil.

Ulyate, R., & Bianciardi, D. (2002). The interactive dance club: Avoiding chaos in a multi-participant environment. Computer Music Journal, 26(3), 40-49.

Ungvary, T., Waters, S., & Rajka, P. (1992). NUNTIUS: A computer system for the interactive composition and analysis of music and dance. Leonardo, 25(1), 59-68.

Valverde, I., & Cochrane, T. (2017). Senses Places: Soma-tech mixed-reality participatory performance installation/environment [Conference Paper]. 8th International Conference on Digital Arts, ARTECH 2017, Macau, China. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85032491826&doi=10.1145%2f3106548.3106613&partnerID=40&md5=5260373363569959041c6f6306604860

van Hout, B., Giacolini, L., Hengeveld, B., Funk, M., & Frens, J. W. (2014). Experio: a Design for Novel Audience Participation in Club Settings International Conference on New Interfaces for Musical Expression, NIME, London, UK.

Varanda, P. (2016). New Media Dance: Where is the Performance? In C. Fernandes (Ed.), Multimodality and Performance (pp. 187-202). Cambridge Scholars Publishing. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1339017&site=ehost-live

Varanda, P. (2019). Soi Moi: The Techno-Soma-Aesthetics of a Dance for the iPhone. Body, Space & Technology, 18(1). https://doi.org/10.16995/bst.314

Vass-Rhee, F. (2011). Dancing music: The intermodality of the Forsythe Company. In S. Spier (Ed.), William Forsythe and the Practice of Choreography. It Starts From Any Point (pp. 73-89). Routledge.

Ventura, P., & Bisig, D. (2016). Algorithmic Reflections on Choreography. Human Technology, 12(2), 252–288. https://doi.org/10.17011/ht/urn.201611174656

Viaud-Delmon, I., Mason, J., Haddad, K., Noisternig, M., Bevilacqua, F., & Warusfel, O. (2011). A sounding body in a sounding space: the building of space in choreography–focus on auditory-motor interactions. Dance Research Journal, 29(2), 433-449.

Vincent, J. B., Vincent, C., Vincs, K., & McCormick, J. (2016). Navigating control and illusion: functional interactivity versus ‘faux-interactivity’ in transmedia dance performance. International Journal of Performance Arts and Digital Media, 12(1), 44-60. https://doi.org/10.1080/14794713.2016.1161955

Vincs, K., & McCormick, J. (2010). Touching Space: Using Motion Capture and Stereo Projection to Create a “Virtual Haptics” of Dance. Leonardo, 43(4), 359-366. https://doi.org/10.1162/LEON_a_00009

Visi, F. (2017). Methods and technologies for the analysis and interactive use of body movements in instrumental music performance [Doctoral thesis, Plymouth University]. Plymouth.

Visi, F., Coorevits, E., Schramm, R., & Miranda, E. R. (2017). Musical instruments, body movement, space, and motion data: music as an emergent multimodal choreography. Human Technology, 13(1), 58–81. https://doi.org/10.17011/ht/urn.201705272518

Wang, Y., Qian, G., & Rikakis, T. (2005). Robust pause detection using 3D motion capture data for interactive dance Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP'05), Philadelphia, PA.

Wassermann, K., Blanchard, M., Bernardet, U., Manzolli, J., & Verschure, P. F. (2000). Roboser-An Autonomous Interactive Musical Composition System The International Computer Music Conference, ICMC,

Wechsler, R. (1997). O Body Swayed to Music (and Vice Versa): Roles for the Computer in Dance. Leonardo, 30(5), 385-389. https://muse.jhu.edu/article/617692

Wechsler, R. (1998). Computers and Dance: Back to the Future. Dance Research Journal, 30(1), 4-10. https://doi.org/10.2307/1477888

Wechsler, R. (2006). Artistic Considerations in the Use of Motion Tracking with Live Performers: a Practical Guide In S. Broadhurst & J. Machon (Eds.), Performance and Technology: Practices of Virtual Embodiment and Interactivity (pp. 60-77). Palgrave Macmillan.

Wechsler, R., Weiß, F., & Dowling, P. (2004). EyeCon: A Motion Sensing Tool for Creating Interactive Dance, Music, and Video Projections Proceedings of the AISB 2004 COST287-ConGAS Symposium on Gesture Interfaces for Multimedia Systems, Leeds, UK.

Wechsler, R., Weiss, F., & Rovan, J. B. (2001). Artistic Collaboration in an Interactive Dance and Music Performance Environment: Seine hohle Form, a project report. Body, Space & Technology, 2(1). https://doi.org/10.16995/bst.255

Wijnans, S. (2008). ‘Sound Skeleton’: Interactive transformation of improvised dance movements into a spatial sonic disembodiment. International Journal of Performance Arts and Digital Media, 4(1), 27-44.

Wijnans, S. (2011). ‘TranSonic’ Perception in Interactive ChoreoSonic Performance Practice. Body, Space & Technology, 10(2). https://doi.org/10.16995/bst.81

Wilson, J. A., & Bromwich, M. A. (2000). Lifting Bodies: interactive dance–finding new methodologies in the motifs prompted by new technology–a critique and progress report with particular reference to the Bodycoder System. Organised Sound, 5(1), 9-16.

Wilson-Bokowiec, J. (2010). Physicality: The techne of the physical in interactive digital performance. International Journal of Performance Arts and Digital Media, 6(1), 61-75.

Winkler, T. (1995). Making Motion Musical: Gesture Mapping Strategies for Interactive Computer Music Proceedings of the International Computer Music Conference, ICMC, Banff, Canada.

Winkler, T. (1995). Strategies for Interaction: Computer Music, Performance, and Multimedia Proceedings of the 1995 Connecticut College Symposium on Arts and Technology,

Winkler, T. (1997). Creating interactive dance with the very nervous system The 1997 Connecticut college symposium on art and technology New London, Connecticut.

Winkler, T. (1998). Motion-sensing Music: Artistic and Technical Challenges In Two Works For Dance International Computer Music Conference, ICMC,

Winkler, T. (1998). Composing Interactive Music: Techniques and Ideas Using Max [Book]. MIT Press. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1430&site=ehost-live

Winkler, T. (2002). Fusing movement, sound, and video in Falling Up, an interactive dance/theatre production Proceedings of the 2002 conference on New interfaces for musical expression, Dublin, Ireland.

Winkler, T. (2003). Movement-Activated Sound and Video Processing for Multimedia Dance/Theatre International Computer Music Conference, ICMC, San Francisco, CA.

Withers, M. (2004). Dance of the Auroras – Fire in the Sky. Body, Space and Technology Journal, 4(1).

Woolford, K., & Guedes, C. (2007). Particulate Matters: Generating Particle Flows from Human Movement ACM International Conference on Multimedia, Augsburg, Germany.

Xu, Z. (2020). Choreography of Sonic Chopsticks and Intervention of Digital Technology with Dancing Bodies. Body, Space & Technology, 19(1). https://doi.org/10.16995/bst.331

Zhou, Q., Chua, C. C., Knibbe, J., Goncalves, J., & Velloso, E. (2021). Dance and Choreography in HCI: A Two-Decade Retrospective Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan.
