This contribution focuses on situated, bodily, and relational listening as it takes shape in contexts of care and rehabilitation, understood as a geography that gives voice to movement and sound form to signals that often go unheard, even by those who produce them. The reflection centres on the different modes of listening activated in the rehabilitation of patients with chronic lower back pain, a common condition that compromises proprioception and limits access to the full experience of the body. Based on this observation, we have developed a system for sonifying back movement that accompanies the Cognitive Functional Therapy (CFT) process. CFT invites meta-reflection on kinetic perception and space: it analyzes pain behaviours and stimulates new bodily narratives, breaking the vicious circle that fuels chronicity.
In this direction, the sonified gesture is not only a functional action but an evolving sound event and a dialogical process between the performer-patient and the listener in a performative spatial context, blending biomechanical analysis and musical creativity.
The system uses wearable IMU sensors (Arduino Nano 33 BLE Sense) positioned on the lumbar spine. The data, sent via Bluetooth, is processed by a Python script that converts it into OSC signals for the Max for Live environment. Here, the gesture is transformed into dynamic and customizable sound feedback, in which each IMU axis interacts with a sound parameter (frequency, timbre, amplitude, effects) from various customizable sound sources (synthesis, virtual instruments, or recordings). Each gesture gives rise to a unique soundprint, making sound a perceptual extension of the body. The intertwining of gesture, sound, and relationship opens up multiple transdisciplinary applications. In the clinical field, it promotes body awareness and supports the therapeutic process; in the performing arts, it transforms the body into a musical and compositional instrument; in education, it allows physical training to be rethought as a participatory and sensory experience.
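The axis-to-parameter mapping stage described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the OSC addresses, axis ranges, and parameter ranges are hypothetical, and each axis is assumed to be rescaled linearly into the range of one sound parameter before being forwarded to Max for Live.

```python
def map_axis(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a sensor reading into a sound-parameter range."""
    value = max(in_min, min(in_max, value))  # clamp to the sensor's range
    norm = (value - in_min) / (in_max - in_min)
    return out_min + norm * (out_max - out_min)

# Hypothetical mapping: orientation angles (degrees) -> sound parameters.
# Format: OSC address -> (axis name, in_min, in_max, out_min, out_max)
MAPPINGS = {
    "/lumbar/frequency": ("pitch", -90.0, 90.0, 110.0, 880.0),   # Hz
    "/lumbar/amplitude": ("roll",  -45.0, 45.0,   0.0,   1.0),
    "/lumbar/cutoff":    ("yaw",  -180.0, 180.0, 200.0, 8000.0), # Hz
}

def imu_to_osc(sample):
    """Turn one IMU sample (dict of axis -> degrees) into OSC address/value pairs."""
    return {addr: map_axis(sample[axis], lo, hi, out_lo, out_hi)
            for addr, (axis, lo, hi, out_lo, out_hi) in MAPPINGS.items()}

msgs = imu_to_osc({"pitch": 0.0, "roll": 22.5, "yaw": 90.0})
# In the real pipeline these pairs would then be sent to Max for Live
# with an OSC client such as python-osc's SimpleUDPClient.
```

Keeping the mapping table as plain data makes the feedback "customizable" in the sense the text describes: swapping a sound source or parameter only means editing one entry, not the transport code.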
The sound feedback, which is not prescriptive but immersive, accompanies the patient in a listening experience that is at once technical, analytical, and emotional. It does not correct, but guides the patient to feel, to reinhabit the space of movement as a place of relationship. In this way, every sound becomes a vibrating gesture, a possibility for dialogue, a listening cure.
We are currently working on expanding the project into the field of performance, applying it to dance, theatre, sculpture and generative music.
We are expanding the system from the lumbar region to a sensor network comprising IMU, EMG, flex, and colour sensors, in order to integrate environmental changes, muscle tension, and movement across multiple joints.
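One way such a heterogeneous network could be kept manageable is to give every channel, whatever its physical unit, a shared normalisation step before the sound mapping. The sketch below is an assumption about architecture, not the project's implementation; channel names and raw ranges are illustrative.

```python
class Channel:
    """A sensor channel with a known raw range, normalised to 0..1."""
    def __init__(self, name, raw_min, raw_max):
        self.name, self.raw_min, self.raw_max = name, raw_min, raw_max

    def normalise(self, raw):
        raw = max(self.raw_min, min(self.raw_max, raw))  # clamp out-of-range readings
        return (raw - self.raw_min) / (self.raw_max - self.raw_min)

# Hypothetical network: one IMU axis, one EMG channel, one flex sensor.
NETWORK = [
    Channel("lumbar_pitch", -90.0, 90.0),    # IMU, degrees
    Channel("erector_emg",    0.0, 1023.0),  # EMG, raw ADC counts
    Channel("knee_flex",    200.0, 800.0),   # flex sensor, raw ADC counts
]

def read_network(raw_frame):
    """One synchronised frame of raw readings -> normalised value per channel."""
    return {ch.name: ch.normalise(raw_frame[ch.name]) for ch in NETWORK}
```

With all channels living in 0..1, muscle tension and joint movement can feed the same mapping tables as the lumbar IMU, so extending the body coverage does not require rewriting the sound layer.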
Object interaction: We are experimenting with sensor integration to enable interaction with stage elements and sculptures, effectively turning the object into a reactive partner that generates sound through touch and proximity.
Instant composition and live electronics applications: We are transitioning from functional sonification to live performance tools where choreographic complexity defines timbre, density, and other sound variables in real time.
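As a hedged sketch of how "choreographic complexity" might define a sound variable in real time, one simple proxy is the spread of recent acceleration magnitudes over a sliding window, rescaled to an event density. The window size, scaling constants, and the choice of standard deviation as the complexity measure are all assumptions for illustration.

```python
from collections import deque
from statistics import pstdev

class ComplexityToDensity:
    """Map movement complexity (std dev of recent motion) to event density."""
    def __init__(self, window=32, max_std=2.0, max_density=16.0):
        self.samples = deque(maxlen=window)   # recent acceleration magnitudes
        self.max_std = max_std                # spread treated as "maximum" complexity
        self.max_density = max_density        # e.g. sound events per second

    def update(self, accel_magnitude):
        """Feed one reading; return the current density value."""
        self.samples.append(accel_magnitude)
        if len(self.samples) < 2:
            return 0.0
        spread = pstdev(self.samples)         # population std dev over the window
        return min(spread / self.max_std, 1.0) * self.max_density
```

Still movement (constant readings) yields zero density, while erratic movement saturates it, which matches the idea of the performer's motion, rather than a fixed score, shaping timbral and rhythmic density.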
Cross-media dialogue: Exploring an ecosystem where biological and kinetic data serve as the common engine for sound, dance and visual installation, creating an extemporaneous and collective creative process.
I (Current): Engineering the sensor network (IMU/EMG) and lumbar-to-body expansion.
II: Interaction prototyping with reactive sculptures and stage elements.
III: Artistic residencies and live electronics calibration.
IV: Full-performance debut and cross-media installation.