problem
At the time, most live electronic music performances lacked true integration between audio and visual expression. VJs often worked in parallel to musicians, triggering pre-designed visuals that only loosely matched the sound. This separation limited the depth of the audience's sensory engagement and created a gap between the immediacy of live sound improvisation and the visual response. The challenges were:
• How to create a real-time generative system where visuals are not secondary decoration but structurally bound to the music.
• How to present electronic music as a multisensory performance that transforms the perception of both sound and architectural space.
solution
Sonematik addressed this gap by designing a custom generative audiovisual system:
• Data-driven visuals: Synthesizer signals were processed and mapped to generative algorithms, producing forms that moved, distorted, and grew directly from live audio parameters.
• Architectural projection: Instead of screens, building façades became immersive surfaces where algorithmic geometries unfolded, merging art, music, and environment.
• Audience engagement: By unifying music and visuals in real time, the project created an experience that blurred the boundaries between concert, installation, and urban intervention.
The result was a new audiovisual language where sound and image were inseparable: an expressive system emphasizing improvisation, immediacy, and immersion.
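The project's actual mapping pipeline is not documented here, but the core idea of data-driven visuals, deriving a generative parameter directly from a live audio signal, can be sketched minimally. In this illustrative Python fragment (all function names and the scale mapping are hypothetical, not Sonematik's implementation), a frame of audio samples is reduced to its RMS loudness, which then drives the scale of a generative form:

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame (values in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def map_audio_to_scale(frame, base_scale=1.0, max_scale=4.0):
    """Hypothetical mapping: louder audio -> larger generative geometry.

    The RMS level is clamped to [0, 1] and interpolated linearly
    between a resting scale and a maximum scale.
    """
    level = min(rms(frame), 1.0)
    return base_scale + (max_scale - base_scale) * level

# Example input: one 1024-sample frame of a 440 Hz sine at 44.1 kHz,
# peak amplitude 0.5 (RMS of a sine is peak / sqrt(2) ~= 0.354)
sr = 44100
frame = [0.5 * math.sin(2 * math.pi * 440 * n / sr) for n in range(1024)]
scale = map_audio_to_scale(frame)
```

In a live setting this mapping would run once per audio buffer, so the visual parameter updates at audio-callback rate and the geometry responds with no perceptible lag, which is what makes the visuals feel structurally bound to the sound rather than layered on top of it.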

Sonematik #2 - 2012
CalArts - LA - California - US
Sonematik #1 - 2011
II Mostra 3M de Arte Digital - São Paulo - Brazil
Experiment #2
Experiment #1
Sonema by Sonematik @ soundcloud