Real-time Adaptive Beamforming
@ Qualcomm Institute - SonicArts R&D
This project involved designing and prototyping a real-time binaural beamforming system
for a 14-speaker array, scalable to a 4-meter, 62-channel wave field synthesis configuration.
My core contribution was developing the real-time convolution engine in Pure Data,
managing 28 output channels to deliver seamless spatial audio.
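The core idea behind steering audio beams from a speaker array is delay-and-sum beamforming: each speaker plays a delayed copy of the source so the wavefronts align along a chosen direction. The actual engine ran as a convolution patch in Pure Data; the NumPy sketch below only illustrates the underlying delay computation, with the speaker spacing and sample rate chosen as assumed example values, not the installation's real geometry.

```python
# Illustrative sketch of delay-and-sum beam steering for a linear speaker
# array. Geometry constants are assumptions for demonstration, not the
# parameters of the actual installation.
import numpy as np

FS = 48_000     # sample rate in Hz (assumed)
C = 343.0       # speed of sound in m/s
N_SPK = 14      # speakers in the prototype array
SPACING = 0.1   # inter-speaker spacing in meters (assumed)

def steering_delays(angle_deg: float) -> np.ndarray:
    """Per-speaker delays (in samples) that steer a beam toward angle_deg."""
    # Speaker x-positions centered on the array midpoint.
    x = (np.arange(N_SPK) - (N_SPK - 1) / 2) * SPACING
    # Relative arrival-time offsets for a plane wave at the given angle.
    tau = x * np.sin(np.radians(angle_deg)) / C
    tau -= tau.min()  # shift so every delay is non-negative
    return np.round(tau * FS).astype(int)

def beamform(mono: np.ndarray, angle_deg: float) -> np.ndarray:
    """Fan a mono signal out to N_SPK delayed copies (one row per speaker)."""
    delays = steering_delays(angle_deg)
    out = np.zeros((N_SPK, len(mono) + delays.max()))
    for ch, d in enumerate(delays):
        out[ch, d:d + len(mono)] = mono
    return out
```

In a real-time system the per-channel delays would be realized as convolution with delayed impulse responses rather than explicit buffer shifts, which is what a Pure Data convolution engine handles naturally.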
Beyond the backend, I transitioned the project from a technical demo into a living installation. I composed and produced the original soundscapes used in the performance, and with a teammate I integrated a Microsoft Kinect V2 for motion tracking. This transformed the system from a presented demo into an interactive sonic environment in which users physically walk through and manipulate the audio beams with their movement.