Data Sonification
My research as a postdoctoral scholar at the Qualcomm Institute integrates environmental data science and acoustics. I focus on developing and implementing innovative, multimodal data-rendering techniques to sonify ecosystems and extreme environmental events.
As part of the ALERTCalifornia program, which manages a network of over 1,000 cameras across both urban and remote areas of the state, our team processes and stores terabytes of imagery and other raw sensor data daily. My work investigates how to interpret these datasets through advanced sound synthesis and sound design, creating immersive, synthesized audio representations of natural phenomena such as wind, rain, snowstorms, dust storms, bird songs, and wildfires.
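To make the idea concrete, here is a minimal, illustrative sketch of one common technique, parameter-mapping sonification, in Python: a wind-speed time series modulates the loudness and brightness of filtered noise. The data, parameters, and function names are hypothetical simplifications for illustration, not the project's actual rendering pipeline.

```python
# Minimal parameter-mapping sonification sketch (illustrative only):
# a wind-speed time series modulates the amplitude and low-pass cutoff
# of white noise, a simple way to render "wind" audibly.
import numpy as np
from scipy.signal import butter, lfilter
from scipy.io import wavfile

SR = 44100  # audio sample rate (Hz)

def sonify_wind(speeds_mps, seconds_per_sample=1.0, out="wind.wav"):
    """Render a wind-speed series (m/s) as modulated noise and write a WAV."""
    n = int(len(speeds_mps) * seconds_per_sample * SR)
    # Interpolate the slow environmental series up to audio rate.
    env = np.interp(np.linspace(0, len(speeds_mps) - 1, n),
                    np.arange(len(speeds_mps)), np.asarray(speeds_mps, float))
    env = env / max(env.max(), 1e-9)           # normalize to 0..1
    noise = np.random.randn(n)
    # Brighter (higher cutoff) noise for stronger average wind.
    cutoff = 200 + 4000 * env.mean()            # Hz; one fixed filter for simplicity
    b, a = butter(2, cutoff / (SR / 2), btype="low")
    audio = lfilter(b, a, noise) * env          # amplitude follows wind speed
    audio /= max(np.abs(audio).max(), 1e-9)     # peak-normalize
    wavfile.write(out, SR, (audio * 32767).astype(np.int16))

# Example: a gust building and fading over ten seconds.
sonify_wind([1, 2, 5, 9, 12, 10, 6, 3, 2, 1])
```

In practice, each mapped parameter (pitch, brightness, spatial position, density) can be driven by a different environmental variable; the sketch maps only two for clarity.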
Additionally, I have been designing and deploying a custom array of acoustic sensors at these camera sites. These sensors will collect, store, and stream acoustic data, supporting a deeper understanding of the soundscapes of distinct terrestrial environments and enabling new insight into how extreme events affect them.
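As a rough illustration of what such a sensor node might do in software, the sketch below records fixed-length audio segments in a loop and writes each to a timestamped WAV file. It assumes the Python sounddevice and soundfile packages and a connected microphone; it is a hypothetical sketch, not the deployed hardware's firmware.

```python
# Hypothetical edge-node capture loop (sketch only): record fixed-length
# audio segments continuously and write each as a timestamped WAV file
# for later storage or streaming.
import datetime
import sounddevice as sd
import soundfile as sf

SR = 48000          # sample rate (Hz)
SEGMENT_S = 60      # length of each recorded segment (seconds)

def capture_forever(prefix="soundscape"):
    while True:
        # Blocking record of one mono segment.
        audio = sd.rec(int(SEGMENT_S * SR), samplerate=SR, channels=1)
        sd.wait()
        stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
        sf.write(f"{prefix}_{stamp}.wav", audio, SR)  # one file per segment

if __name__ == "__main__":
    capture_forever()
```

A real deployment would add compression, buffering against network outages, and upload logic, but the segment-and-timestamp pattern shown here is the core of most continuous soundscape recorders.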