
Data Sonification

Research:

The research conducted during my postdoctoral appointment at the Qualcomm Institute integrates environmental data science and acoustics. I focus on developing and implementing innovative, multi-modal data-rendering techniques to sonify ecosystems and extreme environmental events.

As part of the ALERTCalifornia program, which manages a network of over 1,000 cameras across both urban and remote areas of the state, our team processes and stores terabytes of visual and other raw data daily. My work specifically investigates how to interpret these datasets, using advanced sound synthesis and design methods to create immersive artificial audio representations of natural phenomena such as wind, rain, snowstorms, dust storms, birdsong, and wildfires.
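The core idea of rendering a data series as sound can be illustrated with a minimal parameter-mapping sonification sketch, here in plain Python using only the standard library. Everything in it is an illustrative assumption rather than the project's actual pipeline: the input series stands in for some per-frame statistic extracted from camera imagery, and the pitch range and note duration are arbitrary.

```python
import math
import struct
import wave

def sonify(series, out_path="sonification.wav", sr=22050, note_dur=0.25,
           f_lo=220.0, f_hi=880.0):
    """Parameter-mapping sonification sketch: scale each data value to a
    pitch between f_lo and f_hi and render it as a short sine tone.

    `series` is any numeric sequence -- a hypothetical stand-in for a
    per-frame statistic derived from camera data."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0  # avoid division by zero on constant input
    n = int(sr * note_dur)   # samples per note
    frames = bytearray()
    for v in series:
        # linear map: data value -> frequency in [f_lo, f_hi]
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)
        for i in range(n):
            # short linear fade in/out so notes don't click at boundaries
            env = min(i, n - 1 - i, n // 8) / (n // 8)
            s = int(32767 * env * math.sin(2 * math.pi * freq * i / sr))
            frames += struct.pack("<h", s)  # 16-bit little-endian PCM
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sr)
        w.writeframes(bytes(frames))
    return out_path
```

Calling `sonify([0.0, 0.5, 1.0])` writes a mono WAV of three rising tones; richer mappings (timbre, spatialization, layered synthesis) follow the same value-to-parameter pattern.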

Additionally, I have been working on a project to design and deploy a custom array of acoustic sensors at these locations. These sensors will collect, store, and stream acoustic data, allowing for a deeper understanding of the soundscapes of distinct terrestrial environments and enabling new insights into the impacts of extreme events on the environment.
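A basic building block for collecting, storing, and streaming acoustic data is segmenting the incoming sample stream into fixed-length, timestamped chunks. The sketch below shows that pattern in Python; it is a hypothetical illustration (the sample source, rate, and chunk length are assumptions), and real sensor hardware would supply its own clock and capture API.

```python
import itertools
import time

def chunk_stream(samples, sr=16000, chunk_seconds=10):
    """Group an incoming sample iterator into fixed-length chunks, each
    tagged with a wall-clock timestamp, as one might before writing to
    storage or handing off to a streaming endpoint.

    Hypothetical sketch: `samples` is any iterable of audio samples;
    real deployments would use hardware timestamps and capture drivers."""
    size = sr * chunk_seconds
    it = iter(samples)
    while True:
        chunk = list(itertools.islice(it, size))
        if not chunk:
            return
        yield time.time(), chunk  # (capture timestamp, sample chunk)
```

The final chunk may be shorter than the rest; downstream consumers can pad it or flag it as a partial segment.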


Hailstorm

Data from camera: GirardRidge1

Location: Shasta County, CA
Date: May 09, 2022
Timeframe: 1600-2000


Wildfire

Data from camera: ShirleyPeak2

Location: Shirley Peak, CA
Date: August 20, 2021
Timeframe: 1448-1510


Dust Storm

Data from camera: Monument

Location: Monument Peak, CA
Date: October 06, 2022
Timeframe: 1600-1900


Ice

Data from camera: Antelope Eagle Lake

Location: Antelope Eagle Lake, CA
Date: November 02, 2022
Timeframe: 1300-1700
