A step-by-step explanation of how the experiments were carried out.
This project explores the intersection of sound analysis and non-traditional data visualization by rethinking how sound can be translated into data. While software such as Microsoft Excel and Google Sheets is commonly used for financial or statistical calculations, this project repurposes these tools as platforms for micro sound analysis. By treating audio files as raw data sources, I use spreadsheet charting features to visualize the internal “personality” and behavior of sound in ways that differ from conventional audio software.
Rather than relying on standard waveform graphs, which measure sound on an absolute scale such as decibels (dB), this project focuses on relative dynamics: each sound is analyzed only in relation to itself, not by absolute loudness. Under this framing it becomes clear that sounds at very different perceived intensity levels, such as very soft and very loud sounds, can display similar dynamic patterns or consistent behavior over time.
This demonstrates that a sound’s internal fluctuations and “personality” are not determined by its volume alone, allowing patterns and behaviors to be observed independently of intensity.
Through an iterative process of trial and error, the project reveals that conventional dB-based visualizations often flatten subtle yet expressive aspects of sound. Even sounds perceived as quiet or steady can contain rich, complex internal dynamics when analyzed through this framework. By rethinking what counts as data and how it is structured, data-processing software becomes a creative tool for uncovering hidden patterns and offering new perspectives on everyday sound.
Task:
Analyze an uploaded audio file and convert it into a numerical data table based on loudness over time. Do not interpret the sound. Only measure and translate it using the rules below.
Rules:
Goal: Translate time-based sound variation into structured numerical data using fixed, repeatable rules.
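The full rule set is not spelled out in this excerpt, but the stated goal (fixed, repeatable rules that turn loudness over time into a table, with each sound normalized only against itself) can be sketched as follows. This is a minimal illustration, not the project's actual procedure: the function name `relative_loudness_table`, the 50 ms window size, and the windowed-RMS measure are all assumptions made for the example.

```python
import math

def relative_loudness_table(samples, sample_rate, window_ms=50):
    """Split samples into fixed-length windows, compute RMS loudness per
    window, then normalize against the clip's own loudest window so the
    result expresses relative dynamics, not absolute level.
    (Illustrative sketch; window size and measure are assumptions.)"""
    win = max(1, int(sample_rate * window_ms / 1000))
    rows = []
    for start in range(0, len(samples), win):
        chunk = samples[start:start + win]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        rows.append((start / sample_rate, rms))
    # Normalize to the clip's own peak window (guard against pure silence).
    peak = max(r for _, r in rows) or 1.0
    return [(round(t, 3), round(r / peak, 4)) for t, r in rows]

# Hypothetical usage: a one-second 440 Hz tone that fades out linearly.
sr = 8000
samples = [math.sin(2 * math.pi * 440 * i / sr) * (1 - i / sr)
           for i in range(sr)]
table = relative_loudness_table(samples, sr)
```

Each row of `table` is a (time in seconds, relative loudness 0.0 to 1.0) pair, which could be pasted directly into a spreadsheet column for charting.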