In the multimedia era, understanding and visualizing audio data has become increasingly valuable—whether you're a musician tweaking a mix, a developer building an interactive app, or just someone curious about the soundscape around you. Today, we’re diving into a neat project that combines the Web Audio API with CanvasJS to create a real-time frequency analyzer, complete with a hearing damage alert.
What’s the Project About?
Imagine a webpage with three sleek column charts: one for bass frequencies (0-200 Hz), another for mids (200-2000 Hz), and a third for treble (2000 Hz and up). These charts update in real-time as your microphone picks up sound, giving you a visual breakdown of the audio spectrum. If the sound gets too loud—say, approaching levels that could harm your hearing—a warning pops up, and the charts turn red.
Breaking Down the Code
Let’s walk through how this audio analyzer comes to life.
Initializing the Charts
In JavaScript, we create three CanvasJS charts, each targeting one of the div containers. Each chart is configured with a column type to display frequency bins as vertical bars, a light2 theme for a crisp look, and sensible axis labels. The dataPoints arrays (one for each range) will hold the amplitude values we’ll update in real-time. Below is the initialization of one of the three charts.
charts.push(new CanvasJS.Chart("bassChart", {
  theme: "light2",
  title: { text: "Bass Frequencies (0-200 Hz)" },
  axisX: { title: "Bin" },
  axisY: { title: "Amplitude", maximum: 255 },
  data: [{ type: "column", dataPoints: dataPointsBass }]
}));
This setup is repeated for the mid and treble charts, with adjusted titles.
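To avoid repeating the configuration, the three charts can also be created in a loop. The sketch below is one possible refactor, assuming container ids of bassChart, midChart, and trebleChart along with the three dataPoints arrays mentioned earlier; the ids for the mid and treble containers are placeholders, not confirmed by the original code.

const ranges = [
  { id: "bassChart", title: "Bass Frequencies (0-200 Hz)", dataPoints: dataPointsBass },
  { id: "midChart", title: "Mid Frequencies (200-2000 Hz)", dataPoints: dataPointsMid },
  { id: "trebleChart", title: "Treble Frequencies (2000+ Hz)", dataPoints: dataPointsTreble }
];

// Create one column chart per frequency range and keep a reference for later updates.
ranges.forEach(range => {
  charts.push(new CanvasJS.Chart(range.id, {
    theme: "light2",
    title: { text: range.title },
    axisX: { title: "Bin" },
    axisY: { title: "Amplitude", maximum: 255 },
    data: [{ type: "column", dataPoints: range.dataPoints }]
  }));
});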
Capturing Audio
The startAudio function uses the Web Audio API to access the user’s microphone. Once the stream is captured, an AnalyserNode is created with an fftSize of 128, which yields 64 frequency bins to work with. These bins are what we later split across the three frequency ranges: bass, mid, and treble.
async function startAudio() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();
  const source = audioContext.createMediaStreamSource(stream);
  analyser = audioContext.createAnalyser();
  analyser.fftSize = 128; // 64 bins
  source.connect(analyser);
  updateCharts();
}
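One practical note: most browsers only let an AudioContext start in response to a user gesture, and getUserMedia requires a secure (HTTPS or localhost) context, so startAudio is best triggered from a click rather than on page load. A minimal sketch of that wiring, assuming a button with the id startBtn (an element not shown in the original markup):

// Hypothetical wiring: start audio capture only after an explicit user action.
document.getElementById("startBtn").addEventListener("click", () => {
  startAudio().catch(err => console.error("Microphone access failed:", err));
});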
Updating the Charts
The updateCharts function is where the magic happens. It grabs frequency data from the analyzer as a Uint8Array, splits it into bass, mid, and treble ranges (roughly 21 bins each), and pushes the values into the respective dataPoints arrays.
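The bin-splitting itself isn’t shown in the snippet below, so here is a minimal sketch of what updateCharts might look like, assuming the dataPoints arrays and the charts array from the initialization step; the decibel check discussed next sits inside the same function.

function updateCharts() {
  // Read the current frequency data: 64 values, each in the range 0-255.
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);

  // Split the 64 bins into three roughly equal groups: bass, mid, treble.
  dataPointsBass.length = 0;
  dataPointsMid.length = 0;
  dataPointsTreble.length = 0;
  data.forEach((amplitude, bin) => {
    const point = { label: String(bin), y: amplitude };
    if (bin < 21) dataPointsBass.push(point);
    else if (bin < 42) dataPointsMid.push(point);
    else dataPointsTreble.push(point);
  });

  // Peak value in this frame, used by the hearing-damage check below.
  const maxAmplitude = Math.max(...data);

  // ...decibel check (shown below) goes here...

  charts.forEach(chart => chart.render());
  requestAnimationFrame(updateCharts);
}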
To add a safety feature, we estimate a decibel (dB) level from the maximum amplitude. This is a rough heuristic rather than a calibrated sound-pressure measurement, but it is enough to flag loud input. If the value exceeds a threshold (set at 90 dB here, a conservative flag), the alert becomes visible and the charts turn red, a visual cue powered by CanvasJS’s dynamic styling options.
// maxAmplitude is the loudest bin value (0-255) in the current frame.
const dB = 20 * Math.log10(maxAmplitude / 255 * 1000) + 20;
if (dB > DB_THRESHOLD) {
  alertElement.style.display = "block";
  charts.forEach(chart => chart.options.data[0].color = "red");
} else {
  alertElement.style.display = "none";
  charts.forEach(chart => chart.options.data[0].color = null);
}
The requestAnimationFrame call keeps the loop running, ensuring the charts reflect the audio in real-time.
Closing Thoughts
The complete code is available at this CodePen URL.
And that’s a wrap! With a few lines of code, we’ve built a live frequency analyzer that not only looks cool but also watches out for your ears.