BIDIRECTIONAL INSTRUMENT

See what
you
hear.

Your movement makes sound. The sound shapes the visuals. The visuals change how you move. A feedback loop. One substance.

Three detuned oscillators through a lowpass filter. Mouse X controls pitch. Mouse Y controls brightness. The amplitude drives everything you see.

Ryoji Ikeda understood: sound and image are the same data viewed from different angles. This is that idea, made playable.

THE FEEDBACK LOOP
YOU → SOUND

Click and drag. X position maps to frequency — four octaves from A1 to A5. Y position maps to filter cutoff — top is dark and muffled, bottom is bright and open. Three oscillators play simultaneously: sawtooth, square, and a sine a fifth above. Slight detuning creates warmth. The scale quantizer snaps to pentatonic minor.
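A minimal sketch of that mapping, assuming exponential curves for both pitch and cutoff; the function names (`xToFrequency`, `yToCutoff`) are illustrative, not from the project source:

```javascript
// Minor-pentatonic semitone offsets, plus the next octave's root so the
// quantizer can snap upward as well as downward.
const PENT_MINOR = [0, 3, 5, 7, 10, 12];

// x in [0, 1]: exponential sweep over four octaves, A1 (55 Hz) to A5 (880 Hz),
// snapped to the nearest minor-pentatonic degree.
function xToFrequency(x) {
  const semitones = x * 48;                  // 4 octaves = 48 semitones
  const octave = Math.floor(semitones / 12);
  const within = semitones % 12;
  let best = PENT_MINOR[0];
  for (const step of PENT_MINOR) {
    if (Math.abs(step - within) < Math.abs(best - within)) best = step;
  }
  return 55 * Math.pow(2, (octave * 12 + best) / 12);
}

// y in [0, 1], top of the canvas = 0: exponential cutoff sweep,
// 200 Hz (dark, muffled) up to 8000 Hz (bright, open).
function yToCutoff(y) {
  return 200 * Math.pow(8000 / 200, y);
}
```

On a drag, these two values go straight to `osc.frequency` and `filter.frequency`; the quantizer is why every drag lands on a "good" note.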

SOUND → SIGHT

The analyser reads time-domain and frequency data every frame. RMS amplitude drives: ring radius, particle attraction force, grid brightness, color saturation, bar height. Frequency bins drive individual particle sizes and colors. The waveform itself shapes the ring — you see the actual wave wrapped into a circle.
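A sketch of that per-frame read, assuming loudness is plain RMS over the time-domain buffer; `rms` and `ringPoint` are illustrative names, and the 60-pixel waveform depth is invented for the example:

```javascript
// RMS amplitude of one analyser frame. getFloatTimeDomainData fills a
// Float32Array with samples in roughly [-1, 1]; this single number then
// drives radius, attraction, brightness, saturation, and bar height.
function rms(samples) {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i];
  return Math.sqrt(sum / samples.length);
}

// Wrap the waveform into a circle: sample i of n becomes a point on the
// ring, its radius offset by the sample value (60 px depth is assumed).
function ringPoint(i, n, sample, cx, cy, baseRadius) {
  const theta = (i / n) * 2 * Math.PI;
  const r = baseRadius + sample * 60;
  return [cx + r * Math.cos(theta), cy + r * Math.sin(theta)];
}
```

Tracing the ring is then one loop over the time-domain buffer: silence draws a clean circle, sound deforms it into the literal shape of the wave.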

THE ARCHITECTURE

OscillatorNode × 3 → BiquadFilterNode → GainNode → AnalyserNode → destination. Canvas reads AnalyserNode at 60 fps via getFloatTimeDomainData and getFloatFrequencyData. No samples. No libraries. Pure Web Audio API synthesis + Canvas 2D.
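That chain wires up in a few lines. A hedged sketch: `buildVoice` is an invented name, and the exact detune spread is an assumption consistent with the description above:

```javascript
// OscillatorNode ×3 → BiquadFilterNode → GainNode → AnalyserNode → destination.
function buildVoice(ctx) {
  const filter = ctx.createBiquadFilter();
  filter.type = "lowpass";
  filter.Q.value = 2;                    // matches the Q=2 in the spec

  const gain = ctx.createGain();
  gain.gain.value = 0;                   // silent until a drag begins

  const analyser = ctx.createAnalyser();
  analyser.fftSize = 1024;               // 1024-sample analysis window

  // saw + square at the fundamental, sine a fifth above (set at note-on
  // by giving it 1.5× the base frequency), all slightly detuned for warmth.
  const oscs = ["sawtooth", "square", "sine"].map((type, i) => {
    const osc = ctx.createOscillator();
    osc.type = type;
    osc.detune.value = (i - 1) * 4;      // ±4 cents spread (assumed)
    osc.connect(filter);
    return osc;
  });

  filter.connect(gain);
  gain.connect(analyser);
  analyser.connect(ctx.destination);
  return { oscs, filter, gain, analyser };
}
```

Putting the AnalyserNode last means the visuals see exactly what the speakers get: filtered, gained, final.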

WHY BIDIRECTIONAL

Most visualizers are one-way: audio in, visuals out. A screen saver. This is an instrument. The visuals aren't decoration — they are the interface. You can't separate what you see from what you hear because they come from the same data. The ring IS the waveform. The particles ARE the spectrum. Sight and sound, same substrate.

SIGNAL CHAIN
OscillatorNode ×3
BiquadFilter (LP)
GainNode
AnalyserNode
destination
FFT Size: 1024 samples
Voices: saw + square + sine (fifth)
Frequency Range: 55–880 Hz (A1–A5)
Filter: LP 200–8000 Hz, Q=2