
A Different Introduction to the Web Audio API


The structure of the Web Audio API can be discussed using the same language as the modular synthesizer, an instrument developed in the 1960s. Generally speaking, a signal path is constructed from components such as sound sources (oscillators), filters, envelope generators, sequencers, mixers, amplifiers, attenuators, and other processors. Each component offers multiple inputs and outputs, allowing the user to route them in countless ways to create music, soundscapes, or any other imaginative piece they can devise.

The Web Audio API calls these components AudioNodes, which are connected to one another within an AudioContext. In this example, a square-wave oscillator is connected to a low-pass filter, then into an amplifier to control volume, and finally to the destination node (the speakers).
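This signal path can be sketched with the Web Audio API itself. The node types below (`OscillatorNode`, `BiquadFilterNode`, `GainNode`) are real API names, but the specific frequency and gain values are illustrative choices, not taken from the article:

```javascript
// The AudioContext hosts the whole signal path.
const audioCtx = new AudioContext();

// Sound source: a square-wave oscillator (220 Hz is an illustrative pitch).
const osc = new OscillatorNode(audioCtx, { type: "square", frequency: 220 });

// Low-pass filter to tame the square wave's upper harmonics.
const filter = new BiquadFilterNode(audioCtx, { type: "lowpass", frequency: 800 });

// Amplifier: a gain node controlling the overall volume.
const amp = new GainNode(audioCtx, { gain: 0.5 });

// Route the patch: oscillator -> filter -> amplifier -> speakers.
// connect() returns its argument, so the calls chain left to right.
osc.connect(filter).connect(amp).connect(audioCtx.destination);

osc.start();
```

Because this code relies on the browser's `AudioContext`, it runs in a web page rather than in Node; in practice it would also be started from a user gesture, since browsers suspend audio contexts created before user interaction.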

The figure above illustrates how such a patch is notated on a modular synthesizer. The brand or type of synthesizer here is irrelevant, since oscillators, filters, and amplifiers share the same basic functionality. Here is a patch diagram illustrating the effects of different settings on each module:
