I'm reading this article. My goal is to play two sounds at the same time, with one sound at a different volume. Using regular "audio" tags is not a solution because…
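A minimal sketch of one way to do this with the Web Audio API, assuming the two files (the names here are placeholders) are fetched and decoded into AudioBuffers; giving each source its own GainNode is what allows independent volumes:

```js
const ctx = new AudioContext();

// Fetch and decode one file into an AudioBuffer.
async function loadBuffer(url) {
  const response = await fetch(url);
  return ctx.decodeAudioData(await response.arrayBuffer());
}

// Play a buffer through its own GainNode so each sound has its own volume.
function play(buffer, volume) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  const gain = ctx.createGain();
  gain.gain.value = volume;
  source.connect(gain).connect(ctx.destination);
  source.start();
}

async function playBoth() {
  const [a, b] = await Promise.all([loadBuffer('a.mp3'), loadBuffer('b.mp3')]);
  play(a, 1.0); // full volume
  play(b, 0.3); // starts at the same moment, lower volume
}
```

Both start() calls land on the same audio clock, so the two sounds begin together; only the gain values differ.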
I'm working with the Web Audio API for my JavaScript project, and I've run into an issue that I can't seem to find the answer to anywhere. I've added event listeners…
I'm so close to getting audio chat working via WebSockets. The idea of the application I'm building is to have group voice chat working in the browser. I'm using…
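One common pattern for the sending side (a sketch, assuming a placeholder socket endpoint and a server that simply broadcasts what it receives) is to capture the microphone with getUserMedia and push MediaRecorder chunks over the socket:

```js
// Inside an async function: capture the mic and stream encoded chunks over the socket.
const ws = new WebSocket('wss://example.com/voice'); // placeholder endpoint
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

recorder.ondataavailable = (e) => {
  if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
    ws.send(e.data); // Blob chunks go out as binary WebSocket frames
  }
};
recorder.start(250); // emit a chunk every 250 ms
```

Playing the received chunks back smoothly (for example through MediaSource) is the harder half of the problem, which is why many group-voice projects end up on WebRTC rather than raw WebSockets.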
I want to make a recording where I get multiple audio tracks from different MediaStream objects (some of them remote), use the getAudioTracks() method, and add…
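A sketch of one way to mix several streams into a single recording, where localStream and remoteStreams stand in for the MediaStream objects already at hand: route each one into a shared MediaStreamAudioDestinationNode and record its output stream.

```js
const ctx = new AudioContext();
const mixDestination = ctx.createMediaStreamDestination();

// localStream and remoteStreams are placeholders for the streams you already have.
for (const stream of [localStream, ...remoteStreams]) {
  if (stream.getAudioTracks().length > 0) {
    // Each stream gets its own source node; they are summed at the destination.
    ctx.createMediaStreamSource(stream).connect(mixDestination);
  }
}

const recorder = new MediaRecorder(mixDestination.stream);
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  const mixed = new Blob(chunks, { type: recorder.mimeType });
  // e.g. URL.createObjectURL(mixed) to play or download the mix
};
recorder.start();
```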
I'm loading audio files with webpack's asset/inline rule ({ test: /\.(wav)$/i, type: 'asset/inline' }) and importing them with import someWAV from './wav/some.wav'. It's all working fine; the file…
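With asset/inline the import resolves to a data: URI string rather than a file path. Assuming the goal is to get that inlined WAV into the Web Audio API, one sketch is to fetch the URI (fetch accepts data: URIs) and decode it:

```js
import someWAV from './wav/some.wav'; // a data: URI thanks to asset/inline

const ctx = new AudioContext();

async function playInlinedWav() {
  const response = await fetch(someWAV); // data: URIs are fetchable
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();
}
```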
The gain controller works dynamically in Chrome, but in Firefox it does not: I have to re-play the audio file for the browser to recognize the new value of the range…
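A sketch of the usual pattern for a live volume slider, assuming a gainNode and audioCtx already wired into the graph: update the GainNode's AudioParam on every input event rather than reading the slider only when playback starts. The input event fires continuously while dragging in both Chrome and Firefox, whereas change may wait until the thumb is released.

```js
// Assumes: const audioCtx = new AudioContext(); and a gainNode already
// connected between the source and audioCtx.destination.
const slider = document.querySelector('#volume'); // <input type="range" min="0" max="1" step="0.01">

slider.addEventListener('input', (e) => {
  const value = parseFloat(e.target.value);
  // A short ramp to the new value avoids clicks; setValueAtTime also works.
  gainNode.gain.setTargetAtTime(value, audioCtx.currentTime, 0.01);
});
```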
I have an AudioWorkletNode that is a member of a class instance. When I delete/kill/remove that instance, the AudioWorkletNode and MessagePort leak. Before deleting…
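A sketch of a teardown pattern that tends to avoid the leak; the processor name, the stop message, and the class shape are placeholders, and the matching AudioWorkletProcessor is expected to return false from process() once told to stop so the browser can collect it:

```js
class Voice {
  constructor(ctx) {
    // Assumes ctx.audioWorklet.addModule(...) has already registered 'my-processor'.
    this.node = new AudioWorkletNode(ctx, 'my-processor');
    this.onMessage = (e) => { /* handle messages from the processor */ };
    this.node.port.addEventListener('message', this.onMessage);
    this.node.port.start();
    this.node.connect(ctx.destination);
  }

  dispose() {
    this.node.port.postMessage({ type: 'stop' });             // let the processor wind down
    this.node.port.removeEventListener('message', this.onMessage);
    this.node.port.close();                                    // release the MessagePort
    this.node.disconnect();                                    // detach from the graph
    this.node = null;                                          // drop the last reference
  }
}
```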
I'm currently working on a little project where I render a cube map with WebGL and then add some sounds with the Web Audio API. Since the project is…
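For attaching sounds to a WebGL scene, the usual building blocks are a PannerNode per positioned sound plus the context's AudioListener kept in sync with the camera; a sketch, where the coordinates, the camera object, and sourceNode are placeholders:

```js
const ctx = new AudioContext();

// One panner per positioned sound; coordinates are placeholders.
const panner = new PannerNode(ctx, {
  panningModel: 'HRTF',
  distanceModel: 'inverse',
  positionX: 2, positionY: 0, positionZ: -5,
});

// sourceNode is whatever produces the audio (buffer source, media element source, ...).
sourceNode.connect(panner).connect(ctx.destination);

// Call once per frame to mirror the WebGL camera into the audio listener.
// setPosition/setOrientation are the older calls but work across browsers.
function syncListener(camera) {
  ctx.listener.setPosition(camera.x, camera.y, camera.z);
  ctx.listener.setOrientation(
    camera.forward[0], camera.forward[1], camera.forward[2],
    0, 1, 0 // up vector
  );
}
```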
I want to record audio in JavaScript and convert it to a buffer on the fly (not saving it to a file and then converting it to a buffer). How can I do this?
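A sketch of one way to get buffers while recording, assuming encoded chunks are acceptable: MediaRecorder with a timeslice hands over Blobs as it goes, and Blob.arrayBuffer() turns each one into an ArrayBuffer without ever touching a file. (For raw PCM samples instead, an AudioWorkletProcessor, or the older ScriptProcessorNode, can deliver Float32Arrays directly.)

```js
// Inside an async function: capture the mic and collect buffers as chunks arrive.
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(stream);

recorder.ondataavailable = async (e) => {
  // Each chunk is an encoded Blob (e.g. webm/opus); nothing is written to disk.
  const buffer = await e.data.arrayBuffer();
  console.log('received', buffer.byteLength, 'bytes');
};

recorder.start(500); // deliver a chunk roughly every 500 ms
```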
I'm trying to create an audio visualization for a podcast network using the Web Audio API with createMediaElementSource(), very similar to the model explained…
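The standard shape of that model is an AnalyserNode sitting between the media-element source and the destination, with a canvas drawing loop reading the analyser each frame; a sketch, assuming an audio element and a canvas already in the page:

```js
const audioEl = document.querySelector('audio');
const ctx = new AudioContext();
const source = ctx.createMediaElementSource(audioEl);
const analyser = ctx.createAnalyser();
analyser.fftSize = 256;

source.connect(analyser);
analyser.connect(ctx.destination); // keep the audio audible

const data = new Uint8Array(analyser.frequencyBinCount);
const canvas = document.querySelector('canvas');
const g = canvas.getContext('2d');

(function draw() {
  requestAnimationFrame(draw);
  analyser.getByteFrequencyData(data);
  g.clearRect(0, 0, canvas.width, canvas.height);
  const barWidth = canvas.width / data.length;
  data.forEach((v, i) => {
    const h = (v / 255) * canvas.height;
    g.fillRect(i * barWidth, canvas.height - h, barWidth - 1, h);
  });
})();
```

One gotcha worth knowing for podcast hosting: if the audio is served from another origin without CORS headers, createMediaElementSource() outputs silence, so the crossorigin attribute and server-side CORS both need to be in place.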
I have been using the Web Audio API and have created a context and populated a source buffer with data. It plays fine over the default output device, but I don't…
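If the cut-off question is about playing through something other than the default device, one workaround is to end the graph in a MediaStreamAudioDestinationNode, feed that stream to an audio element, and pick the device with setSinkId(); a sketch, where myDecodedBuffer and the device choice are placeholders (newer Chromium builds also allow setting a sinkId on the AudioContext itself):

```js
// Inside an async function.
const ctx = new AudioContext();
const source = ctx.createBufferSource();
source.buffer = myDecodedBuffer; // the buffer already populated with data

// Route into a MediaStream instead of ctx.destination...
const streamOut = ctx.createMediaStreamDestination();
source.connect(streamOut);

// ...and play that stream through an element whose output device can be chosen.
const player = new Audio();
player.srcObject = streamOut.stream;

const devices = await navigator.mediaDevices.enumerateDevices();
const output = devices.find((d) => d.kind === 'audiooutput' && d.deviceId !== 'default');
if (output) await player.setSinkId(output.deviceId);

await player.play();
source.start();
```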
The Web Audio API oscillator allows a script to be alerted when the oscillator stops via onended. How can the script be alerted when it starts? const ac = new AudioContext(); …
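There is no onstarted counterpart in the spec. Since start(when) is scheduled against the context's clock, a workable approximation (a sketch) is to derive the notification from that same clock:

```js
const ac = new AudioContext();
const osc = ac.createOscillator();
osc.connect(ac.destination);

const startTime = ac.currentTime + 1; // start one second from now
osc.start(startTime);
osc.stop(startTime + 2);

// No onstarted event exists, but start() is sample-accurate, so a timer
// aimed at the same offset on the main thread is a close approximation.
setTimeout(
  () => console.log('oscillator (approximately) started'),
  (startTime - ac.currentTime) * 1000
);

osc.onended = () => console.log('oscillator ended');
```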