I've followed this example and created a custom AudioWorkletProcessor which works as expected. What I'd like to do now is to stream MP3 audio from my server (I'm currently using Python/Flask) into it.
So, for example:

```javascript
const response = await fetch(url);
const reader = response.body.getReader();
while (true) {
  const {value, done} = await reader.read();
  if (done) break;
  // do something with value
}
```

which gives me a `Uint8Array`. How do I pass its content to the AudioWorklet instead of the current `channel[i] = Math.random() * 2 - 1;`?
Thank you :)
Firstly, MP3 is a compressed audio file format, but Web Audio API nodes only work with uncompressed sample data. You'll need to use the `decodeAudioData()` method of the `AudioContext` object to convert the bytes of the MP3 file into an `AudioBuffer` object.

Secondly, `decodeAudioData()` isn't really designed for streaming, but because you're using MP3 you're in luck. See Encoding fails when I fetch audio content partially for more information.

Thirdly, the `AudioContext` object isn't accessible from inside an `AudioWorkletProcessor`, so you'll have to call `decodeAudioData()` from the main thread and then pass the decompressed data from your `AudioWorkletNode` to your `AudioWorkletProcessor` using their respective message ports, which are accessible from the `port` property of each object.

Fourthly, `AudioBuffer` isn't one of the allowed types that can be sent through a message port using `postMessage()`. Fortunately, the `Float32Array` returned by the buffer's `getChannelData()` method is one of the supported types.

I'm not sure what your reason is for using an audio worklet. It depends on what you want to do with the MP3, but if all you want to do is play it, there are simpler solutions (for example, decoding once and playing the result through an `AudioBufferSourceNode`) that involve lower CPU usage.
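Putting those points together, the main-thread side might look something like the sketch below. The helper name `sendMp3ToWorklet`, the single-channel handling, and the injected `audioContext`/`workletNode` parameters are my own placeholders, not anything from the original answer; in a real page you'd pass in your actual `AudioContext` and `AudioWorkletNode`.

```javascript
// Hypothetical main-thread helper: fetch an MP3, decode it, and post
// the raw samples to the worklet node. The context and node are taken
// as parameters so the Web Audio objects stay outside this function.
async function sendMp3ToWorklet(url, audioContext, workletNode) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();

  // decodeAudioData() runs on the main thread side and yields an
  // AudioBuffer of uncompressed samples.
  const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);

  // An AudioBuffer can't cross the message port, but the Float32Array
  // from getChannelData() can. Listing its underlying ArrayBuffer as a
  // transferable makes the postMessage() a zero-copy move.
  const samples = audioBuffer.getChannelData(0); // channel 0 only here
  workletNode.port.postMessage(samples, [samples.buffer]);
}
```

In the worklet file, the processor would receive each `Float32Array` via `this.port.onmessage`.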
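On the processor side, the chunks arriving through the message port have to be buffered and drained inside `process()`, which only asks for 128 frames at a time. A minimal sketch of that buffering logic, with `SampleQueue` as a hypothetical name, written as a plain class so it can run outside a worklet: inside your `AudioWorkletProcessor` subclass you'd do `this.port.onmessage = (e) => this.queue.push(e.data);` and call `this.queue.pull(output[0])` from `process()`.

```javascript
// FIFO of Float32Array chunks received from the main thread.
// push() enqueues a decoded chunk; pull() fills one output buffer,
// padding with silence when no data is queued yet.
class SampleQueue {
  constructor() {
    this.chunks = []; // Float32Arrays, oldest first
    this.offset = 0;  // read position within chunks[0]
  }

  push(chunk) {
    this.chunks.push(chunk);
  }

  pull(out) {
    for (let i = 0; i < out.length; i++) {
      const head = this.chunks[0];
      if (!head) {
        out[i] = 0; // underrun: emit silence
        continue;
      }
      out[i] = head[this.offset++];
      if (this.offset >= head.length) {
        this.chunks.shift(); // finished this chunk
        this.offset = 0;
      }
    }
  }
}
```

Keeping the queue logic separate from the worklet boilerplate also makes it easy to unit-test, since `AudioWorkletProcessor` itself only exists inside the audio rendering thread.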