Can real-time multi-channel audio convolution be performed in a C# application?


Specifically, I'm looking to perform a two-channel convolution operation on an audio file at playback, i.e. to add a reverb effect to the file using an impulse response before it is sent to the sound card for playing.

There is a distinct lack of examples or references to performing this operation in real-time in a C# application.

The NAudio (and maybe CSCore) libraries look most promising, but the absence of a built-in convolution engine seems odd. Is this because there isn't enough demand for it, or is it more likely that a managed application is not suited to such operations?

Hence the question as posted: can real-time multi-channel audio convolution be performed in a C# application?

Answer from Ian Mercer:

Yes, no problem, you can do this. C# applications aren't great for audio playback, however, because of possible garbage-collection pauses. You'll need a fairly large buffer on the sound card, and that introduces lag into your signal. From a file that's no issue; from a microphone in real time it's not so great. I have a system, written entirely in C#, that plays back three music streams simultaneously on three separate sound cards and mixes in additional speech files with ducking. I also run the playback threads at higher priority.
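
To illustrate the buffering and priority points, here is a minimal playback sketch assuming NAudio as the library (the file name and buffer settings are placeholders to tune for your setup, not prescribed values):

```csharp
using System.Diagnostics;
using System.Threading;
using NAudio.Wave;

class Program
{
    static void Main()
    {
        // "music.wav" is a placeholder for whatever file you are playing back.
        using var reader = new AudioFileReader("music.wav");

        using var output = new WaveOutEvent
        {
            // A generous buffer absorbs garbage-collection pauses, at the cost of added latency.
            DesiredLatency = 300,   // milliseconds
            NumberOfBuffers = 4
        };

        // Raising the process priority reduces the chance of the playback path being starved.
        Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.AboveNormal;

        output.Init(reader);
        output.Play();

        while (output.PlaybackState == PlaybackState.Playing)
        {
            Thread.Sleep(100);
        }
    }
}
```

In a real effect chain you would insert your convolution step between the reader and the output, typically as an ISampleProvider wrapper around the source.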

As to why there's no existing library to do this (there probably is one somewhere), convolution itself is a fairly trivial piece of code: just multiplying and adding over a 1D stream of values.
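
To make that concrete, here is a rough sketch of block-by-block direct convolution for a single channel (run one instance per channel for stereo). The class name is purely illustrative, and for a long reverb impulse response you would want FFT-based overlap-add or partitioned convolution instead, but the arithmetic is exactly this multiply-and-add:

```csharp
using System;

// Direct-form FIR convolution of a block-based audio stream with an impulse response.
// Keeps the tail of the previous block so output is continuous across block boundaries.
public sealed class StreamingConvolver
{
    private readonly float[] _ir;       // impulse response for one channel
    private readonly float[] _history;  // last (ir.Length - 1) input samples from earlier blocks

    public StreamingConvolver(float[] impulseResponse)
    {
        _ir = impulseResponse;
        _history = new float[Math.Max(impulseResponse.Length - 1, 0)];
    }

    public void Process(float[] input, float[] output, int count)
    {
        for (int n = 0; n < count; n++)
        {
            float acc = 0f;
            for (int k = 0; k < _ir.Length; k++)
            {
                int idx = n - k;
                // Read from the current block, or from saved history for samples before it.
                float sample = idx >= 0 ? input[idx] : _history[_history.Length + idx];
                acc += _ir[k] * sample;
            }
            output[n] = acc;
        }

        // Remember the most recent (ir.Length - 1) input samples for the next block.
        if (count >= _history.Length)
        {
            Array.Copy(input, count - _history.Length, _history, 0, _history.Length);
        }
        else
        {
            Array.Copy(_history, count, _history, 0, _history.Length - count);
            Array.Copy(input, 0, _history, _history.Length - count, count);
        }
    }
}
```

The cost per sample grows with the impulse response length, which is why long reverb tails are usually handled with FFT-based convolution; for short impulse responses this direct form runs comfortably in real time even in managed code.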