I create my capture, passing in the sample rate (48000) and channel count (2):
capture = new WasapiLoopbackCapture(device);
capture.WaveFormat = WaveFormat.CreateIeeeFloatWaveFormat(SampleRate, Channels);
// Hook up the DataAvailable event to process audio
capture.DataAvailable += Capture_DataAvailable;
Then, during the callback, I want to convert the buffer to floats for processing, and this is where my issue is. I am new to this stuff, and the examples online show that you have to convert the bytes back to floats:
// Convert the bytes to floats for processing
var samples = new float[e.BytesRecorded / 3];
for (int i = 0; i < e.BytesRecorded; i += 3)
{
    // Convert the 24-bit sample to float
    int sample = e.Buffer[i] | (e.Buffer[i + 1] << 8) | (e.Buffer[i + 2] << 16);
    samples[i / 3] = sample / (float)0x7FFFFF;
}
// Split the samples into left and right channels
float[] leftChannel = new float[samples.Length / 2];
float[] rightChannel = new float[samples.Length / 2];
for (int i = 0; i < samples.Length; i += 2)
{
    leftChannel[i / 2] = samples[i];
    rightChannel[i / 2] = samples[i + 1];
}
My question is: when I play audio that is panned entirely to the left ear, I would expect the left-channel array to contain values and the right-channel array to be all zeros, but currently both arrays are full of non-zero data.

What am I missing here, or am I completely off track?

See above for the code I used. I understand the device is 24-bit, 2 channels, 48000 Hz, as shown in the Windows sound properties.

I am guessing either my logic/understanding of what is needed is wrong, or it is valid for the right channel to get float values back even though the audio playing during the test is a music clip panned entirely to the left ear.
`WasapiLoopbackCapture` will always return IEEE float, not 24-bit audio, so the byte array already contains floats. Reading 3-byte groups out of what are actually 4-byte float samples scrambles both the sample values and the channel interleaving, which is why both of your arrays contain non-zero data. The `WaveBuffer` class can be used as an efficient way to access the contents of a `byte[]` as though it were a `float[]`: create a new `WaveBuffer` bound to the byte array and access it through its `FloatBuffer` property.
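A sketch of what the handler could look like with this approach. It reuses the `Capture_DataAvailable` name and the stereo (2-channel) layout from the question; `leftChannel`/`rightChannel` are illustrative names, not NAudio API:

```csharp
using NAudio.Wave;

private void Capture_DataAvailable(object sender, WaveInEventArgs e)
{
    // Reinterpret the byte[] as float[] without copying or manual bit-shifting.
    var waveBuffer = new WaveBuffer(e.Buffer);

    // IEEE float is 4 bytes per sample, not 3.
    int sampleCount = e.BytesRecorded / 4;

    // De-interleave into left/right; samples alternate L, R, L, R, ...
    var leftChannel = new float[sampleCount / 2];
    var rightChannel = new float[sampleCount / 2];
    for (int i = 0; i < sampleCount; i += 2)
    {
        leftChannel[i / 2] = waveBuffer.FloatBuffer[i];      // channel 0
        rightChannel[i / 2] = waveBuffer.FloatBuffer[i + 1]; // channel 1
    }
}
```

With the bytes read as floats at the correct 4-byte stride, the interleaving lines up with the actual sample boundaries, so audio panned hard left should now show up only in `leftChannel` (aside from any floating-point noise floor).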