GetSourceBuffer in \naudio\NAudio\Wave\WaveStreams\WaveChannel32.cs

Mar 22, 2009 at 2:37 AM

I've been working on a class to reverse a stream, but I've come across a statement whose purpose I'm not sure of, and which stops the function I've developed from working in all scenarios. In fact, it only works when the wave sample being loaded is less than a second long.

It relates to this:

byte[] sourceBuffer;

/// <summary>
/// Helper function to avoid creating a new buffer every read
/// </summary>
byte[] GetSourceBuffer(int bytesRequired)
{
    if (sourceBuffer == null || sourceBuffer.Length < bytesRequired)
    {
        sourceBuffer = new byte[Math.Min(sourceStream.WaveFormat.AverageBytesPerSecond, bytesRequired)];
    }
    return sourceBuffer;
}

The part I'm not sure about is this: under what scenario would you want a buffer smaller than the number of bytes being requested?
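To illustrate the problem numerically, here is a small sketch of the capping logic in isolation (the constant is a hypothetical AverageBytesPerSecond for 44.1 kHz, 16-bit stereo; the class and method names are mine, not NAudio's):

```csharp
using System;

class BufferCapDemo
{
    // Hypothetical value of sourceStream.WaveFormat.AverageBytesPerSecond
    // for 44.1 kHz, 16-bit, stereo audio.
    public const int AverageBytesPerSecond = 176400;

    // Mirrors the sizing expression in GetSourceBuffer.
    public static int CappedBufferSize(int bytesRequired) =>
        Math.Min(AverageBytesPerSecond, bytesRequired);

    static void Main()
    {
        int bytesRequired = 5 * AverageBytesPerSecond; // a five-second read
        int allocated = CappedBufferSize(bytesRequired);

        // Stream.Read throws when count exceeds the buffer length,
        // which is exactly what happens here:
        Console.WriteLine(allocated < bytesRequired); // True
    }
}
```

So any single read of more than one second's worth of bytes ends up with a buffer that is too small for the subsequent Read call.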

The reason for asking is how I'm trying to set up the buffer that stores the result of the reversed wave. I'm not sure this is the best approach, so if anyone can think of an alternative I'm open to suggestions. After I create an instance of WaveChannel32 via a reader stream, I try to read the complete contents of the WaveChannel32 stream into a temporary byte array. However, when I invoke the channelStream Read method like this:

channelStream.Read(reversedSample, 0, (int)channelStream.Length);

An error is thrown, "Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection." when the actual Read method is called on the stream:

int read = sourceStream.Read(sourceBuffer, 0, sourceBytesRequired);

This happens because sourceBuffer is smaller than sourceBytesRequired.

However, I think this issue will always occur when the requested read amount is more than an average second's worth of bytes for playback. The other thing to note is that the size of sourceBytesRequired is decided before the Read call; I've pasted the read line and the two preceding lines from the Read method, which calculate how large sourceBytesRequired should be.

int sourceBytesRequired = (numBytes - bytesWritten) / 2;
byte[] sourceBuffer = GetSourceBuffer(sourceBytesRequired);
int read = sourceStream.Read(sourceBuffer, 0, sourceBytesRequired);
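One caller-side workaround (my own sketch, not from NAudio) would be to read the stream in chunks no larger than one second's worth of bytes, so the capped helper buffer is never asked for more than it can allocate. Here it is against a plain Stream for illustration; with WaveChannel32 you would pass channelStream, (int)channelStream.Length, and sourceStream.WaveFormat.AverageBytesPerSecond:

```csharp
using System;
using System.IO;

class ChunkedReadDemo
{
    // Reads length bytes from the stream in chunks of at most chunkSize,
    // so no single Read call requests more than chunkSize bytes.
    public static byte[] ReadAll(Stream stream, int length, int chunkSize)
    {
        byte[] buffer = new byte[length];
        int offset = 0;
        while (offset < length)
        {
            int toRead = Math.Min(chunkSize, length - offset);
            int read = stream.Read(buffer, offset, toRead);
            if (read == 0) break; // end of stream
            offset += read;
        }
        return buffer;
    }

    static void Main()
    {
        var data = new byte[10];
        for (int i = 0; i < data.Length; i++) data[i] = (byte)i;

        var ms = new MemoryStream(data);
        byte[] result = ReadAll(ms, data.Length, 3); // 3-byte chunks
        Console.WriteLine(result[9]); // 9
    }
}
```

That avoids the exception without touching the library, though it still means buffering the whole stream in memory for the reversal.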

I hope this makes sense and that I haven't missed anything in this analysis. I checked for all references to this method in the NAudio project and could only find two, both from the Read method: one for mono and one for stereo.

If I have got this right, could I request that you replace this line in WaveChannel32:
sourceBuffer = new byte[Math.Min(sourceStream.WaveFormat.AverageBytesPerSecond,bytesRequired)];

with this line:
sourceBuffer = new byte[bytesRequired];
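For clarity, here is the helper as it would read with that change applied, wrapped in a minimal standalone class (the wrapper class and demo are mine; only the method body reflects the proposal). The buffer is still reused across reads; it just grows to whatever size is actually requested:

```csharp
using System;

class GrowableBuffer
{
    byte[] sourceBuffer;

    // Proposed version: grow to at least bytesRequired,
    // while still avoiding a new allocation on every read.
    public byte[] GetSourceBuffer(int bytesRequired)
    {
        if (sourceBuffer == null || sourceBuffer.Length < bytesRequired)
        {
            sourceBuffer = new byte[bytesRequired];
        }
        return sourceBuffer;
    }
}

class Demo
{
    static void Main()
    {
        var b = new GrowableBuffer();
        byte[] big = b.GetSourceBuffer(1000);
        byte[] small = b.GetSourceBuffer(10); // reuses the larger buffer
        Console.WriteLine(ReferenceEquals(big, small)); // True
    }
}
```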


Mar 24, 2009 at 10:40 AM
Hi Sebastian,
Yes, might be a bug. I'll take a look at the code and see if I can remember why I did that. I'll fix it if necessary

Mar 25, 2009 at 12:04 PM
Hi Mark,

Much appreciated, I'll work on the next tutorial under the assumption that this is OK to change; if not I'll update the tutorial afterward ;-)