
Multiple WaveOut Objects

Aug 19, 2012 at 5:31 PM
Edited Aug 19, 2012 at 5:33 PM

I have a timeline that allows the end user to add sound objects (tracks) to it. The sound objects can be placed / moved anywhere on the timeline. When playing the timeline, the timeline object itself knows when to instruct the sound object to play. The sound object consists of a class that contains the NAudio objects needed to play a sound file:

Mp3FileReader / WaveFileReader

When the timeline object calls the sound object, everything works as expected, but if WaveOut.Play() gets called after WaveOut.Stop() has been called previously, I get an error saying “Buffers already queued on play”. If I replace WaveOut.Stop() with WaveOut.Pause() I don’t receive the error.

I’m also having an issue getting multiple sounds to start at different times. For example, I have a sound on track 1 that starts at 5 seconds on the timeline, and another sound on track 2 that starts at 2 seconds. The second track will play (sometimes stuttering), but the sound on track 1 will not play. If I set them to start at the same time, regardless of where they are on the timeline, both play fine.

I looked at the samples and demos, but things got a little convoluted with the custom controls getting in the way. I also looked at the source code, and nothing stood out to me as to why the Stop() functionality of the WaveOut object would cause its Play() method to crash. Nor did I see why there would be a problem with using multiple WaveOut objects in one application. Maybe I’m missing something?

Any help would be greatly appreciated.


Aug 20, 2012 at 4:35 PM

Ok, I've figured out how to get multiple WaveOuts to play at the same time regardless of when they are started. I had to convert them to the same WaveFormat. I'm using WaveFormatConversionStream and I have a few questions about this object. Here is the code I'm using:

    // Decode the MP3 and convert it to a common format (44.1 kHz, mono)
    mp3Reader = new Mp3FileReader(fileName);
    WaveFormatConversionStream waveFormatConvStream =
        new WaveFormatConversionStream(new WaveFormat(44100, 1), mp3Reader);
    inputStream = new WaveChannel32(waveFormatConvStream);

    // Window-handle callback; route output to the chosen device
    waveOut = new WaveOut(frmHndl);
    waveOut.DeviceNumber = deviceNo;
    waveOut.PlaybackStopped += new EventHandler(waveOut_PlaybackStopped);
    waveOut.Volume = 1.0f;
    inputStream.Volume = 1.0f;


Can I dispose of the WaveFormatConversionStream after I've passed it to my inputStream (WaveChannel32) or does it need to stay in memory?

When I try to specify a sample rate other than 16 I get an "AcmNotPossible calling acmStreamOpen" error. Is this dependent on the input file's sample rate?

My application is not an audio app, there will never be more than 3 - 4 sounds loaded at one time but the application is memory intensive so I need to use as little memory with these audio files as possible. I've converted them into a single channel (mono) for this purpose.

Will the WaveFormatConversionStream with the format I've specified work on all files thrown at it, or do I need to test every sample-rate / bit-depth combination, in both mono and stereo?




Aug 21, 2012 at 2:15 PM

I've managed to solve all of my issues on my own. For anyone who finds this post with the same issues:

I'm using NAudio Version

I was receiving the “Buffers already queued on play" error because I had a handler hooked up to the waveOut PlaybackStopped event. The handler pointed to a method meant to handle looping: when the file ended, I'd seek to the beginning of the file and restart playback. The problem was that the event only fires when you've called the waveOut's Stop() method. Before I realized this was happening I used Pause() instead of Stop(), which bypassed the problem but created others: data was left in the buffers when I un-paused, as well as the “Buffers already queued on play" error. Removing the event handler and using the Stop() method fixed all of these issues.
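For anyone wanting to keep a PlaybackStopped-based loop anyway, a minimal sketch follows. It assumes `reader` and `waveOut` are fields of the hosting class and `loopEnabled` is a flag you control; all three names are illustrative, and the position check is there to distinguish end-of-file from a deliberate Stop().

```csharp
// Hypothetical loop-on-stop handler (names are assumptions, not from the post).
private void waveOut_PlaybackStopped(object sender, EventArgs e)
{
    // Only restart if playback reached the end of the file; a user-initiated
    // Stop() part-way through should not retrigger playback.
    if (loopEnabled && reader.Position >= reader.Length)
    {
        reader.Position = 0;   // rewind to the start of the stream
        waveOut.Play();        // queue fresh buffers and restart playback
    }
}
```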

When playing multiple streams at the same time, you'll need to convert them all to the same format. I used the WaveFormatConversionStream, which cannot be disposed until you’re done with all the other streams involved in playing, seeking, stopping, reading, etc.

I also had issues with the Seek method; I couldn’t get accurate positioning. The stream's CurrentTime property works better. I also use the stream's Volume property to change the volume of each track. Both of these let me change the position and volume of each stream instead of the waveOut object.
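A small sketch of that per-stream approach, with a placeholder file name; the WaveChannel32 wrapper exposes both a CurrentTime setter (inherited from WaveStream) and a per-stream Volume:

```csharp
// Illustrative only: position and volume set on the stream, not the device.
var reader = new Mp3FileReader("track1.mp3");       // placeholder file name
var channel = new WaveChannel32(reader);

channel.CurrentTime = TimeSpan.FromSeconds(5);      // seek this track to 5 s
channel.Volume = 0.5f;                              // half volume, this track only

var waveOut = new WaveOut();
waveOut.Init(channel);
waveOut.Play();
```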

Hope this helps someone else...



Aug 27, 2012 at 11:33 AM

Hi dj,

how did you build the timeline component? Do you render the waveform for each separate audio file first?



Aug 27, 2012 at 8:43 PM
percramer wrote:

Hi dj,

how did you build the timeline component? Do you render the waveform for each separate audio file first?



Per - I built a custom reusable control (the timeline object). Since my app isn't an audio app and is memory intensive for other objects, I had to limit the amount of memory used for audio and waveform data by sampling larger chunks. The control takes a list of audio peaks (ints) in its constructor and renders them in the control's Paint method. Depending on the zoom level, a calculation determines the X location of each line in a loop.

Rendering (painting) waveform data uses a lot of memory, so it's also important to note that the render loop has a check for visibility. Since the list of ints is in order, you can exit the loop once you reach a point where the line location is out of bounds of the viewable area of the control.
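A rough sketch of that paint loop for a WinForms control; `peaks`, `pixelsPerPeak`, `zoom`, and `maxPeak` are all hypothetical fields standing in for whatever the real control keeps:

```csharp
// Illustrative paint loop: skip lines left of the clip area, bail out
// early once lines fall off the right edge (the peak list is ordered).
protected override void OnPaint(PaintEventArgs e)
{
    base.OnPaint(e);
    int midY = Height / 2;
    for (int i = 0; i < peaks.Count; i++)
    {
        // X position of this peak line depends on the zoom level
        int x = (int)(i * pixelsPerPeak * zoom);
        if (x < e.ClipRectangle.Left) continue;   // not yet visible
        if (x > e.ClipRectangle.Right) break;     // past the viewable area: stop
        int h = peaks[i] * midY / maxPeak;        // scale peak to control height
        e.Graphics.DrawLine(Pens.SteelBlue, x, midY - h, x, midY + h);
    }
}
```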


Hope this helps,


Aug 27, 2013 at 11:13 PM
Edited Aug 27, 2013 at 11:30 PM
I have a similar issue. I'm running the same code in a Task for each of multiple devices. Every object is instantiated per task, of course.
private void RunSession(WaveOut waveOut, WaveStream stream, string match) {
  WSRConfig.GetInstance().logInfo("PLAYER", "[" + device + "]" + "Start MP3 Player");

  // Spin until the stream finishes or the track is removed from "played"
  while (stream.CurrentTime < stream.TotalTime && played.Contains(match)) {
    Thread.Sleep(100);  // (body was elided in the original post)
  }

  WSRConfig.GetInstance().logInfo("PLAYER", "[" + device + "]" + "End MP3 Player");
}
The code plays correctly on device 0, but hangs on device 1, and scratches on device 2 then stops. A few questions:
  • Any idea how to play devices 0 and 2 together? It seems device 2 is stopped by device 1, which is very weird.
  • Any idea how to detect that device 1 is not available?
  • Is device 0 always the default device?
My goal is to play a WaveStream to multiple WaveOuts. In my example I have multiple WaveStreams (to avoid conflicts), but it doesn't seem to work.

I also saw another answer about the Multiplexer, but it seems to be N-in / 1-out, and I'd like 1-in / N-out. Any sample code?
Aug 28, 2013 at 8:17 PM
0 means the default device. What callback model are you using? You shouldn't need to use tasks at all.
Aug 28, 2013 at 10:05 PM
Yes, after reading multiple answers it seems I should use something to dispatch audio between inputs and outputs without Tasks.
  • I have 1 or more WaveStreams (from a file, a memory stream, etc.)
  • I need to play on 1 or more WaveOut devices at the same time
What should I use for this pattern? Any sample code? (I only found multiple-in / 1-out, but not multiple in/out.)

Aug 28, 2013 at 10:51 PM
It's not something I've included in NAudio, but the basic approach would be to use a BufferedWaveProvider for each output. You'd need to create a custom IWaveProvider whose Read method would return data from its buffer if it was there, and if not, read more from the source stream and write that audio into all the buffers. So each output device could be at a slightly different point in time.

I'd like to do a code sample, but it would take me a bit of time to write, so can't do it now.
To be honest, if playing from a file or memory stream, it would be a lot easier to just open two WaveFileReaders, one for each output, and start each playing at the same time.
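The buffered approach described above could be sketched roughly as follows. This is not part of NAudio; class and member names are invented, and the whole thing assumes a single thread driving playback (BufferedWaveProvider is the only NAudio type used):

```csharp
// Hypothetical 1-in / N-out splitter: each output wraps a BufferedWaveProvider;
// whichever output runs short pulls from the shared source and the data is
// written into ALL buffers, so every device eventually plays the same audio.
class SplitterOutput : IWaveProvider
{
    private readonly Splitter splitter;
    internal readonly BufferedWaveProvider Buffer;

    internal SplitterOutput(Splitter splitter)
    {
        this.splitter = splitter;
        Buffer = new BufferedWaveProvider(splitter.Source.WaveFormat);
    }

    public WaveFormat WaveFormat { get { return Buffer.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        // Top up from the source if this output's buffer is running low
        if (Buffer.BufferedBytes < count)
            splitter.FillBuffers(count - Buffer.BufferedBytes);
        return Buffer.Read(buffer, offset, count);
    }
}

class Splitter
{
    internal readonly IWaveProvider Source;
    private readonly List<SplitterOutput> outputs = new List<SplitterOutput>();

    public Splitter(IWaveProvider source) { Source = source; }

    public IWaveProvider AddOutput()
    {
        var output = new SplitterOutput(this);
        outputs.Add(output);
        return output;   // pass this to waveOut.Init() for one device
    }

    internal void FillBuffers(int bytesNeeded)
    {
        var temp = new byte[bytesNeeded];
        int read = Source.Read(temp, 0, temp.Length);
        foreach (var output in outputs)
            output.Buffer.AddSamples(temp, 0, read);  // same data to every output
    }
}
```

Each device then gets its own WaveOut initialized with one `AddOutput()` result, which is why the outputs can drift slightly apart in time.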
Aug 28, 2013 at 11:29 PM
About your last comment, that is what I did at first:
  • Create WaveFileReaders
  • Create WaveOut
  • Set the device number
  • WaveOut.Init(stream);
  • WaveOut.Play();
I do it twice, but it seems the first one plays, then the second one, so I assumed the code was synchronous. I used a Task to make it asynchronous, but it does not work on both devices.
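For reference, the two-reader steps above could be sketched like this (file name and device numbers are placeholders). WaveOut.Play() returns immediately rather than blocking, so no Task should be needed:

```csharp
// One reader and one WaveOut per device; Play() queues buffers and returns.
var reader0 = new WaveFileReader("sound.wav");
var reader1 = new WaveFileReader("sound.wav");

var out0 = new WaveOut { DeviceNumber = 0 };
var out1 = new WaveOut { DeviceNumber = 1 };

out0.Init(reader0);
out1.Init(reader1);

// Both calls run back to back on the same thread, so the two devices
// start within a few milliseconds of each other.
out0.Play();
out1.Play();
```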

I'll try BufferedWaveProvider