
Streaming Audio

Feb 25, 2011 at 1:19 AM

My final goal is to get low-latency streaming audio from a device to my speakers.  The device will be spitting out byte arrays.  For now I am simulating the device by creating a sine wave and just trying to get it to play.  To get the streaming to work, I've read that I should use the BufferedWaveProvider class.  Here is my attempt below, but I can't hear anything playing.

                WaveFormat waveFormat = new WaveFormat(44100, 2);
                BufferedWaveProvider bufferedWaveProvider = new BufferedWaveProvider(waveFormat);
                
                // Create a sine wave and add it to the buffer
                bufferedWaveProvider.AddSamples(CreateAudioBuffer(2000, 250), 0, 2000);
                using (IWavePlayer waveOut = new WasapiOut(AudioClientShareMode.Exclusive, false, 25))
                {
                    waveOut.Init(bufferedWaveProvider);
                    waveOut.Play();
                }

CreateAudioBuffer() just creates a sine wave, and that code seems to work fine so I won't post it here. When I use a RawSourceWaveStream instead of a BufferedWaveProvider I can hear the sound. Example below.

                using (Stream ms = new MemoryStream(2000))
                {
                    ms.Write(CreateAudioBuffer(2000, 250), 0, 2000);
                    ms.Position = 0;
                    WaveFormat waveFormat = new WaveFormat(44100, 2);
                    using (WaveStream waveStream = new RawSourceWaveStream(ms, waveFormat))
                    {
                        using (IWavePlayer waveOut = new WasapiOut(AudioClientShareMode.Exclusive, false, 25))
                        {
                            waveOut.Init(waveStream);
                            waveOut.Play();
                            while (waveOut.PlaybackState == PlaybackState.Playing)
                            {
                                System.Threading.Thread.Sleep(300);
                            }
                        }
                    }
                }
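
For reference, a hypothetical sketch of what a CreateAudioBuffer helper along these lines could look like for 44.1 kHz, 16-bit, stereo PCM (the actual helper isn't shown here, so the parameter names and the 250 Hz default frequency are assumptions):

                // Hypothetical sketch only: fills 'byteCount' bytes with a sine wave
                // encoded as 16-bit, stereo, interleaved PCM at 44.1 kHz.
                static byte[] CreateAudioBuffer(int byteCount, double frequency = 250)
                {
                    const int sampleRate = 44100;
                    byte[] buffer = new byte[byteCount];
                    int frames = byteCount / 4; // 2 channels * 2 bytes per sample per frame
                    for (int n = 0; n < frames; n++)
                    {
                        short sample = (short)(Math.Sin(2 * Math.PI * frequency * n / sampleRate) * short.MaxValue);
                        byte lo = (byte)(sample & 0xFF);
                        byte hi = (byte)((sample >> 8) & 0xFF);
                        buffer[4 * n] = lo;      // left channel
                        buffer[4 * n + 1] = hi;
                        buffer[4 * n + 2] = lo;  // right channel
                        buffer[4 * n + 3] = hi;
                    }
                    return buffer;
                }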
Can anyone see why the top example isn't working? I've tried adding a while loop and a sleep call but that doesn't help any. Am I even on the right track using BufferedWaveProvider to get streaming audio?

Coordinator
Feb 25, 2011 at 8:55 AM

Looks like you are playing barely 10 ms of audio. Try filling a larger buffer for testing purposes.
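
For reference, a quick back-of-the-envelope check of that estimate, assuming the 44.1 kHz, 16-bit, stereo format used in the code above:

                // 44,100 frames/s * 2 channels * 2 bytes per sample = 176,400 bytes per second,
                // so the 2000-byte buffer above holds only about 11 ms of audio.
                int bytesPerSecond = 44100 * 2 * 2;                       // 176,400
                double milliseconds = 2000.0 / bytesPerSecond * 1000.0;   // ~11.3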

Mark

Feb 25, 2011 at 9:37 PM

Thanks Mark, I'm getting sound now, but not as much as I would expect.  Here is the new code.  It plays about 1 second of sound, but only because I'm looping 100 times.

                WaveFormat waveFormat = new WaveFormat(44100, 16, 2);
                BufferedWaveProvider bufferedWaveProvider = new BufferedWaveProvider(waveFormat);

                using (IWavePlayer waveOut = new WasapiOut(AudioClientShareMode.Exclusive, false, 2))
                {
                    waveOut.Init(bufferedWaveProvider);
                    for (int i = 0; i < 100; i++)
                    {
                        bufferedWaveProvider.AddSamples(CreateAudioBuffer(176400), 0, 176400);
                        
                        if (waveOut.PlaybackState != PlaybackState.Playing)
                            waveOut.Play();
                    }
                }

CreateAudioBuffer(176400) returns a byte array with 176400 elements, or in other words 176400 bytes.

Let's say I want to play 1 second of CD-quality sound.  How many bytes would I have to create?  I would assume that I need 176,400 bytes to do this.  My reasoning is that CD-quality sound plays 44,100 16-bit stereo samples a second, so take 44,100 and multiply by 2 for stereo and by 2 for 16 bit, and that is how I get 176,400:  176,400 = 44,100 * 2 * 2.  When using a RawSourceWaveStream I get about a second of sound when playing 176,400 bytes, which is what I would expect.  But when I use BufferedWaveProvider, I only get about 1/100 of a second of sound.  Any ideas why?  Thanks again for the help!

Coordinator
Feb 28, 2011 at 1:31 PM

Is this a console application?

If so, you will need to ensure the thread playing the audio is kept alive until it finishes playing.
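
A minimal sketch of one way to do that, assuming the bufferedWaveProvider/waveOut setup from the code above (the BufferedBytes check and the sleep interval are illustrative assumptions):

                // Illustrative sketch: block the calling thread until the buffered audio has drained,
                // then stop playback. Assumes waveOut.Init(bufferedWaveProvider) has already been called.
                waveOut.Play();
                while (bufferedWaveProvider.BufferedBytes > 0)
                {
                    System.Threading.Thread.Sleep(100);
                }
                waveOut.Stop();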

Mark

Mar 1, 2011 at 4:49 PM

For some reason, the latest release of NAudio doesn't have BufferedWaveProvider, so I can't try the above code. Any idea where it went or what replaced it?

Mar 1, 2011 at 5:00 PM

Mark

It is not a console app.  I just added a button to your demo app, so it's a Windows Forms app.

Another question I have: have you ever measured the latency of the WASAPI wrapper?  Using the BufferedWaveProvider class, and after a few changes, I am able to get it down to 38 ms, but I was hoping for 10 ms or so.  I'm measuring the latency by connecting the output of a signal generator to both the PC and a scope, and then having the audio output of the PC go to the scope as well.  In other words, I'm confident in my measured latency.

Mar 1, 2011 at 5:02 PM

David

If you go here http://naudio.codeplex.com/SourceControl/list/changesets and hit the download link on the right, you'll find the class under NAudio/Wave/WaveProviders/BufferedWaveProvider.cs