Another streaming question

Jul 3, 2011 at 7:14 PM
Edited Jul 3, 2011 at 7:15 PM

Hi Mark,

I have a server and a client connected with a TCP connection module (not the .NET TCP socket).

The server reads a PCM A-law file from the hard disk, decodes it, and sends it to the client in pieces, 2048 bytes at a time.

(The server loops over the file, does the conversion/decoding, and sends 2048 bytes to the client every cycle.)

In the client I have an "OnDataReceive" event that is raised every time there is data on the connection. Up to that point I think everything is OK, but now I don't know how to continue.

What I did is: every time the event was raised I created a WaveStream and an IWaveProvider, created a WaveOut, and then called Init on the WaveOut with the provider, etc.

Of course I close and dispose everything that needs it.

The first time (the first data that arrived on the connection) I played that piece of the file successfully, but then I got a "Not a WAVE file - RIFF not found" (or something like that) error.

I know my approach is wrong. What should I do, and why am I getting that error?

Thanks in advance!

Zeev

Coordinator
Jul 3, 2011 at 9:52 PM

you must be using a WaveFileReader if you got RIFF not found. You don't have a whole WAV file every time that data is received on the connection. Just have one WAV file reader if it is indeed a WAV file you are getting.

Jul 4, 2011 at 4:26 AM

Thanks.

What about the WaveOut? Is it OK to Init and Dispose it every time, or just once?

Coordinator
Jul 4, 2011 at 6:51 AM

I would keep one open. Have you looked at the latest NAudioDemo source code which has an example of playing streaming MP3 from the internet? It might give you some ideas.

Mark

Jul 4, 2011 at 9:57 AM

I saw it now and it helps, but it's a pity that I can't debug it; I don't get anything from the streaming radio.

Coordinator
Jul 4, 2011 at 10:01 AM

just point it at the URL of any MP3 file

Jul 4, 2011 at 11:22 AM
Edited Jul 4, 2011 at 11:32 AM

Mark, can you please check my code and tell me what's wrong?

This method runs every time there is data on the connection.

After the method runs I call waveOut.Play();

public void Play(byte[] data)
{
    // Adding data to buffer
    if (bufferList.Count < 2048 * 16)
    {
        foreach (var b in data)
            bufferList.Add(b);
    }
    else
    {
        var stream = new RawSourceWaveStream(new MemoryStream(bufferList.ToArray()), new WaveFormat(8000, 8, 2));
        if (reader == null)
        {
            reader = new WaveFileReader(stream);
            activeStream = WaveFormatConversionStream.CreatePcmStream(reader);
        }
        else
        {
            activeStream = WaveFormatConversionStream.CreatePcmStream(stream);
        }

        activeStream = new BlockAlignReductionStream(activeStream);
        inputStream = new WaveChannel32(activeStream);
        if (waveOut == null)
        {
            waveOut = new WaveOut() { DesiredLatency = 300 };
            waveOut.Init(inputStream);
        }
        bufferList.Clear();
    }
}

I can hear about 5 seconds (the first buffer in the list) and then nothing...
Coordinator
Jul 4, 2011 at 9:09 PM

you don't need to create a WaveFileReader at all. The way I recommend is the way that NAudioDemo does it - a BufferedWaveProvider feeding a single WaveOut, which is filled whenever data is received.
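
Roughly like this (an untested sketch; it assumes the incoming data is raw PCM in the format you showed, 8kHz 8-bit stereo, and OnDataReceived stands in for your connection's data event handler):

// create these once, not every time a packet arrives
bufferedWaveProvider = new BufferedWaveProvider(new WaveFormat(8000, 8, 2));
waveOut = new WaveOut() { DesiredLatency = 300 };
waveOut.Init(bufferedWaveProvider);
waveOut.Play();

// then whenever data arrives on the connection, just add it to the buffer
void OnDataReceived(byte[] data)
{
    bufferedWaveProvider.AddSamples(data, 0, data.Length);
}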

Jul 5, 2011 at 8:45 AM

I did this and all I hear is noise...

public void Play(byte[] data)
{
    if (waveOut == null)
    {
        bufferedWaveProvider = new BufferedWaveProvider(new WaveFormat(8000, 8, 2))
        {
            DiscardOnBufferOverflow = true,
            BufferDuration = TimeSpan.FromSeconds(20)
        };
        waveOut = new WaveOut() { DesiredLatency = 300 };
        waveOut.Init(bufferedWaveProvider);
    }
    bufferedWaveProvider.AddSamples(data, 0, data.Length);
    if (bufferedWaveProvider.BufferedDuration.Seconds < 4 && waveOut.PlaybackState != PlaybackState.Paused)
    {
        waveOut.Pause();
    }
    if (bufferedWaveProvider.BufferedDuration.Seconds >= 4 && waveOut.PlaybackState != PlaybackState.Playing)
    {
        waveOut.Play();
    }
}

 

Please help me...

Coordinator
Jul 5, 2011 at 9:13 AM

only create one BufferedWaveProvider and only one WaveOut. Don't recreate them every time you get a buffer. Also, what format is the audio you are receiving? Is it really stereo 8-bit PCM (because that is very odd)?

Jul 5, 2011 at 12:02 PM

I created only one BufferedWaveProvider & WaveOut...

I don't understand something. I read a WAV file (11000, 24, 1) into memory (File.OpenRead()), then copy it into a byte array and add it to the BufferedWaveProvider.

The WAV file is 53 seconds long, so why does bufferedWaveProvider.BufferedDuration show 6 seconds? When I play it I hear 6 seconds of noise.

If I play the file without the BufferedWaveProvider, reading it with a WaveFileReader and wrapping it in a WaveChannel32, it plays fine.

Coordinator
Jul 5, 2011 at 12:34 PM

the WaveFormat of the buffered wave provider must be the exact WaveFormat of the data in your byte array. The BufferedWaveProvider also is only for playing streaming audio. If you already have the whole thing, then there is no point whatsoever in using it.
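
If you really do have the whole file on disk, playing it is as simple as something like this (a sketch, assuming a standard PCM WAV file; if it is compressed, pass the reader through WaveFormatConversionStream.CreatePcmStream first):

reader = new WaveFileReader(path);
waveOut = new WaveOut() { DesiredLatency = 300 };
waveOut.Init(reader);
waveOut.Play();
// keep reader and waveOut around until playback finishes, then Dispose both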

Jul 5, 2011 at 2:06 PM

I know it's only for streaming, but I want to play a file first before I try streaming.

The format is the same, and now I can hear a few seconds and then nothing.

Let me show you the code (I hope it's the last time, I don't want to bother you again :-))

public void Play()
{
    WaveStream block = null;
    var blockBuffer = new byte[1175040];
    if (waveOut == null)
    {
        var reader = new WaveFileReader(path);
        var pcmStream = WaveFormatConversionStream.CreatePcmStream(reader);
        block = new BlockAlignReductionStream(pcmStream);
        bufferedWaveProvider = new BufferedWaveProvider(block.WaveFormat) { BufferDuration = block.TotalTime, BufferLength = blockBuffer.Length };
        block.Read(blockBuffer, 0, blockBuffer.Length);
        waveOut = new WaveOut() { DesiredLatency = 300 };
        waveOut.Init(bufferedWaveProvider);
    }
    bufferedWaveProvider.AddSamples(blockBuffer, 0, blockBuffer.Length);
    waveOut.Play();
}

All I want is to play the file; please ignore the streaming and everything else. Why do I hear just a few seconds and not the whole file?
Coordinator
Jul 5, 2011 at 2:28 PM

what type of app is this? WinForms / WPF?

Jul 5, 2011 at 2:58 PM

WPF.

Maybe bufferedWaveProvider.Read() is called just once even though there is more in the buffer?

Coordinator
Jul 5, 2011 at 3:00 PM

what does block.Read return? Is it less than blockBuffer.Length?

Jul 5, 2011 at 3:04 PM

It's the same length (1175040).

Now I see that the Read method is called more than once, with 3308 bytes each time.

Jul 5, 2011 at 3:15 PM
Edited Jul 5, 2011 at 3:25 PM

I checked the Read method and saw that it reads 3308 bytes every time. The first few times it is called it plays the audio, and after that it keeps being called but nothing plays.

One more thing: in the blockBuffer array a large number of bytes are zero... is that OK? The array is filled from the block variable (block.Read(blockBuffer, 0, blockBuffer.Length);).

After a long time I get the error message "AcmStreamHeader dispose was not called" and then the Read method returns zero.

 

I must say that the following code is almost the same and it works great:

WaveStream wave = new RawSourceWaveStream(new MemoryStream(buffer), new WaveFormat(11000, 20, 1));
var reader = new WaveFileReader(wave);
var pcmStream = WaveFormatConversionStream.CreatePcmStream(reader);
block = new BlockAlignReductionStream(pcmStream);
inputStream = new WaveChannel32(block);
waveOut = new WaveOut() { DesiredLatency = 300 };
waveOut.Init(inputStream);
waveOut.Play();

Coordinator
Jul 5, 2011 at 3:24 PM

You shouldn't be using both a RawSourceWaveStream and a WaveFileReader. It is one or the other. If your buffer has a WAV header then do this:

var reader = new WaveFileReader(new MemoryStream(buffer));
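
Putting it together with the rest of your chain, something like this (an untested sketch; it assumes buffer really holds a complete WAV file including its header):

var reader = new WaveFileReader(new MemoryStream(buffer));
var pcmStream = WaveFormatConversionStream.CreatePcmStream(reader);
block = new BlockAlignReductionStream(pcmStream);
inputStream = new WaveChannel32(block);
waveOut = new WaveOut() { DesiredLatency = 300 };
waveOut.Init(inputStream);
waveOut.Play();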


Also, are you absolutely sure your audio is 20 bit? That is extremely unusual.

Jul 5, 2011 at 3:27 PM
Edited Jul 5, 2011 at 3:30 PM

I checked the file properties and that's what I saw... 20 kbps.

P.S. I edited my previous reply.

Jul 5, 2011 at 3:48 PM
Edited Jul 6, 2011 at 1:45 PM

Maybe the file was in the wrong format... but still, how does it work with WaveChannel32?

I'll try another file.

 

Edit:

Thanks, it's working now.