Problems getting duration and current time

Jun 22, 2011 at 3:11 PM

I'm having some major problems trying to get the file duration and current time.  The problem comes when trying to create the reader in order to obtain the data.  Here is how I create the readers.

private IWavePlayer waveOutDevice;
private WaveStream mainOutputStream;
private WaveFileReader mainWAVReader;
private Mp3FileReader mainMP3Reader;
waveOutDevice = new WaveOut();
mainOutputStream = CreateInputStream(fileName);
if (FileType == SoundFileType.WAV)
{
    try
    {
        mainWAVReader = new WaveFileReader(mainOutputStream);
    }
    catch (Exception ex)
    {
        ErrorLoadingSound(ex.Message); // User-defined event
        return false;
    }
}
else if (FileType == SoundFileType.MP3)
{
    try
    {
        mainMP3Reader = new Mp3FileReader(mainOutputStream);
    }
    catch (Exception ex)
    {
        ErrorLoadingSound(ex.Message);  // User-defined event
        return false;
    }
}
The wave and mp3 files I'm using for this test play FINE if I comment out the creation of the WaveFileReader and the Mp3FileReader.  Here is what happens on the statement to create the respective file readers.

wav file: When it tries to create a new WaveFileReader, it throws an exception: "Not a WAVE file - no RIFF header."

mp3 file: When it tries to create a new Mp3FileReader, it hangs for a long time and I get a message in the Output window: "A first chance exception of type 'System.FormatException' occurred in NAudio.dll".  Eventually, if I wait long enough, I get a popup box titled "ContextSwitchDeadlock was detected" with the following message:

"The CLR has been unable to transition from COM context 0x2b8c10 to COM context 0x2b8d80 for 60 seconds. The thread that owns the destination context/apartment is most likely either doing a non pumping wait or processing a very long running operation without pumping Windows messages. This situation generally has a negative performance impact and may even lead to the application becoming non responsive or memory usage accumulating continually over time. To avoid this problem, all single threaded apartment (STA) threads should use pumping wait primitives (such as CoWaitForMultipleHandles) and routinely pump messages during long running operations."

I'm sorry, but if the file plays fine using NAudio, shouldn't I also be able to get the duration and current time?  Or am I doing something wrong here?

It never actually GETS to the functions to get the current time and the duration, because the errors occur when trying to create the reader, but here is the code I have that should get me this data.

public TimeSpan GetCurrentTime()
{
    TimeSpan currentTime = new TimeSpan(0);
    if (IsLoaded() && IsPlaying())
    {
        if (FileType == SoundFileType.WAV && mainWAVReader != null)
            currentTime = mainWAVReader.CurrentTime;
        else if (FileType == SoundFileType.MP3 && mainMP3Reader != null)
            currentTime = mainMP3Reader.CurrentTime;
    }
    return currentTime;
}

public TimeSpan GetDuration()
{
    TimeSpan duration = new TimeSpan(0);
    if (IsLoaded())
    {
        if (FileType == SoundFileType.WAV && mainWAVReader != null)
            duration = mainWAVReader.TotalTime;
        else if (FileType == SoundFileType.MP3 && mainMP3Reader != null)
            duration = mainMP3Reader.TotalTime;
    }
    return duration;
}

So, am I doing something wrong here?


Coordinator
Jun 22, 2011 at 9:46 PM

What does CreateInputStream do?

Jun 22, 2011 at 10:49 PM
Edited Jun 22, 2011 at 10:57 PM

That was pulled from one of the tutorials on your site, and it works quite well to play the wav and mp3 files I've been using so far.  The code is as follows:


private WaveStream CreateInputStream(string fileName)
{
    WaveChannel32 inputStream = null;
    try
    {
        FileType = SoundFileType.None;
        if (fileName.EndsWith(".wav"))
        {
            FileType = SoundFileType.WAV;
            WaveStream readerStream = new WaveFileReader(fileName);
            if (readerStream.WaveFormat.Encoding != WaveFormatEncoding.Pcm)
            {
                readerStream = WaveFormatConversionStream.CreatePcmStream(readerStream);
                readerStream = new BlockAlignReductionStream(readerStream);
            }

            if (readerStream.WaveFormat.BitsPerSample != 16)
            {
                var format = new WaveFormat(readerStream.WaveFormat.SampleRate, 16, readerStream.WaveFormat.Channels);
                readerStream = new WaveFormatConversionStream(format, readerStream);
            }
            inputStream = new WaveChannel32(readerStream);
        }
        else if (fileName.EndsWith(".mp3"))
        {
            FileType = SoundFileType.MP3;
            WaveStream mp3Reader = new Mp3FileReader(fileName);
            inputStream = new WaveChannel32(mp3Reader);
        }
        else
        {
            throw new InvalidOperationException("Unsupported extension");
        }
    }
    catch (Exception ex)
    {
        //MessageBox.Show(String.Format("{0}", driverCreateException.Message));
        LastErrorMessage = ex.Message;  // Class variable that holds the last error generated by the code
        //return;
    }

    return inputStream;
}

I'm open to any suggestions on how to fix this...

Coordinator
Jun 23, 2011 at 6:45 AM

Use mainOutputStream.TotalTime and mainOutputStream.CurrentTime instead. Don't wrap it in another Mp3FileReader or WaveFileReader.

Mark

Jun 23, 2011 at 6:48 PM

Yup, that worked fine.  Thanks, Mark.
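
In case it helps anyone else hitting this, the two getters reduced to roughly the following once I dropped the extra readers (same fields and helper methods as in my earlier post):

public TimeSpan GetCurrentTime()
{
    // mainOutputStream is the WaveChannel32 returned by CreateInputStream;
    // as a WaveStream it already reports its position as a TimeSpan.
    if (IsLoaded() && IsPlaying() && mainOutputStream != null)
        return mainOutputStream.CurrentTime;
    return TimeSpan.Zero;
}

public TimeSpan GetDuration()
{
    // Total length of the stream, again straight off the WaveStream.
    if (IsLoaded() && mainOutputStream != null)
        return mainOutputStream.TotalTime;
    return TimeSpan.Zero;
}

Makes sense in hindsight: CreateInputStream already returns decoded audio wrapped in a WaveChannel32, so a second WaveFileReader or Mp3FileReader has no RIFF header or MP3 frames to find, which is presumably why they failed.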

So far, the NAudio library is doing everything I wanted it to.  I'm sure it has limitations that I'll need to work around, but I haven't run into any significant ones yet.