
wma arggggggg

Nov 24, 2012 at 8:59 AM

Hello,

For a while now I have been trying to play WMA files (and other formats of the same kind) through the ASIO or Core Audio API (WASAPI) drivers, and I keep hitting the same error.

I even wrote a MediaFoundation routine (similar to the one published in the new version of NAudio; a lot of work, and congratulations to the author), but the problem is still the same.


I end up with this error:
"NAudio.MediaFoundation.IMFSourceReader ... This interface is not supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE))"


However, the same procedure works with the standard WaveOut() driver (which doesn't use a background thread).

So if anyone has found the trick to play ".wma" files with ASIO or the Core Audio API (WASAPI) drivers, I'd love to hear it.


Mark: I know I've already raised this issue (see http://naudio.codeplex.com/workitem/16370), but hey, an appeal to the community from time to time can't hurt :-)


Sincerely, manun

Nov 24, 2012 at 3:59 PM

the issue here is almost certainly threading. You're creating and initialising the COM object on one thread and accessing it from another. That's why WaveOut works, while the other models which use background threads don't. You could try marking your app as being MTAThread instead of STAThread. Alternatively, you could create a wrapper around your IWaveProvider that doesn't perform any initialisation until the first call to Read comes in, which will be on the correct thread.
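
For example, a minimal sketch of that kind of lazy wrapper (the class name and factory delegate are just an illustration, not something in NAudio, and it assumes you can supply the WaveFormat up front without touching any COM objects):

using System;
using NAudio.Wave;

// Hypothetical lazy wrapper: the real provider (and its COM objects) is only created
// on the first call to Read, i.e. on whichever thread the output driver reads from.
public class LazyWaveProvider : IWaveProvider
{
	private readonly Func<IWaveProvider> createSource;
	private readonly WaveFormat waveFormat;
	private IWaveProvider source;

	public LazyWaveProvider(Func<IWaveProvider> createSource, WaveFormat waveFormat)
	{
		this.createSource = createSource;
		// the format has to be known without opening the file, which is the awkward part
		this.waveFormat = waveFormat;
	}

	public WaveFormat WaveFormat
	{
		get { return waveFormat; }
	}

	public int Read(byte[] buffer, int offset, int count)
	{
		if (source == null)
		{
			source = createSource(); // deferred initialisation on the reading thread
		}
		return source.Read(buffer, offset, count);
	}
}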

I've only just committed the IMFSourceReader yesterday and it's by no means finished. I need to add in some locking for positioning while you are playing and I will be trying it out with various threading options and models over the coming weeks.

Mark

Nov 24, 2012 at 4:01 PM

read this thread if you are unfamiliar with MTAThread:

http://stackoverflow.com/questions/127188/could-you-explain-sta-and-mta
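
For a WinForms app the MTAThread change is just the attribute on the entry point. A minimal sketch (MainForm is a placeholder for your own form; note that the clipboard, drag and drop and some common dialogs expect an STA thread, so treat this as an experiment rather than a recommendation):

using System;
using System.Windows.Forms;

static class Program
{
	// The WinForms template generates [STAThread]; switching to [MTAThread] means COM
	// objects created anywhere in the app default to the multithreaded apartment.
	[MTAThread]
	static void Main()
	{
		Application.EnableVisualStyles();
		Application.Run(new MainForm()); // MainForm stands in for your main window
	}
}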

Nov 25, 2012 at 11:00 PM

I've just checked in a proof of concept for how WASAPI out could use the GUI thread exclusively, allowing you to use it with MediaFoundationReader (or the WmaFileReader) and bypass the issues of marshalling across thread boundaries. For ASIO the problem is much harder. I think you will not be able to avoid the need to create the Media Foundation COM objects on MTA Threads as the callbacks will be on threads outside of the control of NAudio.

Mark

Nov 26, 2012 at 4:36 AM

Hello,

Thank you for your interest in this particular case.

I had already searched around for this problem, and your link on STA/MTA confirmed how limited my knowledge is.


I tested your WasapiOutGuiThread and it works.


Starting from this model, I tried to change WaveOutEvent and DirectSound in the same way.

The audio plays (which is already something!). But (there is always a but) the application (WinForms or WPF) does not respond during playback.


I'm afraid I can't be of much help here (threads, STA, MTA and I are not friends).

 

Manu

Nov 26, 2012 at 7:31 AM

There is no point trying to change WaveOutEvent as you cannot wait on an event in the GUI thread without blocking everything. I don't see any benefit to modifying DirectSound for this.

There are only two options if you want to use these COM objects:

1. Do everything on the GUI thread (i.e. use WaveOut or WasapiOutGuiThread)
2. Do everything on MTAThreads. This would require you to invent your own messaging system if you were using a GUI app. I might prototype this at some point in the future to show how it would be done from a WinForms or WPF app. I'm hoping to find out if the same problems exist for Windows Store apps (the async model may help by forcing creation of the COM objects onto a background thread). A rough sketch of what that messaging could look like is below.
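
As a rough sketch of what option 2's messaging could look like (all the names here are invented for illustration; it's just the standard producer/consumer pattern with a dedicated MTA worker thread):

using System;
using System.Collections.Concurrent;
using System.Threading;

// Hypothetical "audio worker": the GUI thread never touches the COM objects directly,
// it just posts delegates to a queue that a dedicated MTA thread drains in order.
public class MtaAudioWorker : IDisposable
{
	private readonly BlockingCollection<Action> queue = new BlockingCollection<Action>();
	private readonly Thread worker;

	public MtaAudioWorker()
	{
		worker = new Thread(() =>
		{
			foreach (var action in queue.GetConsumingEnumerable())
			{
				action(); // creating readers, repositioning, stopping etc. all happen here
			}
		});
		worker.SetApartmentState(ApartmentState.MTA); // MTA is the default for new threads, but be explicit
		worker.IsBackground = true;
		worker.Start();
	}

	public void Post(Action action)
	{
		queue.Add(action);
	}

	public void Dispose()
	{
		queue.CompleteAdding();
		worker.Join();
	}
}

// Usage from a button click on the GUI thread (player.Load/Play are hypothetical):
// audioWorker.Post(() => player.Load(@"C:\music\track.wma"));
// audioWorker.Post(() => player.Play());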

hopefully option 1 is good enough for now to allow you to proceed

Mark

Nov 27, 2012 at 11:18 AM

Hello,

I have the start of a solution without any major changes.
I just initialize MediaFoundation inside the Read method, so everything runs under the same background thread.
It works with all drivers.
Well, it's not a very elegant approach, but it's a start.

Nov 27, 2012 at 11:21 AM

OK that's interesting. You could put the MF initialization into the ThreadProc on WasapiOut and maybe that will do it. ASIO would be a more difficult solution, since we don't create the threads, but you could do a one-time initialization, with the hope that the callback always comes back on the same thread.

Nov 27, 2012 at 11:33 AM

I tested that solution inside the driver, but without success.
Instead, I reworked the Read routine of the MediaFoundationReader WaveStream,
adding a flag (MFStart) so that MediaFoundation is initialized only once.
That way the MF setup is valid for all drivers (even ASIO).
I suspect your approach would be the more elegant solution.
I've just downloaded the new version to put together an example.

 

Nov 27, 2012 at 11:44 AM

I'd be interested to see the code you are using. I was thinking of having some kind of class in NAudio that ensured it was only called once, but maybe it should be called once per thread.
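
As a sketch of the "once per thread" idea (a hypothetical class, reusing the MediaFoundationInterop.MFStartup call from the code in this thread; whether MF actually needs per-thread startup or just per-process is something to verify against the Media Foundation documentation):

using System;

// Hypothetical guard: calls MFStartup at most once on each thread that needs Media Foundation.
public static class MediaFoundationThreadInit
{
	[ThreadStatic]
	private static bool started;

	public static void EnsureStartedOnThisThread()
	{
		if (!started)
		{
			MediaFoundationInterop.MFStartup(MediaFoundationInterop.MF_VERSION);
			started = true;
		}
	}
}

// At the top of Read (or any other method that runs on the playback thread):
// MediaFoundationThreadInit.EnsureStartedOnThisThread();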

Nov 27, 2012 at 11:47 AM

I think I'm having problems (position, stop, ...)

Nov 27, 2012 at 11:57 AM

Here is an example.

Only the procedures I modified:

private string _file = "";
private IMFSourceReader pReaderTh;
private bool MFStart = false;

		// Changed: the constructor now delegates to Init()
		public MediaFoundationReader(string file)
		{
			_file = file;
			pReader = Init();
			length = GetLength();
		}

		// Init MediaFoundation 
		private IMFSourceReader Init()
		{
			IMFSourceReader retReader;

			//if (!initialized)
			//{
				// once only per app - TODO, maybe move this elsewhere
				MediaFoundationInterop.MFStartup(MediaFoundationInterop.MF_VERSION);
			//	initialized = true;
			//}
			var uri = new Uri(_file);
			MediaFoundationInterop.MFCreateSourceReaderFromURL(uri.AbsoluteUri, IntPtr.Zero, out retReader);
			retReader.SetStreamSelection(MediaFoundationInterop.MF_SOURCE_READER_ALL_STREAMS, false);
			retReader.SetStreamSelection(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, true);

			/*IMFMediaType currentMediaType;
			pReader.GetCurrentMediaType(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, out currentMediaType);
			Guid currentMajorType;
			currentMediaType.GetMajorType(out currentMajorType);
			IMFMediaType nativeMediaType;
			pReader.GetNativeMediaType(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, 0, out nativeMediaType);*/

			// Create a partial media type indicating that we want uncompressed PCM audio
			IMFMediaType partialMediaType = null;
			MediaFoundationInterop.MFCreateMediaType(ref partialMediaType);
			partialMediaType.SetGUID(MediaFoundationInterop.MF_MT_MAJOR_TYPE, MediaFoundationInterop.MFMediaType_Audio);
			partialMediaType.SetGUID(MediaFoundationInterop.MF_MT_SUBTYPE, MediaFoundationInterop.MFAudioFormat_PCM);

			// set the media type
			retReader.SetCurrentMediaType(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, IntPtr.Zero, partialMediaType);
			Marshal.ReleaseComObject(partialMediaType);

			// now let's find out what we actually got
			IMFMediaType uncompressedMediaType;
			retReader.GetCurrentMediaType(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, out uncompressedMediaType);

			// Two ways to query it: the first is to ask for properties (the second is to convert into a WaveFormatEx using MFCreateWaveFormatExFromMFMediaType)
			Guid actualMajorType;
			uncompressedMediaType.GetGUID(MediaFoundationInterop.MF_MT_MAJOR_TYPE, out actualMajorType);
			Debug.Assert(actualMajorType == MediaFoundationInterop.MFMediaType_Audio);
			Guid audioSubType;
			uncompressedMediaType.GetGUID(MediaFoundationInterop.MF_MT_SUBTYPE, out audioSubType);
			Debug.Assert(audioSubType == MediaFoundationInterop.MFAudioFormat_PCM);
			int channels;
			uncompressedMediaType.GetUINT32(MediaFoundationInterop.MF_MT_AUDIO_NUM_CHANNELS, out channels);
			int bits;
			uncompressedMediaType.GetUINT32(MediaFoundationInterop.MF_MT_AUDIO_BITS_PER_SAMPLE, out bits);
			int sampleRate;
			uncompressedMediaType.GetUINT32(MediaFoundationInterop.MF_MT_AUDIO_SAMPLES_PER_SECOND, out sampleRate);

			waveFormat = new WaveFormat(sampleRate, bits, channels);

			retReader.SetStreamSelection(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, true);
			

			return retReader;
		}

		/// <summary>
		/// Reads from this wave stream
		/// </summary>
		/// <param name="buffer">Buffer to read into</param>
		/// <param name="offset">Offset in buffer</param>
		/// <param name="count">Bytes required</param>
		/// <returns>Number of bytes read; 0 indicates end of stream</returns>
		public override int Read(byte[] buffer, int offset, int count)
		{
			// Initialize Media Foundation here, on the thread that calls Read
			if (MFStart == false)
			{
				pReaderTh = Init();
				var pv = PropVariant.FromLong(position);
				pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
				MFStart = true;
			}

			int bytesWritten = 0;
			// read in any leftovers from last time
			if (decoderOutputCount > 0)
			{
				bytesWritten += ReadFromDecoderBuffer(buffer, offset, count - bytesWritten);
			}

			while (bytesWritten < count)
			{
				IMFSample pSample;
				int dwFlags;
				ulong timestamp;
				int actualStreamIndex;
				pReaderTh.ReadSample(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, 0, out actualStreamIndex, out dwFlags, out timestamp, out pSample);
				if (dwFlags != 0)
				{
					// MF_SOURCE_READERF_ENDOFSTREAM or MF_SOURCE_READERF_CURRENTMEDIATYPECHANGED:
					// we've reached the end of the stream, or the media type changed
					break;
				}

				IMFMediaBuffer pBuffer;
				pSample.ConvertToContiguousBuffer(out pBuffer);
				IntPtr pAudioData = IntPtr.Zero;
				int cbBuffer;
				int pcbMaxLength;
				pBuffer.Lock(out pAudioData, out pcbMaxLength, out cbBuffer);
				EnsureBuffer(cbBuffer);
				Marshal.Copy(pAudioData, decoderOutputBuffer, 0, cbBuffer);
				decoderOutputOffset = 0;
				decoderOutputCount = cbBuffer;

				bytesWritten += ReadFromDecoderBuffer(buffer, offset + bytesWritten, count - bytesWritten);


				pBuffer.Unlock();
				Marshal.ReleaseComObject(pBuffer);
				Marshal.ReleaseComObject(pSample);
			}
			position += bytesWritten;
			return bytesWritten;
		}

Nov 27, 2012 at 12:01 PM

Oops! To be precise, I only added and changed the following:

// Add Var
	private string _file = "";
	private IMFSourceReader pReaderTh;
	private bool MFStart = false;

// change this

        public MediaFoundationReader(string file)
        {
			_file = file;
			pReader = Init();
			length = GetLength();
        }

// Add Init procedure

	private IMFSourceReader Init()
	{
		IMFSourceReader retReader;
.... 
}


// Add in Read
public override int Read(byte[] buffer, int offset, int count)
{

	if (MFStart == false)
	{
		pReaderTh = Init();
		var pv = PropVariant.FromLong(position);
		pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
		MFStart = true;
	}

........ same 

}

Nov 27, 2012 at 12:05 PM

OK, so you are creating a completely different reader for playback from the one you created in the constructor (which was only used to get the file duration). You might run into problems with repositioning, because those calls will come in from the GUI thread.

Nov 27, 2012 at 12:09 PM

Modified for the Position property:

        /// <summary>
        /// Current position within this stream
        /// </summary>
        public override long Position
        {
            get { return position; }
            set
            {
                // should pass in a variant of type VT_I8, which is a long containing time in 100-nanosecond units
                long nsPosition = (10000000L * value) / waveFormat.AverageBytesPerSecond;
                var pv = PropVariant.FromLong(nsPosition);
                pReader.SetCurrentPosition(Guid.Empty, ref pv);
                position = value;

                // ADD THIS: also reposition the reader created on the playback thread
                if (pReaderTh != null)
                {
                    pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
                }
                // END ADD

                decoderOutputCount = 0;
                decoderOutputOffset = 0;
            }
        }

Nov 27, 2012 at 12:14 PM

there's no point keeping two instances open. You might as well dispose the one created in the constructor after you've got the length from it. Also, you will need some kind of locking, as the Position setter and Read could run at the same time, meaning that decoderOutputCount and decoderOutputOffset could be modified at the same time. I'm surprised it lets you access pReaderTh from the GUI thread, but maybe the fact it was created on an MTAThread makes it safe to call from an STAThread.

Nov 27, 2012 at 12:16 PM

Actually, I am having problems with Position.

Well, it was a good idea :-) or not :-(

Nov 27, 2012 at 12:28 PM
Edited Nov 27, 2012 at 12:29 PM

 

Position: SOLVED (for the moment)

// Modified
        public override long Position
        {
            get { return position; }
            set
            {
		
                position = value;
	        IsChangePosition = true;
			
            }
        }

// Read Changed
public override int Read(byte[] buffer, int offset, int count)
{

	if (MFStart == false)
	{
		pReaderTh = Init();
		decoderOutputCount = 0;
		decoderOutputOffset = 0;
		MFStart = true;
	}

	if (IsChangePosition)
	{
		long nsPosition = (10000000L * position) / waveFormat.AverageBytesPerSecond;
		var pv = PropVariant.FromLong(nsPosition);
		pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
		IsChangePosition = false;
				
	}

......
}

 

I deleted pReader (the one created in the constructor).

Nov 27, 2012 at 12:36 PM

yes, this is one of the best ways to do repositioning with multithreaded code. You basically just set a reposition flag (still needs to be threadsafe). The only disadvantage is that when you reposition while paused, it looks as if nothing has happened. You also must move these lines into the Read method:

                decoderOutputCount = 0;
                decoderOutputOffset = 0;

Nov 27, 2012 at 12:44 PM
Edited Nov 27, 2012 at 12:45 PM
The only disadvantage is that when you reposition while paused, it looks as if nothing has happened. You also must move these lines into the Read method:

 

??

the latest version of Read works even during Pause!

 

For information: I tested with your demo (AudioPlaybackDemo), just adding WmaInputFilePlugin.cs.

Nov 27, 2012 at 12:54 PM

The issue is that you either have to "pretend" the reposition has happened, by setting position to the supplied value, or leave position as it was and change it when the reposition is actually performed. The former is probably preferable. You really do need to move those decoder variables being set to 0 into the Read function though, or you could get corruption in the playback thread if you were repositioning while a Read was in progress.

Nov 27, 2012 at 1:13 PM
Edited Nov 27, 2012 at 1:16 PM

I'm afraid I don't quite understand your comment.

To clarify the situation:

 

 

public override int Read(byte[] buffer, int offset, int count)
{

	if (MFStart == false)
	{
		pReaderTh = Init();
		//SetPosition();
		MFStart = true;
	}

	SetPosition();

..................
}

			
// Position pReaderTh
	private void SetPosition()
	{
			
		if (IsChangePosition)
		{
			long nsPosition = (10000000L * position) / waveFormat.AverageBytesPerSecond;
			var pv = PropVariant.FromLong(nsPosition);
			pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
			IsChangePosition = false;
			decoderOutputCount = 0;
			decoderOutputOffset = 0;
		}
	}

Nov 27, 2012 at 1:51 PM

that's fine. You just need to add thread safety, as position could get changed by Read just after you call the Position setter, if a Read happened to be in progress while position was being set from the GUI thread.
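
Something along these lines would do it (just a sketch, using the field names from your earlier code):

private readonly object repositionLock = new object();

public override long Position
{
	get { return position; }
	set
	{
		lock (repositionLock)
		{
			position = value;
			IsChangePosition = true;
		}
	}
}

// inside Read, take the same lock when consuming the flag
private void SetPosition()
{
	lock (repositionLock)
	{
		if (IsChangePosition)
		{
			long nsPosition = (10000000L * position) / waveFormat.AverageBytesPerSecond;
			var pv = PropVariant.FromLong(nsPosition);
			pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
			IsChangePosition = false;
			decoderOutputCount = 0;
			decoderOutputOffset = 0;
		}
	}
}

// ...and at the end of Read, protect the increment as well:
// lock (repositionLock) { position += bytesWritten; }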

Nov 27, 2012 at 2:11 PM

I'm having a little trouble following you (OK, my neurons are not getting any younger).

Normally everything happens inside the Read method, so any change of position is taken into account on the next pass through Read. I have a little trouble seeing why I should protect against another thread.
Would a lock(...) be the answer?

Nov 27, 2012 at 2:17 PM

OK, here's an example:

1) Playback thread starts calling Read(). Position = 100000, asking for 100 bytes, but Read doesn't finish before...

2) The user sets Position from the GUI thread to 200000

3) Back on the playback thread, Read returns 100 bytes and increments position to 200100 (even though the actual position is 100100)

4) Now on the next call to Read, we do the reposition, but not to the point that was asked for. We reposition to 200100.

It's not particularly serious. There is a nastier race condition where you reposition right in the middle of position being updated, which would cause the reposition to be ignored.

Mark

Nov 27, 2012 at 3:33 PM

Thank you for your patience.

I understand, but the more I look at the Read method, the more I tell myself that this case is not possible, since it uses temporary buffers.

- If the Position property is changed, all the buffers are cleared.

I renamed the variable "position" to "_varPosition" and "IsChangePosition" to "PropertiesPositionChanged" to avoid confusion with the Position property.

I placed the file on WeTransfer: http://wtrns.fr/dOpzZ5h833N1j9Y

But you're probably right.

 

public override long Position
{
	get { return _varPosition; }
	set
	{
		_varPosition = value;
		PropertiesPositionChanged= true;
	}
}

public override int Read(byte[] buffer, int offset, int count)
{
		
	if (MFStart == false)
	{
		pReaderTh = Init();
		PropertiesPositionChanged= true; // Force Change Position to start
					
		MFStart = true;
	}


	if (PropertiesPositionChanged)
	{
		long nsPosition = (10000000L * _varPosition) / waveFormat.AverageBytesPerSecond;
		var pv = PropVariant.FromLong(nsPosition);
		pReaderTh.SetCurrentPosition(Guid.Empty, ref pv);
		PropertiesPositionChanged= false;
		decoderOutputCount = 0;
		decoderOutputOffset = 0;
	}

	int bytesWritten = 0;
				
	if (decoderOutputCount > 0)
	{
		bytesWritten += ReadFromDecoderBuffer(buffer, offset, count - bytesWritten);
	}

	while (bytesWritten < count)
	{
		IMFSample pSample;
		int dwFlags;
		ulong timestamp;
		int actualStreamIndex;
		pReaderTh.ReadSample(MediaFoundationInterop.MF_SOURCE_READER_FIRST_AUDIO_STREAM, 0, out actualStreamIndex, out dwFlags, out timestamp, out pSample);
		if (dwFlags != 0)
		{
			break;
		}

		IMFMediaBuffer pBuffer;
		pSample.ConvertToContiguousBuffer(out pBuffer);
		IntPtr pAudioData = IntPtr.Zero;
		int cbBuffer;
		int pcbMaxLength;
		pBuffer.Lock(out pAudioData, out pcbMaxLength, out cbBuffer);
		EnsureBuffer(cbBuffer);
		Marshal.Copy(pAudioData, decoderOutputBuffer, 0, cbBuffer);
		decoderOutputOffset = 0;
		decoderOutputCount = cbBuffer;

		bytesWritten += ReadFromDecoderBuffer(buffer, offset + bytesWritten, count - bytesWritten);


		pBuffer.Unlock();
		Marshal.ReleaseComObject(pBuffer);
		Marshal.ReleaseComObject(pSample);
	}
	_varPosition += bytesWritten;
	return bytesWritten;
}

Nov 27, 2012 at 3:37 PM

cool, when I do the multi-threading work on MediaFoundationReader I might make use of this. There is still a vast amount of interop to be written to support creating files, live streaming, and applying Media Foundation Transforms, so it will be a while before NAudio is ready to properly announce MF support.

Nov 27, 2012 at 3:50 PM

I already have a large set of the interfaces, enums and structures.

Maybe I can give you a head start :-))

Just give me a little time to clean up the library.

Are you interested?

Nov 27, 2012 at 3:53 PM

sure, thanks. I have a couple of partial conversions I can rely on in places, but another one could speed up the process. With .NET interop, there are often several ways of wrapping the same functions, and it can be hard to decide which one is the best approach.