We can access a .wav file stored as an embedded resource in the project with the code below, which gets us a stream to the sound file's contents. (This assumes the project contains a SoundFiles sub-directory holding the .wav files, and that each file's build action is set to Embedded Resource.)
var asm = Assembly.GetExecutingAssembly();
var resourceStream = asm?.GetManifestResourceStream(
    "MyProject.SoundFiles.MySound.wav"); // hypothetical fully-qualified resource name
Once we have the stream, we can use it to construct a WaveFileReader and hand that instance to an output player (WasapiOut, for example). To control the playback volume, we could use IWavePlayer.Volume, but...
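For reference, a minimal sketch of this playback path (the resource name is hypothetical, and this assumes the NAudio package):

```csharp
using System.Reflection;
using System.Threading;
using NAudio.Wave;

var asm = Assembly.GetExecutingAssembly();
// "MyApp.SoundFiles.Chime.wav" is a hypothetical resource name
using var stream = asm.GetManifestResourceStream("MyApp.SoundFiles.Chime.wav");
using var reader = new WaveFileReader(stream); // WaveFileReader accepts a stream
using var output = new WasapiOut();
output.Init(reader);
output.Play();
// simple blocking wait until playback finishes
while (output.PlaybackState == PlaybackState.Playing)
    Thread.Sleep(100);
```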
... the doc comments for IWavePlayer.Volume indicate that it should not be used, and that volume should instead be set "on your input WaveProvider". The problem is that WaveFileReader does not have a Volume property. So when using WaveFileReader, there doesn't appear to be any way to control the volume "correctly". Worse, setting Volume on the IWavePlayer appears to change the global system sound volume, which is not desirable.
An alternative is the AudioFileReader class. While it does have a Volume property, it has no constructor that accepts a stream; it only takes a file path.
What we're doing now is: get the stream for the embedded resource, copy its contents out to a temp file, point AudioFileReader at the temp file, play the sound, and then delete the temp file when done.
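A sketch of that workaround, for clarity (resource name hypothetical, assumes the NAudio package):

```csharp
using System;
using System.IO;
using System.Reflection;
using System.Threading;
using NAudio.Wave;

var asm = Assembly.GetExecutingAssembly();
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".wav");

// Copy the embedded resource out to a temp file, since AudioFileReader
// only accepts a file path. Resource name is hypothetical.
using (var resource = asm.GetManifestResourceStream("MyApp.SoundFiles.Chime.wav"))
using (var file = File.Create(tempPath))
{
    resource.CopyTo(file);
}

try
{
    using var reader = new AudioFileReader(tempPath);
    reader.Volume = 0.5f; // AudioFileReader does expose Volume
    using var output = new WasapiOut();
    output.Init(reader);
    output.Play();
    while (output.PlaybackState == PlaybackState.Playing)
        Thread.Sleep(100);
}
finally
{
    File.Delete(tempPath); // clean up the temp file when done
}
```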
This works, but it seems hacky and unnecessary. I'm posting this to suggest one of the following:
1) Add a Volume property to the WaveFileReader class
2) Add the ability to construct an AudioFileReader from a stream instead of just a file path string
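For illustration, either suggestion might allow something like the following. Neither API exists in NAudio today; both lines marked "hypothetical" are the proposed additions:

```csharp
using System.Reflection;
using NAudio.Wave;

var asm = Assembly.GetExecutingAssembly();
using var stream = asm.GetManifestResourceStream("MyApp.SoundFiles.Chime.wav");

// Suggestion 1: a Volume property on WaveFileReader
using var reader = new WaveFileReader(stream);
// reader.Volume = 0.5f;                         // hypothetical

// Suggestion 2: a stream-accepting AudioFileReader constructor
// using var reader2 = new AudioFileReader(stream); // hypothetical
// reader2.Volume = 0.5f;
```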