
Creating a mixed file of 4 sounds. How can I achieve that?

Feb 27, 2015 at 1:33 AM
So, I am working on something that needs simple user interaction.
I have 4 sounds: 1 background track and 3 others for user interaction; each interaction sound plays when its respective button is pressed.

I am using AudioFileReader to read the MP3 files and WaveOut to play them.
I need to record a file that contains the background audio together with the user's interaction sounds. How can I do that? (I searched and found WaveIn, WaveStream and WASAPI too, but I don't know which of them is best for this.)

Another thing: the recorded audio has to be an MP3. I know how to record to WAV, but I need an MP3 file. Does anyone have any advice?

Thanks
Coordinator
Feb 27, 2015 at 6:39 AM
Look into MixingSampleProvider to perform the mixing.
Look into MediaFoundationEncoder to create the MP3 (although this is usually only available on Windows 8 or above).
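A minimal offline sketch of that suggestion (file names and bitrate are placeholders, and it assumes all four files share the same sample rate and channel count):

    using NAudio.MediaFoundation;
    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // AudioFileReader exposes each MP3 as IEEE float samples
    var background = new AudioFileReader("background.mp3");

    // every input added to the mixer must match this WaveFormat
    var mixer = new MixingSampleProvider(background.WaveFormat);
    // the cast picks the ISampleProvider overload of AddMixerInput
    mixer.AddMixerInput((ISampleProvider)background);
    mixer.AddMixerInput((ISampleProvider)new AudioFileReader("sound1.mp3"));
    mixer.AddMixerInput((ISampleProvider)new AudioFileReader("sound2.mp3"));
    mixer.AddMixerInput((ISampleProvider)new AudioFileReader("sound3.mp3"));

    // Media Foundation MP3 encoding (Windows 8 or above)
    MediaFoundationApi.Startup();
    MediaFoundationEncoder.EncodeToMp3(mixer.ToWaveProvider16(), "mixed.mp3", 192000);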
Feb 27, 2015 at 3:04 PM
Thanks for the reply Mark.
But for now that looks like it will be too much work. I have been trying but without any success. :(
So now I have a new possibility: set the MP3 files aside and work with WAV files instead.
But MixingSampleProvider throws an error saying all inputs have to have the same WaveFormat. I searched for this, and in one example you create new files with a matching WaveFormat.
Even if I am working with 16-bit WAV files (converted in Audacity), do I still need to create new files?
Could you give some more explanation about it?

Thanks again.
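For reference, the same-WaveFormat requirement can usually be met inside the signal chain rather than by creating new files on disk. A minimal sketch using NAudio's WdlResamplingSampleProvider and MonoToStereoSampleProvider, with a hypothetical MatchFormat helper and placeholder file names:

    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // hypothetical helper: adapt any input to the mixer's sample rate and channel count
    ISampleProvider MatchFormat(ISampleProvider input, WaveFormat target)
    {
        if (input.WaveFormat.SampleRate != target.SampleRate)
            input = new WdlResamplingSampleProvider(input, target.SampleRate);
        if (input.WaveFormat.Channels == 1 && target.Channels == 2)
            input = new MonoToStereoSampleProvider(input);
        return input;
    }

    var mixerFormat = WaveFormat.CreateIeeeFloatWaveFormat(44100, 2);
    var mixer = new MixingSampleProvider(mixerFormat);
    mixer.AddMixerInput(MatchFormat(new AudioFileReader("background.wav"), mixerFormat));
    mixer.AddMixerInput(MatchFormat(new AudioFileReader("sound1.wav"), mixerFormat));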
Feb 27, 2015 at 9:02 PM
Hello again,
So now I understand the process a little better, and I started to use your AudioPlaybackEngine code. Btw, nice code.
I had to make one change (just the SampleRate value, to 16000) - which I can change back if necessary - to work with my audio files.
AudioPlaybackEngine is working pretty well.
But, I still have a problem.
How can I record the output of the AudioPlaybackEngine's mixer?
I tried to access the mixer directly, but the application throws an error saying the WAV file is too large.

Thanks.
Feb 28, 2015 at 12:16 PM
Edited Feb 28, 2015 at 12:19 PM
I was looking for a way to get this done, and I found this post (here).
I was wondering if the problem could simply be that the mix never stops.
But then I need to know when the main sound has stopped.
Remember, I am using AudioPlaybackEngine.

Thanks, and sorry if I am being annoying, but I really need to get this done.
Coordinator
Feb 28, 2015 at 12:30 PM
Yeah, don't try to save a never-ending stream to a WAV file! I'd recommend picking a maximum length for the file and, when it gets over that length, starting a new WAV file.
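One way to bound the file, sketched under the assumption that rendering the mix offline for a fixed duration is acceptable (the file name and duration are placeholders):

    using System;
    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    var background = new AudioFileReader("background.wav");
    var mixer = new MixingSampleProvider(background.WaveFormat) { ReadFully = true };
    mixer.AddMixerInput((ISampleProvider)background);

    // Take ends the otherwise never-ending mixer after a fixed duration,
    // so the writer can finish instead of growing until 'WAV file too large'
    var limited = new OffsetSampleProvider(mixer) { Take = TimeSpan.FromSeconds(30) };
    WaveFileWriter.CreateWaveFile16("mixed.wav", limited);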
Feb 28, 2015 at 1:06 PM
The length of the 'mixed' audio should be the length of the background audio (the first audio to play). OK?
The other audio clips are played while the background audio is playing.
This part is working right now; I add them to the MixingSampleProvider instance as and when necessary.
But saving this mixed file is my problem. Will I need to read the bytes from the mixer instance again in order to write a file?
Please, could you give more info about this?

Thanks again for being so nice.
Coordinator
Mar 1, 2015 at 6:55 AM
Hi, please check my article on saving and playing audio at the same time. That should point you in the right direction.
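The core of that approach is a pass-through IWaveProvider that copies everything the output device reads into a WaveFileWriter. A sketch reconstructed from memory of the article (treat the details as assumptions rather than the exact article code):

    using System;
    using NAudio.Wave;

    // pass-through provider: whatever the output device reads is also written to a WAV file
    class SavingWaveProvider : IWaveProvider, IDisposable
    {
        private readonly IWaveProvider source;
        private readonly WaveFileWriter writer;
        private bool writerDisposed;

        public SavingWaveProvider(IWaveProvider source, string wavFilePath)
        {
            this.source = source;
            writer = new WaveFileWriter(wavFilePath, source.WaveFormat);
        }

        public WaveFormat WaveFormat => source.WaveFormat;

        public int Read(byte[] buffer, int offset, int count)
        {
            var read = source.Read(buffer, offset, count);
            if (read > 0 && !writerDisposed)
                writer.Write(buffer, offset, read);   // tee the audio to disk
            return read;
        }

        // must be called when playback stops so the WAV header gets finalized
        public void Dispose()
        {
            if (!writerDisposed)
            {
                writerDisposed = true;
                writer.Dispose();
            }
        }
    }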
Mar 2, 2015 at 12:50 PM
Hi, so I have tried to do it with SavingWaveProvider and BufferedWaveProvider, as in your post about saving and playing audio.
It creates the WAV file, but there is no audio in it. Why?
I looked at BufferedWaveProvider and something needs to call its AddSamples function somewhere. How can I hook that up to the MixingSampleProvider? Or to WaveOut?
Here is some code:
    bufferedProviderRec = new BufferedWaveProvider(AudioPlaybackEngine.Instance.Mixer.WaveFormat);
    savingRec = new SavingWaveProvider(bufferedProviderRec, "mixedAudio.wav");
    outMix = new WaveOut();
    outMix.Init(savingRec);
    outMix.Play();
    AudioPlaybackEngine.Instance.PlaySound(players[0].GetInfos.Path + "/" + players[0].GetInfos.NameWithExtension);
Thanks again for being so nice.
Coordinator
Mar 2, 2015 at 3:41 PM
OK, if you are using AudioPlaybackEngine, then SavingWaveProvider needs to be put into the signal chain for that. Look for the call to Init in there, and wrap whatever is being passed in (probably a mixer) in a SavingWaveProvider.
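A minimal sketch of that change, assuming the engine's constructor looks roughly like the fire-and-forget AudioPlaybackEngine (the outputDevice, mixer and saver field names are assumptions):

    // inside AudioPlaybackEngine: wrap the mixer before it reaches Init
    private AudioPlaybackEngine(int sampleRate = 16000, int channelCount = 2)
    {
        outputDevice = new WaveOutEvent();
        mixer = new MixingSampleProvider(
            WaveFormat.CreateIeeeFloatWaveFormat(sampleRate, channelCount));
        mixer.ReadFully = true;

        // the saver sits between the mixer and the output device, so everything
        // the device reads for playback is also written to mixedAudio.wav
        saver = new SavingWaveProvider(mixer.ToWaveProvider(), "mixedAudio.wav");
        outputDevice.Init(saver);
        outputDevice.Play();
    }

    // remember to call saver.Dispose() when playback is finished,
    // otherwise the WAV header is never finalized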
Mar 2, 2015 at 4:42 PM
Could you explain a little more?
I don't understand what a signal chain is.

Thanks
Coordinator
Mar 5, 2015 at 2:57 PM
A signal chain is where you connect together IWaveProviders and ISampleProviders. So, for example, pass two AudioFileReaders into a MixingSampleProvider, then pass that into a SavingWaveProvider, and then pass that into Init of a WaveOut device. If you have access to Pluralsight, I go into this in a lot of detail in my NAudio course.
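Spelled out as code, that chain might look like this (file names are placeholders, and SavingWaveProvider is the wrapper class sketched earlier in the thread):

    // AudioFileReaders -> MixingSampleProvider -> SavingWaveProvider -> WaveOut
    var background = new AudioFileReader("background.wav");
    var effect = new AudioFileReader("effect1.wav");

    // both inputs must already share the mixer's WaveFormat
    var mixer = new MixingSampleProvider(background.WaveFormat);
    mixer.AddMixerInput((ISampleProvider)background);
    mixer.AddMixerInput((ISampleProvider)effect);

    // everything WaveOut pulls through the saver also goes to disk
    var saver = new SavingWaveProvider(mixer.ToWaveProvider(), "mixed.wav");

    var outputDevice = new WaveOut();
    outputDevice.Init(saver);
    outputDevice.Play();

    // when playback has finished:
    // outputDevice.Stop();
    // saver.Dispose();   // finalizes the WAV header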