Question

I am building a game that lets users remix songs. I have built a mixer based on Apple's MixerHost sample code (which creates an AUGraph with a mixer audio unit), but expanded it to load 12 tracks. Everything works fine; however, it takes a really long time for the songs to load when the player selects the songs they want to remix, because the program has to load 12 separate MP4 files into memory before the music can start playing.

I think what I need is an AUFilePlayer audio unit that is in charge of feeding each file into the mixer. If the AUFilePlayer can load the file on the fly, the user won't have to wait for the files to load 100% into memory. My two questions are: 1. Can an AUFilePlayer be used this way? 2. The documentation on AUFilePlayer is very thin. Where can I find example code demonstrating how to implement an AUFilePlayer properly on iOS (not on macOS)?

Thanks


Solution

I think you're right: in this case a 'direct-from-disk' buffering approach is probably what you need. I believe the correct audio unit subtype is AudioFilePlayer (kAudioUnitSubType_AudioFilePlayer). From the documentation:

The unit reads and converts audio file data into its own internal buffers. It performs disk I/O on a high-priority thread shared among all instances of this unit within a process. Upon completion of a disk read, the unit internally schedules buffers for playback.
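As a rough sketch of how that fits into a MixerHost-style AUGraph on iOS: you add one generator node of this subtype per track and wire it to a mixer input bus. The helper below is only an illustration, and the `graph`, `mixerNode`, and `busNumber` parameters are assumptions about your existing setup.

```c
#include <AudioToolbox/AudioToolbox.h>

// Sketch: add an AudioFilePlayer node to an existing (already opened)
// AUGraph and connect its output to one input bus of the mixer node.
static OSStatus AddFilePlayerToGraph(AUGraph graph,
                                     AUNode mixerNode,
                                     UInt32 busNumber,
                                     AUNode *outFilePlayerNode,
                                     AudioUnit *outFilePlayerUnit)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Generator,
        .componentSubType      = kAudioUnitSubType_AudioFilePlayer,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };

    OSStatus err = AUGraphAddNode(graph, &desc, outFilePlayerNode);
    if (err != noErr) return err;

    // The graph must already be open (AUGraphOpen) before asking for the unit.
    err = AUGraphNodeInfo(graph, *outFilePlayerNode, NULL, outFilePlayerUnit);
    if (err != noErr) return err;

    // Route the player's output into the chosen mixer input bus.
    return AUGraphConnectNodeInput(graph, *outFilePlayerNode, 0,
                                   mixerNode, busNumber);
}
```

You would call this once per track (12 times in your case), one mixer bus each, before initializing the graph.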

A working example of using this unit on Mac OS X is given in Chris Adamson's book Learning Core Audio. The code for iOS isn't much different, and is discussed in this thread on the CoreAudio-API mailing list. Adamson's working code example can be found here. You should be able to adapt this to your requirements.
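To give a sense of what the adaptation involves: after AUGraphInitialize, each player unit has to be told which file to play and when to start, via the scheduled-file properties. This is a minimal sketch under the same assumptions; the function name and the `filePlayerUnit`/`fileURL` parameters are hypothetical, and error handling is kept short.

```c
#include <AudioToolbox/AudioToolbox.h>

// Sketch: schedule an audio file on an already-initialized AudioFilePlayer
// unit. The unit streams the data from disk itself once scheduled.
static OSStatus ScheduleFileOnPlayer(AudioUnit filePlayerUnit, CFURLRef fileURL)
{
    AudioFileID audioFile;
    OSStatus err = AudioFileOpenURL(fileURL, kAudioFileReadPermission, 0, &audioFile);
    if (err != noErr) return err;

    // Tell the unit which file it will play.
    err = AudioUnitSetProperty(filePlayerUnit,
                               kAudioUnitProperty_ScheduledFileIDs,
                               kAudioUnitScope_Global, 0,
                               &audioFile, sizeof(audioFile));
    if (err != noErr) return err;

    // Work out the file's length in frames so the whole file can be scheduled.
    AudioStreamBasicDescription fileFormat;
    UInt32 propSize = sizeof(fileFormat);
    AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat,
                         &propSize, &fileFormat);

    UInt64 packetCount = 0;
    propSize = sizeof(packetCount);
    AudioFileGetProperty(audioFile, kAudioFilePropertyAudioDataPacketCount,
                         &propSize, &packetCount);

    ScheduledAudioFileRegion region = {0};
    region.mTimeStamp.mFlags      = kAudioTimeStampSampleTimeValid;
    region.mTimeStamp.mSampleTime = 0;           // region starts at the top of the file
    region.mAudioFile             = audioFile;
    region.mLoopCount             = 0;           // play once
    region.mStartFrame            = 0;
    region.mFramesToPlay          = (UInt32)(packetCount * fileFormat.mFramesPerPacket);

    err = AudioUnitSetProperty(filePlayerUnit,
                               kAudioUnitProperty_ScheduledFileRegion,
                               kAudioUnitScope_Global, 0,
                               &region, sizeof(region));
    if (err != noErr) return err;

    // Start playback at the next render cycle (-1 means "as soon as possible").
    AudioTimeStamp startTime = {0};
    startTime.mFlags      = kAudioTimeStampSampleTimeValid;
    startTime.mSampleTime = -1;
    return AudioUnitSetProperty(filePlayerUnit,
                                kAudioUnitProperty_ScheduleStartTimeStamp,
                                kAudioUnitScope_Global, 0,
                                &startTime, sizeof(startTime));
}
```

Note that the scheduling must happen after AUGraphInitialize, and the AudioFileID has to stay open for as long as the unit is playing that file, so keep the 12 AudioFileIDs around and close them only when you tear the remix session down.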
