Question

I've been struggling with this for about a year now, trying to pin down my problem and present it in a form others can follow.

I've been writing an app that depends on GarageBand-style recording. That is, I want to record the user for exactly 8 beats and then let them loop it. I play a metronome for the user at the same time (the user wears headphones to hear the metronome while recording into the mic on their device).

I can manage to turn on recording for about 4.8 seconds (0.6 s × 8 beats), and the timer says it ran for 4.8 seconds, yet my audio recording is always a bit shorter than 4.8: more like 4.78 or 4.71 seconds, which throws the loop out of whack.
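
To put numbers on "a bit shorter" (using the 44.1 kHz sample rate I record at):

    0.6 s/beat × 8 beats = 4.8 s target loop length
    4.80 s × 44,100 frames/s = 211,680 frames
    4.78 s × 44,100 frames/s = 210,798 frames (882 frames short)
    4.71 s × 44,100 frames/s = 207,711 frames (3,969 frames short)

Even the smallest shortfall is an audible gap once the recording loops.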

I've experimented with AVAudioRecorder, Audio Queue, and Audio Units, thinking one of the latter might solve my problem.

I am using an NSTimer that fires every 0.6 seconds, playing a short blip for the metronome. After 4 beats, the metronome timer's handler turns on the recorder, which records for 4.8 seconds and then stops.

I'm using time intervals to measure how long the metronome runs (it looks pretty tight, at 4.800xxx) and comparing that to the duration of the audio file, which is always different.

I wish I could attach my project, but I guess I'll just have to settle for attaching my header and implementation. To test, you'll have to make a project with the following IB elements:

- Record, Play, and Stop buttons
- Song/Track duration label
- Timer duration label
- Debug label

If you launch the app and hit Record, you are 'counted in' with 4 beats, then the recorder starts; tap your finger on the desk while it records. After 8 more beats (12 in total) the recording stops.

You can see in the labels that the recorded track is a little shorter than 4.8 seconds, and in some cases a lot shorter, which keeps the audio from looping properly.

Does anyone know what I can do to tighten this up? Thanks for reading.

Here's my code:

//
//  ViewController.h
//  speakagain
//
//  Created by NOTHING on 2014-03-18.
//

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import "CoreAudio/CoreAudioTypes.h"
#import <AudioToolbox/AudioQueue.h>
#import <AudioToolbox/AudioFile.h>
#import <AVFoundation/AVFoundation.h>

#define kNumberBuffers 3   // number of queue buffers; not defined in the original post, 3 matches Apple's Audio Queue sample

@interface ViewController : UIViewController
{
    IBOutlet UIButton *btnRecord;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIButton *btnStop;
    IBOutlet UILabel *debugLabel;
    IBOutlet UILabel *timerDuration;
    IBOutlet UILabel *songDuration;

    //UILabel *labelDebug;

    struct AQRecorderState {
        AudioStreamBasicDescription  mDataFormat;
        AudioQueueRef                mQueue;
        AudioQueueBufferRef          mBuffers[kNumberBuffers];
        AudioFileID                  mAudioFile;
        UInt32                       bufferByteSize;
        SInt64                       mCurrentPacket;
        bool                         mIsRunning;                    // true while the queue is capturing

    };
    struct AQRecorderState aqData;
    AVAudioPlayer *audioPlayer;

    NSString *songName;
    NSTimer *recordTimer;
    NSTimer *metroTimer;
    NSTimeInterval startTime, endTime, elapsedTime;

    int inputBuffer;
    int beatNumber;

}
@property (nonatomic, retain)   IBOutlet UIButton *btnRecord;
@property (nonatomic, retain)   IBOutlet UIButton *btnPlay;
@property (nonatomic, retain)   IBOutlet UIButton *btnStop;
@property (nonatomic, retain)   IBOutlet UILabel *debugLabel;
@property (nonatomic, retain) IBOutlet UILabel *timerDuration;
@property (nonatomic, retain) IBOutlet UILabel *songDuration;


- (IBAction) record;
- (IBAction) stop;
- (IBAction) play;

static void HandleInputBuffer (void                                *aqData,
                               AudioQueueRef                       inAQ,
                               AudioQueueBufferRef                 inBuffer,
                               const AudioTimeStamp                *inStartTime,
                               UInt32                              inNumPackets,
                               const AudioStreamPacketDescription  *inPacketDesc);

@end

Implementation:

//
    //  ViewController.m
    //  speakagain
    //
    //  Created by NOTHING on 2014-03-18.
    //

    #import "ViewController.h"


    @interface ViewController ()

    @end

    @implementation ViewController
    @synthesize btnPlay, btnRecord,btnStop,songDuration, timerDuration, debugLabel;


    - (void)viewDidLoad
    {
        [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.

        debugLabel.text = @"";
        songName = @"TestingQueue.caf";
    }
    - (void)prepareAudioQueue
    {
        //struct AQRecorderState *pAqData;
        inputBuffer=0;
        aqData.mDataFormat.mFormatID         = kAudioFormatLinearPCM;
        aqData.mDataFormat.mSampleRate       = 44100.0;
        aqData.mDataFormat.mChannelsPerFrame = 1;
        aqData.mDataFormat.mBitsPerChannel   = 16;
        aqData.mDataFormat.mBytesPerPacket   =
        aqData.mDataFormat.mBytesPerFrame = aqData.mDataFormat.mChannelsPerFrame * sizeof (SInt16);
        aqData.mDataFormat.mFramesPerPacket  = 1;

        //    AudioFileTypeID fileType             = kAudioFileAIFFType;
        AudioFileTypeID fileType             = kAudioFileCAFType;
        aqData.mDataFormat.mFormatFlags = kLinearPCMFormatFlagIsBigEndian| kLinearPCMFormatFlagIsSignedInteger| kLinearPCMFormatFlagIsPacked;

        AudioQueueNewInput (&aqData.mDataFormat,HandleInputBuffer, &aqData,NULL, kCFRunLoopCommonModes, 0,&aqData.mQueue);

        UInt32 dataFormatSize = sizeof (aqData.mDataFormat);

        // in Mac OS X, instead use
        //    kAudioConverterCurrentInputStreamDescription
        AudioQueueGetProperty (aqData.mQueue,kAudioQueueProperty_StreamDescription,&aqData.mDataFormat,&dataFormatSize);

        //Verify
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];

        NSLog(@"INITIALIZING FILE");
        if ([fileManager fileExistsAtPath:txtPath] == YES) {
            NSLog(@"PREVIOUS FILE REMOVED");
            [fileManager removeItemAtPath:txtPath error:nil];
        }


        const char *filePath = [txtPath UTF8String];
        CFURLRef audioFileURL = CFURLCreateFromFileSystemRepresentation ( NULL,(const UInt8 *) filePath,strlen (filePath),false );
        AudioFileCreateWithURL (audioFileURL,fileType,&aqData.mDataFormat, kAudioFileFlags_EraseFile,&aqData.mAudioFile );
        CFRelease (audioFileURL);   //CFURLCreate* returns an owned reference; release it to avoid a leak

        DeriveBufferSize (aqData.mQueue,aqData.mDataFormat,0.5,&aqData.bufferByteSize);

        for (int i = 0; i < kNumberBuffers; ++i)
        {
            AudioQueueAllocateBuffer (aqData.mQueue,aqData.bufferByteSize,&aqData.mBuffers[i]);
            AudioQueueEnqueueBuffer (aqData.mQueue,aqData.mBuffers[i], 0,NULL );
        }

    }

    - (void) metronomeFire
    {
        if(beatNumber < 5)
        {
            //count-in: play the metronome blip, but don't record yet
            debugLabel.text = @"count in (1,2,3,4)";
            [self playSound];
        }
        else if(beatNumber == 5)
        {
            //beat 5: start recording
            aqData.mCurrentPacket = 0;
            aqData.mIsRunning = true;
            startTime = [NSDate timeIntervalSinceReferenceDate];
            recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
            AudioQueueStart (aqData.mQueue,NULL);
            debugLabel.text = @"Recording for 8 beats (1,2,3,4 1,2,3,4)";
            [self playSound];
        }
        else if (beatNumber < 12)
        {   //play the metronome on beats 6-11
            [self playSound];
        }
        else if (beatNumber == 12)
        {   //final beat: stop the metronome (recording stops via recordTimer)
            [metroTimer invalidate]; metroTimer = nil;
            [self playSound];
        }

        beatNumber++;
    }
    - (IBAction) play
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];
        NSURL *url = [NSURL fileURLWithPath:txtPath];

        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;
        }
        NSError *error;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];

        if (audioPlayer == nil)
        {
            NSLog(@"%@",[error description]);
        }
        else
        {
            audioPlayer.numberOfLoops = -1;   //-1 = loop indefinitely; set before play, not after
            [audioPlayer play];
        }
    }
    - (void) killTimer
    {
        //this is the timer function.  Runs once after 4.8 seconds.
       [self stop];

    }
    - (IBAction) stop
    {
        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;
        }
        else
        {

            if(metroTimer)
            {
                [metroTimer invalidate];metroTimer = nil;
            }
            if(recordTimer)
            {
                //prevent a second -stop when the 4.8 s timer fires later
                [recordTimer invalidate];recordTimer = nil;
            }
            //Stop the audio queue
            AudioQueueStop (aqData.mQueue,true);
            aqData.mIsRunning = false;
            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);

            //Get elapsed time of timer
            endTime = [NSDate timeIntervalSinceReferenceDate];
            elapsedTime = endTime - startTime;

            //Get elapsed time of audio file
            NSArray *pathComponents = [NSArray arrayWithObjects:
                                       [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                                       songName,
                                       nil];
            NSURL *audioFileURL = [NSURL fileURLWithPathComponents:pathComponents];
            AVURLAsset* audioAsset = [AVURLAsset URLAssetWithURL:audioFileURL options:nil];
            CMTime audioDuration = audioAsset.duration;
            float audioDurationSeconds = CMTimeGetSeconds(audioDuration);

            //Log values
            NSLog(@"Track Duration: %f",audioDurationSeconds);
            NSLog(@"Timer Duration: %.6f", elapsedTime);

            //Show values on GUI too
            songDuration.text = [NSString stringWithFormat: @"Track Duration: %f",audioDurationSeconds];
            timerDuration.text = [NSString stringWithFormat:@"Timer Duration: %.6f", elapsedTime];
            debugLabel.text = @"Why is the duration of the track less than the duration the timer ran?";
        }


    }
    -(void) playSound
    {
        //create the system sound once and reuse it; creating a new
        //SystemSoundID on every beat leaks the previous one
        static SystemSoundID soundID = 0;
        if (soundID == 0)
        {
            NSString *path = [[NSBundle mainBundle] pathForResource:@"blip2" ofType:@"aif"];
            AudioServicesCreateSystemSoundID((__bridge CFURLRef)[NSURL fileURLWithPath:path],  &soundID);
        }
        AudioServicesPlaySystemSound (soundID);
    }

    - (IBAction) record
    {
        [self prepareAudioQueue];
        songDuration.text = @"";
        timerDuration.text = @"";
        //debugLabel.text = @"Please wait 12 beats (The first four are count in)";
        //init beat number
        beatNumber = 1;

        //safe guard
        if(aqData.mIsRunning)
        {
            AudioQueueStop (aqData.mQueue,true);

            aqData.mIsRunning = false;

            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);
        }

        //start count in (metro will start recording)
        //aqData.mCurrentPacket = 0;
        //aqData.mIsRunning = true;
        startTime = [NSDate timeIntervalSinceReferenceDate];
        metroTimer = [NSTimer scheduledTimerWithTimeInterval:.6 target:self selector:@selector(metronomeFire) userInfo:nil repeats:YES];
        //recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
        //AudioQueueStart (aqData.mQueue,NULL);

    }
    static void HandleInputBuffer (void                                *aqData,
                                   AudioQueueRef                       inAQ,
                                   AudioQueueBufferRef                 inBuffer,
                                   const AudioTimeStamp                *inStartTime,
                                   UInt32                              inNumPackets,
                                   const AudioStreamPacketDescription  *inPacketDesc)
    {
        //boiler plate
        NSLog(@"HandleInputBuffer");

        struct AQRecorderState *pAqData = (struct AQRecorderState *) aqData;

        if (inNumPackets == 0 && pAqData->mDataFormat.mBytesPerPacket != 0)
            inNumPackets = inBuffer->mAudioDataByteSize / pAqData->mDataFormat.mBytesPerPacket;

        if (AudioFileWritePackets (pAqData->mAudioFile, false, inBuffer->mAudioDataByteSize,
                                   inPacketDesc, pAqData->mCurrentPacket, &inNumPackets,
                                   inBuffer->mAudioData) == noErr)
        {
            pAqData->mCurrentPacket += inNumPackets;
        }

        if (pAqData->mIsRunning == 0)
            return;

        AudioQueueEnqueueBuffer (pAqData->mQueue,inBuffer,0,NULL);
    }

    void DeriveBufferSize(AudioQueueRef audioQueue,AudioStreamBasicDescription ASBDescription,Float64 seconds,UInt32 *outBufferSize)
    {
        //boiler plate
        static const int maxBufferSize = 0x50000;
        int maxPacketSize = ASBDescription.mBytesPerPacket;
        if(maxPacketSize == 0)
        {
            UInt32 maxVBRPacketSize = sizeof(maxPacketSize);
            AudioQueueGetProperty(audioQueue, kAudioQueueProperty_MaximumOutputPacketSize, &maxPacketSize, &maxVBRPacketSize);
            NSLog(@"max buffer = %d",maxPacketSize);
        }
        Float64 numBytesForTime = ASBDescription.mSampleRate * maxPacketSize * seconds;
        *outBufferSize = (UInt32)(numBytesForTime < maxBufferSize ? numBytesForTime : maxBufferSize);
    }

    OSStatus SetMagicCookieForFile (AudioQueueRef inQueue, AudioFileID inFile)
    {
        //boiler plate
        OSStatus result = noErr;
        UInt32 cookieSize;
        if (AudioQueueGetPropertySize (inQueue,kAudioQueueProperty_MagicCookie,&cookieSize) == noErr)
        {
            char* magicCookie =(char *) malloc (cookieSize);
            if (AudioQueueGetProperty (inQueue,kAudioQueueProperty_MagicCookie,magicCookie,&cookieSize) == noErr)
            {
                result =    AudioFileSetProperty (inFile,kAudioFilePropertyMagicCookieData,cookieSize,magicCookie);
            }

            free (magicCookie);
        }
        return result;

    }

    - (void)didReceiveMemoryWarning
    {
        [super didReceiveMemoryWarning];
        // Dispose of any resources that can be recreated.
    }
    @end

Solution

This is a big topic, so I doubt you'll get an answer thorough enough to re-architect the code you've provided. However, I can give you links that will supply the vast majority of what you require.

The first thing to know is that NSTimer will never work for this, due to synchronisation issues: it fires from the run loop, so each callback can land milliseconds late relative to the audio clock, and those errors accumulate. Also, forget AudioQueue and AVAudioRecorder; only an Audio Unit is low-level enough for your needs. A sketch of the setup follows.
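
To make that concrete, here is a minimal sketch of standing up a RemoteIO unit for mic input. It is a skeleton under stated assumptions, not a drop-in replacement for your class: -setUpRemoteIO and InputCallback are illustrative names, and the stream format just mirrors the 16-bit mono 44.1 kHz PCM you already use.

    #import <AudioUnit/AudioUnit.h>

    //called on the audio thread once per hardware buffer, with a
    //sample-accurate timestamp; pull the captured frames out of the
    //unit with AudioUnitRender, then count them (see further below)
    static OSStatus InputCallback (void                       *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp       *inTimeStamp,
                                   UInt32                      inBusNumber,
                                   UInt32                      inNumberFrames,
                                   AudioBufferList            *ioData)
    {
        return noErr;
    }

    - (void) setUpRemoteIO
    {
        AudioComponentDescription desc = {0};
        desc.componentType         = kAudioUnitType_Output;
        desc.componentSubType      = kAudioUnitSubType_RemoteIO;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        AudioComponent comp = AudioComponentFindNext (NULL, &desc);
        AudioUnit ioUnit;                           //keep this in an ivar in real code
        AudioComponentInstanceNew (comp, &ioUnit);

        //input (bus 1) is disabled by default on RemoteIO; switch it on
        UInt32 one = 1;
        AudioUnitSetProperty (ioUnit, kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Input, 1, &one, sizeof (one));

        //16-bit mono PCM at 44.1 kHz, matching the format in the question
        AudioStreamBasicDescription fmt = {0};
        fmt.mSampleRate       = 44100.0;
        fmt.mFormatID         = kAudioFormatLinearPCM;
        fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
        fmt.mChannelsPerFrame = 1;
        fmt.mBitsPerChannel   = 16;
        fmt.mBytesPerFrame    = fmt.mBytesPerPacket = 2;
        fmt.mFramesPerPacket  = 1;
        AudioUnitSetProperty (ioUnit, kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output, 1, &fmt, sizeof (fmt));

        //register the input callback above
        AURenderCallbackStruct cb = { InputCallback, (__bridge void *) self };
        AudioUnitSetProperty (ioUnit, kAudioOutputUnitProperty_SetInputCallback,
                              kAudioUnitScope_Global, 1, &cb, sizeof (cb));

        AudioUnitInitialize (ioUnit);
        AudioOutputUnitStart (ioUnit);
    }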

Have a look at my answer here:

iOS Stream Audio from one iOS Device to Another

But the true goldmine - and knowledge that you will need to be intimately familiar with - is Tasty Pixel's blog. Tasty Pixel is the developer of Loopy HD, and also someone kind enough to share some pretty in-depth knowledge.

See:

A simple, fast circular buffer implementation for audio processing (a usage sketch follows these links)

Developing Loopy, Part 2: Implementation

and

Using RemoteIO audio unit
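
The reason the circular buffer matters: the input callback runs on the audio thread, which must never block on file I/O or locks. So you push raw bytes into a lock-free ring on the audio thread and drain it from a normal thread. Here is a rough usage sketch against the TPCircularBuffer API from the first link above; double-check the calls against the real header, as this is from memory:

    #import "TPCircularBuffer.h"   //from the blog post's accompanying source

    static TPCircularBuffer gRing;

    //once, at setup time: room for several seconds of 16-bit mono audio
    //TPCircularBufferInit (&gRing, 512 * 1024);

    //audio thread (inside the input callback): copy captured bytes in
    static void ProduceAudio (const void *bytes, int32_t length)
    {
        int32_t space;
        void *head = TPCircularBufferHead (&gRing, &space);
        if (space >= length)
        {
            memcpy (head, bytes, length);
            TPCircularBufferProduce (&gRing, length);
        }
    }

    //normal thread: drain whatever has arrived and write it to disk
    static void ConsumeAudio (void)
    {
        int32_t available;
        void *tail = TPCircularBufferTail (&gRing, &available);
        if (available > 0)
        {
            //write 'available' bytes starting at 'tail' to the file here,
            //e.g. with AudioFileWritePackets as in the question's code
            TPCircularBufferConsume (&gRing, available);
        }
    }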

Finally, make sure you are familiar with packets, frames, samples, etc. Everything needs to sync perfectly.
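
Concretely, "everything needs to sync" means counting frames, not wall-clock seconds. With your 0.6 s beat at 44.1 kHz, the loop is exactly 0.6 × 8 × 44,100 = 211,680 frames, so stop capturing at that frame count inside the callback instead of from an NSTimer. A sketch, assuming your tempo and sample rate (CountFrames is an illustrative helper you would call from the input callback):

    #define kSampleRate     44100.0
    #define kSecondsPerBeat 0.6
    #define kBeatsToRecord  8

    //0.6 * 8 * 44100 = 211,680: the loop length in frames
    static const SInt64 kTargetFrames = (SInt64) (kSecondsPerBeat * kBeatsToRecord * kSampleRate + 0.5);

    static SInt64 sFramesRecorded = 0;

    static void CountFrames (UInt32 inNumberFrames)
    {
        SInt64 remaining = kTargetFrames - sFramesRecorded;
        if (remaining <= 0) return;                 //already have the full loop

        UInt32 keep = (UInt32) (remaining < inNumberFrames ? remaining : (SInt64) inNumberFrames);

        //keep only 'keep' frames of this buffer in the loop, then:
        sFramesRecorded += keep;
        if (sFramesRecorded >= kTargetFrames)
        {
            //the loop is now exactly 211,680 frames; signal the UI to stop
        }
    }

Do the same frame arithmetic on the playback side so the loop point lands on exactly the same sample every pass.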
