AVFoundation - Retiming CMSampleBufferRef video output

First time asking a question here. I hope the post is clear and the sample code is formatted properly.

I'm experimenting with AVFoundation and time-lapse photography.

My intent is to grab every Nth frame from the iOS device's camera (my iPod touch, version 4) and write each of those frames out to a file to create a time lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.

The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between the original input frames, i.e. a frame rate of 1fps. I'm looking to get 30fps.

I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is still active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30fps, but it only has those 13 frames.

How can I accomplish my goal of 30fps playback? How can I tell where the app is getting lost, and why?

I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all the frames I'm interested in, since the callback doesn't "get lost". When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can play back the video.

Thank you for your help.

Here is where I'm trying to do the retiming.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutpput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera). 
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried, but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20,600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer);  // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }

    //[self displayOuptutWritterStatus];    // Expect and see AVAssetWriterStatusWriting.
}

My setup method.

- (IBAction) recordingStartStop: (id) sender
{
    NSError * error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];

        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];

        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"can not add input");

        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");

        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];

        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");

    [self displayOuptutWritterStatus];  
}

2 Answers

  • 10

    OK, I found the bug in my first post.

    When using

    myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                     sampleBuffer,
                                                     1,
                                                     &sampleTimingInfo, 
                                                     &newSampleBuffer);
    

    you need to balance it with a CFRelease(newSampleBuffer);.

    The same idea applies when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance: after calling the appendPixelBuffer:withPresentationTime: method, you would use CVPixelBufferRelease(yourCVPixelBufferRef);.

    Hope this helps someone else.
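
    Putting the fix in context, here is a minimal sketch of the corrected append path (variable names reused from the question's code; illustrative, not tested):

    CMSampleBufferRef newSampleBuffer = NULL;
    OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                              sampleBuffer,
                                                              1,
                                                              &sampleTimingInfo,
                                                              &newSampleBuffer);
    if (myStatus == noErr && newSampleBuffer != NULL) {
        appendSuccessFlag = [inputWriterBuffer appendSampleBuffer: newSampleBuffer];
        CFRelease(newSampleBuffer); // Balance the Create call so the capture callback keeps firing.
    }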

  • 3

    With more searching and reading, I have a working solution. Don't know that it's the best method, but so far, so good.

    In my setup area I set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this.

    inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer
                sourcePixelBufferAttributes: nil];
    [inputWriterBufferAdaptor retain];
    

    To fully make sense of the code below, I also have these three lines in the setup method.

    fpsOutput = 30; //Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97;
    cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97.
    cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
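
    As a quick sanity check on why the large timescale matters (my own arithmetic, not part of the original answer):

    // For the 29.97 case, fpsOutput = 30/1.001, so:
    // cmTimeSecondsDenominatorTimescale = 600 * 100000 = 60,000,000
    // cmTimeNumeratorValue = 60,000,000 / (30/1.001) = 2,002,000 exactly,
    // giving a frame duration of 2,002,000/60,000,000 s = 1.001/30 s with no rounding error.
    // A timescale of 600 alone cannot represent that duration: 600 * 1.001/30 = 20.02 is not an integer.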
    

    Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Note the withPresentationTime parameter on the adaptor: by passing it my custom value, I get the correct timing I was looking for.

    CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
    imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];
    

    There may be some gains to be had from using the pixelBufferPool property of the AVAssetWriterInputPixelBufferAdaptor, but I haven't figured that out yet.
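
    For what it's worth, a rough sketch of what using the pool might look like (my own guess, not verified; the pool is only vended once writing has begun, and sourcePixelBufferAttributes above would likely need to be non-nil for the adaptor to create one):

    CVPixelBufferRef poolBuffer = NULL;
    CVReturn poolResult = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                             inputWriterBufferAdaptor.pixelBufferPool,
                                                             &poolBuffer);
    if (poolResult == kCVReturnSuccess && poolBuffer != NULL) {
        // ...render or copy the frame's pixels into poolBuffer here...
        appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: poolBuffer
                                                   withPresentationTime: imageSourceTime];
        CVPixelBufferRelease(poolBuffer); // Balance the Create, per the first answer.
    }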
