I'm having a problem with my screen recording. Right now I'm using drawViewHierarchyInRect:afterScreenUpdates: and feeding the pixel buffer to an AVAssetWriterInputPixelBufferAdaptor, which works fine, but only on the iPhone 5s/5. On the iPad and iPhone 4s this approach performs too poorly, 10-15 fps. I need at least 25-30.
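For reference, my current capture path looks roughly like this (a simplified sketch, not my exact code; the writer/adaptor setup is omitted, and names like view, adaptor and presentationTime are assumed to exist):

```objectivec
// Render the view hierarchy into a bitmap context backed by a
// CVPixelBuffer from the adaptor's pool, then append it.
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                             CVPixelBufferGetWidth(buffer),
                                             CVPixelBufferGetHeight(buffer),
                                             8, CVPixelBufferGetBytesPerRow(buffer),
                                             colorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
UIGraphicsPushContext(context);
// This call is the bottleneck on older devices:
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIGraphicsPopContext();

CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(buffer, 0);

[adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];
CVPixelBufferRelease(buffer);
```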

My current approach is the best one I've found so far. I have also tried glReadPixels and renderInContext (which doesn't work with live camera input).

So I've been searching around on Stack Overflow; I found a couple of alternatives, and I've tried most of them. The last one I found is below, but I don't know whether it's worth the time.

if ([[CCDirector sharedDirector] isPaused] || !writerInput || !writerInput.readyForMoreMediaData || !VIDEO_WRITER_IS_READY) {
    return;
}

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[[CCDirector sharedDirector] openGLView] context], NULL, &rawDataTextureCache);
if (err) {
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);

CFDictionarySetValue(attrs,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);

CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)esize.width,
                    (int)esize.height,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault,
                                              rawDataTextureCache,
                                              renderTarget,
                                              NULL, // texture attributes
                                              GL_TEXTURE_2D,
                                              GL_RGBA, // opengl format
                                              (int)esize.width,
                                              (int)esize.height,
                                              GL_BGRA, // native iOS format
                                              GL_UNSIGNED_BYTE,
                                              0,
                                              &renderTexture);
CFRelease(attrs);
CFRelease(empty);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

CVPixelBufferLockBaseAddress(renderTarget, 0);

CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
CMTime presentationTime = CMTimeMake(elapsedTime * 30, 30);


if(![adaptor appendPixelBuffer:renderTarget withPresentationTime:presentationTime]) {
    NSLog(@"Adaptor FAIL");
}

CVPixelBufferUnlockBaseAddress(renderTarget, 0);
CFRelease(renderTexture); // the texture created from the pixel buffer must be released each frame
CVPixelBufferRelease(renderTarget);

Above is the relevant code. I've been feeding my adaptor a pixel buffer, and it worked until now.

The adaptor just fails and logs "Adaptor FAIL". I don't get any other error.

I don't know if I'm completely off base trying to do this with the EAGLContext of a cocos2d app.

Thanks in advance.

*** UPDATE ***

I changed

CMTime presentationTime = CMTimeMake(elapsedTime * 30,30);

to

CMTime presentationTime = CMTimeMake(elapsedTime * 120,120);

I figured a timescale of 30 wasn't enough, since the app runs at more than 30 FPS. Because of the higher frame rate I was probably appending multiple frames with the same timestamp, which made the adaptor fail. So the adaptor has stopped failing now, but the screen still freezes. Since I know where the buttons are, I managed to stop the recording and play back the video. It works, but every other frame flashes black.