How many steps are required to record SCNView to video?

Core idea

In SCNView's render callback, use the id<SCNSceneRenderer> to get the render target texture of the Metal render pass, copy that texture into an MTLTexture created from a CVPixelBuffer, and then write the CVPixelBuffer to the video file.

Prepare the video writing tool

Writing the video is straightforward with the system's built-in AVAssetWriter, which can be created and configured with the following code:

// Create the writer for a QuickTime movie at outputURL.
NSError *error = nil;
self.videoWriter = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error];

// Configure an H.264 video input with the desired output size.
NSDictionary *writerInputParams = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecTypeH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:height], AVVideoHeightKey,
                                   AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                                   nil];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
if ([self.videoWriter canAddInput:self.assetWriterInput]) {
    [self.videoWriter addInput:self.assetWriterInput];
} else {
    // show error message
}

// The adaptor lets us append CVPixelBuffers directly to the video input.
NSDictionary *attributes = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], (NSString *)kCVPixelBufferPixelFormatTypeKey,
                            [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGImageCompatibilityKey,
                            [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey,
                            nil];
self.writerAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.assetWriterInput sourcePixelBufferAttributes:attributes];
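
Before any frames can be appended, the writer session has to be started, and it must be finished when recording stops. The original post does not show this part; below is a minimal sketch using the standard AVAssetWriter API (starting the session at kCMTimeZero is an assumption that matches presentation times measured from the start of recording):

// Start once, before appending the first frame.
if ([self.videoWriter startWriting]) {
    [self.videoWriter startSessionAtSourceTime:kCMTimeZero];
}

// When recording stops, close the input and finalize the file.
[self.assetWriterInput markAsFinished];
[self.videoWriter finishWritingWithCompletionHandler:^{
    // The movie at outputURL is now complete.
}];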

A CVPixelBuffer can then be written to the video file through the following API:

[self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];

Prepare a CVPixelBuffer to receive the image data

Create a CVPixelBuffer, and create an MTLTexture from it that will receive the copy of SCNView's render target texture:

int imageWidth = size.width;
int imageHeight = size.height;

// IOSurface backing is required so the pixel buffer can be shared with Metal.
NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferCreate(kCFAllocatorDefault,
                    imageWidth,
                    imageHeight,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)pixelAttributes,
                    &_renderTargetPixelBuffer);

// Texture cache that bridges CVPixelBuffers and Metal textures.
CVReturn ret = CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, _mtlDevice, nil, &_textureCache);

// Wrap the pixel buffer in a Metal texture; whatever the GPU writes into the
// texture lands in the pixel buffer.
CVMetalTextureRef renderTargetMetalTextureRef;
ret = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, _renderTargetPixelBuffer, nil, MTLPixelFormatBGRA8Unorm, imageWidth, imageHeight, 0, &renderTargetMetalTextureRef);

// This texture is what _renderTargetTexture refers to in the blit pass below.
id<MTLTexture> mtlTexture = CVMetalTextureGetTexture(renderTargetMetalTextureRef);
CFRelease(renderTargetMetalTextureRef);

Copy the rendering result of SCNView to CVPixelBuffer

We need to copy the rendering result inside SCNView's delegate callback - (void)renderer:(id<SCNSceneRenderer>)renderer didRenderScene:(SCNScene *)scene atTime:(NSTimeInterval)time. To make SCNView's render target texture copyable, the following settings are needed:

CAMetalLayer *metalLayer = (CAMetalLayer *)self.sceneView.layer;
[metalLayer setAllowsNextDrawableTimeout:NO];
metalLayer.framebufferOnly = NO; // allow the drawable's texture to be used as a blit source
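
For renderer:didRenderScene:atTime: to be called at all, the recording object must also be set as the view's delegate. A one-line sketch of that setup (where it lives, e.g. viewDidLoad, is an assumption; the class is assumed to adopt SCNSceneRendererDelegate):

// Hypothetical setup location, e.g. viewDidLoad.
self.sceneView.delegate = self; // delivers renderer:didRenderScene:atTime: to self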

In the callback, we use an MTLBlitCommandEncoder to copy the render target texture into our own texture:

// Grab the color attachment SceneKit just rendered into.
MTLRenderPassDescriptor *renderPassDesc = renderer.currentRenderPassDescriptor;
id<MTLTexture> readTexture = renderPassDesc.colorAttachments[0].texture;

// Recreate our CVPixelBuffer-backed texture if the drawable size changed.
if (readTexture.width != _renderTargetTexture.width || readTexture.height != _renderTargetTexture.height) {
    _renderTargetTexture = [self createRenderTargetTexture:CGSizeMake(readTexture.width, readTexture.height)];
}

// Encode the copy on SCNView's own command queue so it runs after the scene render.
id<MTLCommandBuffer> blitCommandBuffer = [renderer.commandQueue commandBuffer];
id<MTLBlitCommandEncoder> blitEncoder = [blitCommandBuffer blitCommandEncoder];
[blitEncoder copyFromTexture:readTexture toTexture:_renderTargetTexture];
[blitEncoder endEncoding];

// Once the GPU copy has completed, hand the pixel buffer to the video writer.
[blitCommandBuffer addCompletedHandler:^(id<MTLCommandBuffer> _Nonnull buffer) {
    [self updateWithFrame:self->_renderTargetPixelBuffer];
}];
[blitCommandBuffer commit];

The blitCommandBuffer comes from SCNView's commandQueue, so the texture copy is executed after SCNView has finished rendering, ensuring that the copied texture is a complete frame. Note that you must not block the thread by calling waitUntilCompleted on the blitCommandBuffer, as that leads to deadlocks. Finally, updateWithFrame: writes the CVPixelBuffer to the AVAssetWriter.
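
For completeness, here is a minimal sketch of what updateWithFrame: might look like (assumed, not from the original post; _recordingStartTime is a hypothetical ivar captured when recording starts, and the 600 timescale is an arbitrary but common choice):

- (void)updateWithFrame:(CVPixelBufferRef)pixelBuffer {
    // Drop the frame instead of stalling the Metal completion handler.
    if (pixelBuffer == NULL || !self.assetWriterInput.isReadyForMoreMediaData) {
        return;
    }
    // Presentation time measured from the start of recording, matching startSessionAtSourceTime:kCMTimeZero.
    NSTimeInterval elapsed = CACurrentMediaTime() - _recordingStartTime;
    CMTime presentationTime = CMTimeMakeWithSeconds(elapsed, 600);
    if (![self.writerAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime]) {
        NSLog(@"appendPixelBuffer failed: %@", self.videoWriter.error);
    }
}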


Origin juejin.im/post/7119884702999117854