Recording short videos on iOS

  • Requirement

    Our project uses hybrid development, and recording short videos from the uni-app side gives poor results. To reproduce the WeChat recording experience, we built a native plug-in.

  • Approach

    Step 1: One AVCaptureSession and one AVCaptureVideoPreviewLayer (consider a compatible replacement such as AVPreView).

    Step 2: Recording needs both video and audio, so create the corresponding AVCaptureDeviceInput for each, plus an AVCaptureVideoDataOutput and an AVCaptureAudioDataOutput.

    Step 3: In the output delegate callbacks, distinguish video from audio and write the corresponding CMSampleBufferRef into the video file.

    Step 4: To write the file, use an AVAssetWriter; video and audio each need their own AVAssetWriterInput, which are added to the AVAssetWriter.

    Step 5: As CMSampleBufferRefs keep arriving, the AVAssetWriter keeps writing until recording stops.

  • Implementation

    I'll skip the Step 1 initialization code here; if you're curious, see my earlier blog post.
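    For completeness, here is a minimal sketch of that Step 1 setup. The method and property names (`setupSession`, `previewLayer`) are illustrative assumptions, not taken from the original post:

    ```objc
    // Minimal Step 1 sketch: create the session and attach a preview layer.
    - (void)setupSession {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPresetHigh;

        // Preview layer shows the camera feed while recording
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        self.previewLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:self.previewLayer];
    }
    ```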

    Step 2: Create the two AVCaptureDeviceInputs and the two outputs, and set each output's delegate.

    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"Failed to create videoInput from device, error: %@", error);
        return;
    }
    
    // Add the input to the session
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.videoOutput]) {
        [self.session addOutput:self.videoOutput];
    }
    
    // Audio input and output
    AVCaptureDevice *adevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:adevice error:&error];
    
    if ([self.session canAddInput:self.audioInput]) {
        [self.session addInput:self.audioInput];
    }
    
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.audioOutput]) {
        [self.session addOutput:self.audioOutput];
    }
    
    // Video output
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
        }
        return _videoOutput;
    }
    
    // Audio output
    - (AVCaptureAudioDataOutput *)audioOutput {
        if (!_audioOutput) {
            _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        }
        return _audioOutput;
    }
    

    Step 3: Start the session and handle the CMSampleBufferRefs in the delegate callbacks.
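    The call that starts the session isn't shown in the original; a one-line sketch (note that -startRunning blocks, so Apple recommends calling it off the main thread):

    ```objc
    // Start the capture session on a background queue; -startRunning is blocking.
    dispatch_async(self.videoQueue, ^{
        [self.session startRunning];
    });
    ```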

    #pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            // Video
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
                if (!self.manager.outputVideoFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputVideoFormatDescription = formatDescription;
                    }
                } else {
                    @synchronized(self) {
                        if (self.manager.state == StateRecording) {
                            [self.manager appendBuffer:sampleBuffer type:AVMediaTypeVideo];
                        }
                    }
                }
            }
    
            // Audio
            if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
                if (!self.manager.outputAudioFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputAudioFormatDescription = formatDescription;
                    }
                }
                @synchronized(self) {
                    if (self.manager.state == StateRecording) {
                        [self.manager appendBuffer:sampleBuffer type:AVMediaTypeAudio];
                    }
                }
            }
        }
    }

    Step 4: The AVAssetWriter and its corresponding inputs.

    // Initialize the writer
    self.writer = [AVAssetWriter assetWriterWithURL:_videoUrl fileType:AVFileTypeMPEG4 error:nil];
    
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:_videoSettings];
    // expectsMediaDataInRealTime must be YES when data arrives live from a capture session
    _videoInput.expectsMediaDataInRealTime = YES;
    
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:_audioSettings];
    _audioInput.expectsMediaDataInRealTime = YES;
    
    if ([_writer canAddInput:_videoInput]) {
        [_writer addInput:_videoInput];
    }
    if ([_writer canAddInput:_audioInput]) {
        [_writer addInput:_audioInput];
    }
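    The `_videoSettings` / `_audioSettings` dictionaries are not shown above. One plausible configuration is sketched below; the dimensions and bit rates are example values I'm assuming, to be tuned per project (see the closing notes):

    ```objc
    // Example output settings; the concrete values are illustrative assumptions.
    _videoSettings = @{
        AVVideoCodecKey: AVVideoCodecTypeH264,
        AVVideoWidthKey: @720,
        AVVideoHeightKey: @1280,
        AVVideoCompressionPropertiesKey: @{
            AVVideoAverageBitRateKey: @(2000 * 1024),   // ~2 Mbps
            AVVideoExpectedSourceFrameRateKey: @30
        }
    };

    _audioSettings = @{
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @1,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @(64 * 1024)
    };
    ```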

    Step 5: The CMSampleBufferRefs from Step 3 are written into the video file through the AVAssetWriter.

    - (void)appendBuffer:(CMSampleBufferRef)buffer type:(NSString *)mediaType {
        if (buffer == NULL) {
            NSLog(@"empty sampleBuffer");
            return;
        }
    
        @synchronized (self) {
            if (self.state < StateRecording) {
                NSLog(@"not ready yet");
                return;
            }
        }
    
        CFRetain(buffer);
        dispatch_async(self.queue, ^{
            @autoreleasepool {
                @synchronized (self) {
                    if (self.state > StateFinish) {
                        CFRelease(buffer);
                        return;
                    }
                }
    
                if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo]) {
                    [self.writer startWriting];
                    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(buffer)];
                    self.canWrite = YES;
                }
    
                if(!self.timer) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        self.timer = [NSTimer scheduledTimerWithTimeInterval:TIMER_INTERVAL target:self selector:@selector(updateProgress) userInfo:nil repeats:YES];
                        [[NSRunLoop currentRunLoop] addTimer:self.timer forMode:NSDefaultRunLoopMode];
                    });
                }
    
                // Append video data
                if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                    if (self.videoInput.readyForMoreMediaData) {
                        BOOL success = [self.videoInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
    
                // Append audio data
                if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                    if (self.audioInput.readyForMoreMediaData) {
                        BOOL success = [self.audioInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                CFRelease(buffer);
            }
        });
    }
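    The stop path isn't shown in the original. A minimal sketch of finishing the writer; the `stop:` name comes from the code above, but this body is my assumption:

    ```objc
    // Mark both inputs finished, then let the writer finalize the file.
    - (void)stop:(void (^)(void))completion {
        @synchronized (self) {
            self.state = StateFinish; // assumed enum value from the code above
        }
        [self.videoInput markAsFinished];
        [self.audioInput markAsFinished];
        [self.writer finishWritingWithCompletionHandler:^{
            if (completion) completion();
        }];
    }
    ```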
    
  • Closing notes:

    1. When configuring the video settings for AVAssetWriterInput, tailor them to your needs. The bit rate and frame rate affect both the quality and the size of the recorded video; choose values based on your project's requirements.

    2. If the video orientation comes out wrong, you can adjust it in three places:

      1. videoOrientation on the preview layer's connection

      2. videoOrientation on the AVCaptureOutput's connection

      3. The transform on the AVAssetWriterInput, e.g. rotating by M_PI/2
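      Sketches of those three adjustments (the transform route rotates the whole written track, while the connection routes affect the captured frames):

      ```objc
      // 1. Orientation on the preview layer's connection
      self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;

      // 2. Orientation on the video output's connection
      AVCaptureConnection *conn = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
      if (conn.isVideoOrientationSupported) {
          conn.videoOrientation = AVCaptureVideoOrientationPortrait;
      }

      // 3. Transform on the writer input (rotate 90 degrees)
      _videoInput.transform = CGAffineTransformMakeRotation(M_PI_2);
      ```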

Origin blog.51cto.com/15070994/2635825