Exploring custom picture-in-picture playback with ijkplayer on iOS

iOS provides AVPictureInPictureController for picture-in-picture playback, but it can only be bound to an AVPlayer, which is a headache for developers of custom players. Starting with iOS 15.0, it also supports an AVSampleBufferDisplayLayer as a custom data-source display layer, which means that third-party players can implement picture-in-picture. Taking ijkplayer as an example, let's explore how to support picture-in-picture playback.

Table of contents

1. Determine picture-in-picture support
2. Background playback configuration
    1. Configure AudioSession
    2. Configure background mode
3. Picture-in-picture life cycle
    1. Enter picture-in-picture from full screen
    2. Exit picture-in-picture and return to full screen
    3. The complete life cycle
4. PiP creation process
    1. Initialization
    2. Provide start and stop methods
    3. Implement the picture-in-picture delegate methods
    4. Update the buffer layer
    5. Update the playback state
    6. Pitfall avoidance guide


1. Determine picture-in-picture support

We can use AVPictureInPictureController's isPictureInPictureSupported() to determine whether the current device supports picture-in-picture.
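As a minimal sketch (the wrapper method name here is ours, not from the original code), the check can be expressed as:

```objectivec
#import <AVKit/AVKit.h>

// Gate the PiP feature on device support before creating any of the
// layers and controllers described below.
- (BOOL)canUsePictureInPicture {
    return [AVPictureInPictureController isPictureInPictureSupported];
}
```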

2. Background playback configuration

1. Configure AudioSession

When the application launches, configure the AudioSession for the playback category. The code is as follows:

func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Use the playback category so audio keeps playing in the background.
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playback)
    } catch {
        print("Setting category to AVAudioSessionCategoryPlayback failed.")
    }
    return true
}

2. Configure background mode

In the Background Modes capability, check the "Audio, AirPlay, and Picture in Picture" option, as shown below:
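Checking that box adds an `audio` entry to the app's `UIBackgroundModes`; the equivalent Info.plist fragment (what Xcode writes for this capability) looks like:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```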

3. Picture-in-picture life cycle

1. Enter picture-in-picture from full screen

When picture-in-picture starts, the willStartPictureInPicture() method executes first, before the animation begins. After initialization completes, the didStartPictureInPicture() method executes, completing the switch from full screen to picture-in-picture as the app enters the background.

2. Exit picture-in-picture and return to full screen

When exiting picture-in-picture, the app's UI is restored from the background state and the willStopPictureInPicture() method executes. Once the exit completes, the didStopPictureInPicture() method executes and playback returns to full screen.

3. The complete life cycle

The complete picture-in-picture life cycle runs from full screen into picture-in-picture and finally back out to full screen, that is, the two sequences above combined, as shown in the following figure:

4. PiP creation process

1. Initialization

Initialize the AVSampleBufferDisplayLayer, setting its frame, opaque, position, videoGravity, and controlTimebase. The sample code is as follows:

- (AVSampleBufferDisplayLayer *)createDisplayLayer
{
    AVSampleBufferDisplayLayer *layer = [[AVSampleBufferDisplayLayer alloc] init];
    layer.frame = self.bounds;
    layer.opaque = YES;
    layer.position = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    layer.videoGravity = AVLayerVideoGravityResizeAspect;
    
    // The control timebase drives the PiP window's progress display.
    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithSourceClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);
    layer.controlTimebase = controlTimebase;
    
    return layer;
}

Use the AVSampleBufferDisplayLayer to initialize the ContentSource, first checking whether the device supports picture-in-picture. The sample code is as follows:

- (BOOL)initPictureInPicture:(AVSampleBufferDisplayLayer *)displayLayer {
    if ([AVPictureInPictureController isPictureInPictureSupported] == NO) {
        NSLog(@"Sorry, don't support PictureInPicture mode!");
        return NO;
    }
    
    if (@available(iOS 15.0, *)) {
        _displayLayer  = displayLayer;
        _contentSource = [[AVPictureInPictureControllerContentSource alloc]
                            initWithSampleBufferDisplayLayer:displayLayer playbackDelegate:self];
        
        return YES;
    } else {
        return NO;
    }
}

2. Provide start and stop methods

Start picture-in-picture: initialize the AVPictureInPictureController, set its delegate, set the playback rate, and start it. The sample code is as follows:

- (void)startPictureInPicture {
    if (@available(iOS 15.0, *)) {
        if (_pipController != nil) {
            return;
        }
        _pipController = [[AVPictureInPictureController alloc] initWithContentSource:_contentSource];
        _pipController.delegate = self;
        CMTimebaseSetRate(_displayLayer.controlTimebase, 1);
        CMTimebaseSetTime(_displayLayer.controlTimebase, CMTimeMake([_delegate getCurrentTime], 1));
        // Start with a delay, otherwise picture-in-picture may fail to start
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(100 * NSEC_PER_MSEC)), dispatch_get_main_queue(), ^{
            if ([self->_pipController isPictureInPicturePossible]) {
                [self->_pipController startPictureInPicture];
                NSLog(@"startPictureInPicture...");
            }
        });
    }
}

End picture-in-picture. The sample code is as follows:

- (void)stopPictureInPicture {
    if (_pipController != nil && [_pipController isPictureInPictureActive]) {
        [_pipController stopPictureInPicture];
    }
}

3. Implement the picture-in-picture delegate methods

Implement the AVPictureInPictureControllerDelegate methods as follows:

- (void)pictureInPictureControllerWillStartPictureInPicture:(AVPictureInPictureController *)pictureInPictureController {
    NSLog(@"pictureInPictureControllerWillStart...");
}

- (void)pictureInPictureControllerDidStartPictureInPicture:(AVPictureInPictureController *)pictureInPictureController {
    // Notify the delegate that picture-in-picture has started
    [_delegate onStartPictureInPicture:nil];
}

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
                                    failedToStartPictureInPictureWithError:(NSError *)error {
    // Picture-in-picture failed to start
    [_delegate onStartPictureInPicture:error];
}

- (void)pictureInPictureControllerWillStopPictureInPicture:(AVPictureInPictureController *)pictureInPictureController {
    NSLog(@"pictureInPictureControllerWillStop...");
}

- (void)pictureInPictureControllerDidStopPictureInPicture:(AVPictureInPictureController *)pictureInPictureController {
    [self stopPictureInPicture];
    // Notify the delegate that picture-in-picture has ended
    [_delegate onStopPictureInPicture];
}

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
    restoreUserInterfaceForPictureInPictureStopWithCompletionHandler:(void (^)(BOOL restored))completionHandler {
    // Restore the full-screen UI here, then signal completion
    completionHandler(true);
}

4. Update the buffer layer

After ijkplayer decodes a frame, it is wrapped in a CVPixelBuffer, then in a CMSampleBufferRef, and enqueued onto the displayLayer. The sample code is as follows:

-(void) enqueueBuffer:(CVPixelBufferRef)pixelBuffer
{
    if (!_displayLayer || !pixelBuffer)
        return;
    @autoreleasepool {
        CMSampleBufferRef sampleBuffer = nil;
        CMVideoFormatDescriptionRef format = nil;
        // Set the dts and pts
        CMSampleTimingInfo timeInfo = {.presentationTimeStamp = kCMTimeInvalid, .decodeTimeStamp = kCMTimeInvalid};
        OSStatus status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &format);
        status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL,
                                                    format, &timeInfo, &sampleBuffer);
        if (format != nil) {
            CFRelease(format);
        }
        if (sampleBuffer == nil || status != noErr) {
            NSLog(@"sampleBuffer error!");
            return;
        }
        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        // Render the sample immediately
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
        
        [_displayLayer enqueueSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
        if (_displayLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
            NSLog(@"displayLayer error=%@", [_displayLayer error]);
        }
    }
}

5. Update the playback state

Implement the AVPictureInPictureSampleBufferPlaybackDelegate methods to update the playback state and seek progress. The sample code is as follows:

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController setPlaying:(BOOL)playing {
    _isPlaybackPaused = !playing;
    // Forward play/pause to the player
    [_delegate setPlaying:playing];
    if (@available(iOS 15.0, *)) {
        [pictureInPictureController invalidatePlaybackState];
    }
    if (playing == NO) {
        _playClickTime = [[NSDate date] timeIntervalSince1970] * 1000;
    }
}

- (CMTimeRange)pictureInPictureControllerTimeRangeForPlayback:(AVPictureInPictureController *)pictureInPictureController {
    // Compute the time range: 0 to duration
    NSTimeInterval durationTime = [_delegate getDurationTime];
    return CMTimeRangeMake(kCMTimeZero, CMTimeMake(durationTime * 1000, 1000));
}

- (BOOL)pictureInPictureControllerIsPlaybackPaused:(AVPictureInPictureController *)pictureInPictureController {
    return _isPlaybackPaused;
}

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
         didTransitionToRenderSize:(CMVideoDimensions)newRenderSize {

}

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController skipByInterval:(CMTime)skipInterval
                 completionHandler:(void (^)(void))completionHandler {
    // Forward the seek to the player
    [_delegate seekTo:CMTimeGetSeconds(skipInterval)];
    if (@available(iOS 15.0, *)) {
        // Seeking pauses playback; restore the playing state once the seek succeeds
        if (([[NSDate date] timeIntervalSince1970] * 1000 - _playClickTime) < 200) {
            [_delegate setPlaying:YES];
        }
        // Update the displayLayer's controlTimebase clock so the progress updates
        CMTimebaseSetTime(_displayLayer.controlTimebase, CMTimeMake([_delegate getCurrentTime] * 1000, 1000));
        // Refresh the playback state
        [_pipController invalidatePlaybackState];
    }
}

ijkplayer provides interfaces such as play, seek, getCurrentPosition, getDuration, and isPlaying; the custom picture-in-picture controller calls them through wrappers like these:

- (BOOL)isPlaybackPaused
{
    return [self isPlaying] == NO;
}

- (void)setPlaying:(BOOL)playing
{
    if (playing) {
        [self play];
    } else {
        [self pause];
    }
}

- (void)seekTo:(NSTimeInterval)relativePosition
{
    NSTimeInterval position = [self currentPlaybackTime] + relativePosition;
    [self setCurrentPlaybackTime:position];
}

- (NSTimeInterval)getCurrentTime
{
    return [self currentPlaybackTime];
}

- (NSTimeInterval)getDurationTime
{
    return [self duration];
}

6. Pitfall avoidance guide

(1) Delayed start: starting the AVPictureInPictureController needs a delay of roughly 100 ms, otherwise picture-in-picture may fail to start.

(2) Set the clock: you need to set the dts and pts on each sample buffer, and enable immediate rendering.

(3) Update the playback state: playback pauses when a seek starts, and the playing state must be restored once the seek succeeds.


Source: blog.csdn.net/u011686167/article/details/131023251