WebRTC audio and video call - WebRTC video custom RTCVideoCapturer camera

Previously, WebRTC was used together with the ossrs service to implement live video calls. In practice, however, the APIs exposed by the RTCCameraVideoCapturer class do not allow adjusting camera settings such as exposure and lighting, so we need a custom RTCVideoCapturer subclass that captures frames itself.


For how WebRTC on iOS calls ossrs to implement the live video call function, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

Here we customize RTCVideoCapturer.

1. Several classes needed for custom cameras

We need to know a few AVFoundation classes:

  • AVCaptureSession
    AVCaptureSession is an object provided by iOS to manage and coordinate the data flow between the input device and the output device.

  • AVCaptureDevice
    AVCaptureDevice refers to the hardware device.

  • AVCaptureDeviceInput
    AVCaptureDeviceInput provides the input data captured from an AVCaptureDevice.

  • AVCaptureMetadataOutput
    AVCaptureMetadataOutput processes timed metadata objects (such as detected faces) produced by the AVCaptureSession.

  • AVCaptureVideoDataOutput
    AVCaptureVideoDataOutput is used to process video data output.

  • AVCaptureVideoPreviewLayer
    AVCaptureVideoPreviewLayer is a layer that displays the video being captured by the camera.
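The classes above can be wired together as follows. This is only a minimal sketch of the capture pipeline (permission checks, error handling, and session configuration are omitted); it is not part of the capturer implemented below.

```objectivec
#import <AVFoundation/AVFoundation.h>

// Session coordinates the flow of data from inputs to outputs.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Device (hardware camera) -> device input.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

// Video data output delivers raw frames to a sample buffer delegate.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// Preview layer displays the captured video; add it to a view's layer tree.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[session startRunning];
```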

2. Custom RTCVideoCapturer camera capture

To customize the camera, we add an AVCaptureVideoDataOutput to the AVCaptureSession:

        self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
        self.dataOutput.videoSettings = @{
            (id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)
        };
        [self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];

        if ([self.session canAddOutput:self.dataOutput]) {
            [self.session addOutput:self.dataOutput];
        } else {
            NSLog(@"Could not add video data output to the session");
            return nil;
        }

We also need to implement the AVCaptureVideoDataOutputSampleBufferDelegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

In that callback, extract the CVPixelBufferRef from the CMSampleBufferRef, wrap the processed CVPixelBufferRef in an RTCVideoFrame, and pass the frame to the didCaptureVideoFrame method of WebRTC's localVideoSource.
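That conversion step can be sketched as below. The `self.localVideoSource` (an RTCVideoSource, as in the ossrs call demo linked above) and `self.capturer` names are assumptions for illustration; the rotation is hard-coded to portrait here.

```objectivec
#import <WebRTC/WebRTC.h>

// Sketch: turn a CMSampleBufferRef into an RTCVideoFrame and hand it to WebRTC.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }
    // Wrap the pixel buffer in WebRTC's buffer type.
    RTCCVPixelBuffer *rtcPixelBuffer =
        [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    // Timestamp in nanoseconds, taken from the sample buffer's presentation time.
    int64_t timeStampNs =
        CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    RTCVideoFrame *videoFrame =
        [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                     rotation:RTCVideoRotation_90  // portrait
                                  timeStampNs:timeStampNs];
    // Feed the frame into the WebRTC pipeline.
    [self.localVideoSource capturer:self.capturer didCaptureVideoFrame:videoFrame];
}
```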

The complete code is as follows

SDCustomRTCCameraCapturer.h

#import <AVFoundation/AVFoundation.h>
#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@protocol SDCustomRTCCameraCapturerDelegate <NSObject>

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

@end

@interface SDCustomRTCCameraCapturer : RTCVideoCapturer

@property(nonatomic, weak) id<SDCustomRTCCameraCapturerDelegate> delegate;

// Capture session that is used for capturing. Valid from initialization to dealloc.
@property(nonatomic, strong) AVCaptureSession *captureSession;

// Returns list of available capture devices that support video capture.
+ (NSArray<AVCaptureDevice *> *)captureDevices;
// Returns list of formats that are supported by this class for this device.
+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device;

// Returns the most efficient supported output pixel format for this capturer.
- (FourCharCode)preferredOutputPixelFormat;

// Starts the capture session asynchronously and notifies callback on completion.
// The device will capture video in the format given in the `format` parameter. If the pixel format
// in `format` is supported by the WebRTC pipeline, the same pixel format will be used for the
// output. Otherwise, the format returned by `preferredOutputPixelFormat` will be used.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler;
// Stops the capture session asynchronously and notifies callback on completion.
- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler;

// Starts the capture session asynchronously.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps;
// Stops the capture session asynchronously.
- (void)stopCapture;

#pragma mark - Custom camera
@property (nonatomic, readonly) dispatch_queue_t bufferQueue;

@property (nonatomic, assign) AVCaptureDevicePosition devicePosition; // default AVCaptureDevicePositionFront

@property (nonatomic, assign) AVCaptureVideoOrientation videoOrientation;

@property (nonatomic, assign) BOOL needVideoMirrored;

@property (nonatomic, strong , readonly) AVCaptureConnection *videoConnection;

@property (nonatomic, copy) NSString *sessionPreset;  // default 640x480

@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;

@property (nonatomic, assign) BOOL bSessionPause;

@property (nonatomic, assign) int iExpectedFPS;

@property (nonatomic, readwrite, strong) NSDictionary *videoCompressingSettings;


- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput;

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame;

- (void)setISOValue:(float)value;

- (void)startRunning;

- (void)stopRunning;

- (void)rotateCamera;

- (void)rotateCamera:(BOOL)isUseFrontCamera;

- (void)setWhiteBalance;

- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit;

@end

SDCustomRTCCameraCapturer.m

#import "SDCustomRTCCameraCapturer.h"
#import <UIKit/UIKit.h>
#import "EFMachineVersion.h"

//#import "base/RTCLogging.h"
//#import "base/RTCVideoFrameBuffer.h"
//#import "components/video_frame_buffer/RTCCVPixelBuffer.h"
//
//#if TARGET_OS_IPHONE
//#import "helpers/UIDevice+RTCDevice.h"
//#endif
//
//#import "helpers/AVCaptureSession+DevicePosition.h"
//#import "helpers/RTCDispatcher+Private.h"

typedef NS_ENUM(NSUInteger, STExposureModel) {
    STExposureModelPositive5,
    STExposureModelPositive4,
    STExposureModelPositive3,
    STExposureModelPositive2,
    STExposureModelPositive1,
    STExposureModel0,
    STExposureModelNegative1,
    STExposureModelNegative2,
    STExposureModelNegative3,
    STExposureModelNegative4,
    STExposureModelNegative5,
    STExposureModelNegative6,
    STExposureModelNegative7,
    STExposureModelNegative8,
};

static char * kEffectsCamera = "EffectsCamera";

static STExposureModel currentExposureMode;

@interface SDCustomRTCCameraCapturer ()<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, AVCaptureMetadataOutputObjectsDelegate>
@property(nonatomic, readonly) dispatch_queue_t frameQueue;
@property(nonatomic, strong) AVCaptureDevice *currentDevice;
@property(nonatomic, assign) BOOL hasRetriedOnFatalError;
@property(nonatomic, assign) BOOL isRunning;
// Will the session be running once all asynchronous operations have been completed?
@property(nonatomic, assign) BOOL willBeRunning;

@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *dataOutput;
@property (nonatomic, strong) AVCaptureMetadataOutput *metaOutput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;

@property (nonatomic, readwrite) dispatch_queue_t bufferQueue;

@property (nonatomic, strong, readwrite) AVCaptureConnection *videoConnection;

@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureSession *session;

@end

@implementation SDCustomRTCCameraCapturer {
  AVCaptureVideoDataOutput *_videoDataOutput;
  AVCaptureSession *_captureSession;
  FourCharCode _preferredOutputPixelFormat;
  FourCharCode _outputPixelFormat;
  RTCVideoRotation _rotation;
  float _autoISOValue;
#if TARGET_OS_IPHONE
  UIDeviceOrientation _orientation;
  BOOL _generatingOrientationNotifications;
#endif
}

@synthesize frameQueue = _frameQueue;
@synthesize captureSession = _captureSession;
@synthesize currentDevice = _currentDevice;
@synthesize hasRetriedOnFatalError = _hasRetriedOnFatalError;
@synthesize isRunning = _isRunning;
@synthesize willBeRunning = _willBeRunning;

- (instancetype)init {
  return [self initWithDelegate:nil captureSession:[[AVCaptureSession alloc] init]];
}

- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate {
  return [self initWithDelegate:delegate captureSession:[[AVCaptureSession alloc] init]];
}

// This initializer is used for testing.
- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate
                  captureSession:(AVCaptureSession *)captureSession {
  if (self = [super initWithDelegate:delegate]) {
    // Create the capture session and all relevant inputs and outputs. We need
    // to do this in init because the application may want the capture session
    // before we start the capturer for e.g. AVCapturePreviewLayer. All objects
    // created here are retained until dealloc and never recreated.
    if (![self setupCaptureSession:captureSession]) {
      return nil;
    }
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
#if TARGET_OS_IPHONE
    _orientation = UIDeviceOrientationPortrait;
    _rotation = RTCVideoRotation_90;
    [center addObserver:self
               selector:@selector(deviceOrientationDidChange:)
                   name:UIDeviceOrientationDidChangeNotification
                 object:nil];
    [center addObserver:self
               selector:@selector(handleCaptureSessionInterruption:)
                   name:AVCaptureSessionWasInterruptedNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionInterruptionEnded:)
                   name:AVCaptureSessionInterruptionEndedNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleApplicationDidBecomeActive:)
                   name:UIApplicationDidBecomeActiveNotification
                 object:[UIApplication sharedApplication]];
#endif
    [center addObserver:self
               selector:@selector(handleCaptureSessionRuntimeError:)
                   name:AVCaptureSessionRuntimeErrorNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionDidStartRunning:)
                   name:AVCaptureSessionDidStartRunningNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionDidStopRunning:)
                   name:AVCaptureSessionDidStopRunningNotification
                 object:_captureSession];
  }
  return self;
}

//- (void)dealloc {
//  NSAssert(
//      !_willBeRunning,
//      @"Session was still running in RTCCameraVideoCapturer dealloc. Forgot to call stopCapture?");
//  [[NSNotificationCenter defaultCenter] removeObserver:self];
//}

+ (NSArray<AVCaptureDevice *> *)captureDevices {
#if defined(WEBRTC_IOS) && defined(__IPHONE_10_0) && \
    __IPHONE_OS_VERSION_MIN_REQUIRED >= __IPHONE_10_0
  AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySession
      discoverySessionWithDeviceTypes:@[ AVCaptureDeviceTypeBuiltInWideAngleCamera ]
                            mediaType:AVMediaTypeVideo
                             position:AVCaptureDevicePositionUnspecified];
  return session.devices;
#else
  return [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#endif
}

+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device {
  // Support opening the device in any format. We make sure it's converted to a format we
  // can handle, if needed, in the method `-setupVideoDataOutput`.
  return device.formats;
}

- (FourCharCode)preferredOutputPixelFormat {
  return _preferredOutputPixelFormat;
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps {
  [self startCaptureWithDevice:device format:format fps:fps completionHandler:nil];
}

- (void)stopCapture {
  [self stopCaptureWithCompletionHandler:nil];
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler {
  _willBeRunning = YES;
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      RTCLogInfo("startCaptureWithDevice %@ @ %ld fps", format, (long)fps);

#if TARGET_OS_IPHONE
                      dispatch_async(dispatch_get_main_queue(), ^{
                        if (!self->_generatingOrientationNotifications) {
                          [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
                          self->_generatingOrientationNotifications = YES;
                        }
                      });
#endif

                      self.currentDevice = device;

                      NSError *error = nil;
                      if (![self.currentDevice lockForConfiguration:&error]) {
                        RTCLogError(@"Failed to lock device %@. Error: %@",
                                    self.currentDevice,
                                    error.userInfo);
                        if (completionHandler) {
                          completionHandler(error);
                        }
                        self.willBeRunning = NO;
                        return;
                      }
                      [self reconfigureCaptureSessionInput];
                      [self updateOrientation];
                      [self updateDeviceCaptureFormat:format fps:fps];
                      [self updateVideoDataOutputPixelFormat:format];
                      [self.captureSession startRunning];
                      [self.currentDevice unlockForConfiguration];
                      self.isRunning = YES;
                      if (completionHandler) {
                        completionHandler(nil);
                      }
                    }];
}

- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler {
  _willBeRunning = NO;
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      RTCLogInfo("Stop");
                      self.currentDevice = nil;
                      for (AVCaptureDeviceInput *oldInput in [self.captureSession.inputs copy]) {
                        [self.captureSession removeInput:oldInput];
                      }
                      [self.captureSession stopRunning];

#if TARGET_OS_IPHONE
                      dispatch_async(dispatch_get_main_queue(), ^{
                        if (self->_generatingOrientationNotifications) {
                          [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
                          self->_generatingOrientationNotifications = NO;
                        }
                      });
#endif
                      self.isRunning = NO;
                      if (completionHandler) {
                        completionHandler();
                      }
                    }];
}

#pragma mark iOS notifications

#if TARGET_OS_IPHONE
- (void)deviceOrientationDidChange:(NSNotification *)notification {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 [self updateOrientation];
                               }];
}
#endif

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

//- (void)captureOutput:(AVCaptureOutput *)captureOutput
//    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
//           fromConnection:(AVCaptureConnection *)connection {
//  NSParameterAssert(captureOutput == _videoDataOutput);
//
//  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
//      !CMSampleBufferDataIsReady(sampleBuffer)) {
//    return;
//  }
//
//  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//  if (pixelBuffer == nil) {
//    return;
//  }
//
//#if TARGET_OS_IPHONE
//  // Default to portrait orientation on iPhone.
//  BOOL usingFrontCamera = NO;
//  // Check the image's EXIF for the camera the image came from as the image could have been
//  // delayed as we set alwaysDiscardsLateVideoFrames to NO.
//  AVCaptureDeviceInput *deviceInput =
//      (AVCaptureDeviceInput *)((AVCaptureInputPort *)connection.inputPorts.firstObject).input;
//  usingFrontCamera = AVCaptureDevicePositionFront == deviceInput.device.position;
//
//  switch (_orientation) {
//    case UIDeviceOrientationPortrait:
//      _rotation = RTCVideoRotation_90;
//      break;
//    case UIDeviceOrientationPortraitUpsideDown:
//      _rotation = RTCVideoRotation_270;
//      break;
//    case UIDeviceOrientationLandscapeLeft:
//      _rotation = usingFrontCamera ? RTCVideoRotation_180 : RTCVideoRotation_0;
//      break;
//    case UIDeviceOrientationLandscapeRight:
//      _rotation = usingFrontCamera ? RTCVideoRotation_0 : RTCVideoRotation_180;
//      break;
//    case UIDeviceOrientationFaceUp:
//    case UIDeviceOrientationFaceDown:
//    case UIDeviceOrientationUnknown:
//      // Ignore.
//      break;
//  }
//#else
//  // No rotation on Mac.
//  _rotation = RTCVideoRotation_0;
//#endif
//
//  if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
//    [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
//  }
//}

#pragma mark - AVCaptureSession notifications

- (void)handleCaptureSessionInterruption:(NSNotification *)notification {
  NSString *reasonString = nil;
#if TARGET_OS_IPHONE
  NSNumber *reason = notification.userInfo[AVCaptureSessionInterruptionReasonKey];
  if (reason) {
    switch (reason.intValue) {
      case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground:
        reasonString = @"VideoDeviceNotAvailableInBackground";
        break;
      case AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient:
        reasonString = @"AudioDeviceInUseByAnotherClient";
        break;
      case AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient:
        reasonString = @"VideoDeviceInUseByAnotherClient";
        break;
      case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps:
        reasonString = @"VideoDeviceNotAvailableWithMultipleForegroundApps";
        break;
    }
  }
#endif
  RTCLog(@"Capture session interrupted: %@", reasonString);
}

- (void)handleCaptureSessionInterruptionEnded:(NSNotification *)notification {
  RTCLog(@"Capture session interruption ended.");
}

- (void)handleCaptureSessionRuntimeError:(NSNotification *)notification {
  NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
  RTCLogError(@"Capture session runtime error: %@", error);

  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
#if TARGET_OS_IPHONE
                                 if (error.code == AVErrorMediaServicesWereReset) {
                                   [self handleNonFatalError];
                                 } else {
                                   [self handleFatalError];
                                 }
#else
                                 [self handleFatalError];
#endif
                               }];
}

- (void)handleCaptureSessionDidStartRunning:(NSNotification *)notification {
  RTCLog(@"Capture session started.");

  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 // If we successfully restarted after an unknown error,
                                 // allow future retries on fatal errors.
                                 self.hasRetriedOnFatalError = NO;
                               }];
}

- (void)handleCaptureSessionDidStopRunning:(NSNotification *)notification {
  RTCLog(@"Capture session stopped.");
}

- (void)handleFatalError {
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      if (!self.hasRetriedOnFatalError) {
                        RTCLogWarning(@"Attempting to recover from fatal capture error.");
                        [self handleNonFatalError];
                        self.hasRetriedOnFatalError = YES;
                      } else {
                        RTCLogError(@"Previous fatal error recovery failed.");
                      }
                    }];
}

- (void)handleNonFatalError {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 RTCLog(@"Restarting capture session after error.");
                                 if (self.isRunning) {
                                   [self.captureSession startRunning];
                                 }
                               }];
}

#if TARGET_OS_IPHONE

#pragma mark - UIApplication notifications

- (void)handleApplicationDidBecomeActive:(NSNotification *)notification {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 if (self.isRunning && !self.captureSession.isRunning) {
                                   RTCLog(@"Restarting capture session on active.");
                                   [self.captureSession startRunning];
                                 }
                               }];
}

#endif  // TARGET_OS_IPHONE

#pragma mark - Private

- (dispatch_queue_t)frameQueue {
  if (!_frameQueue) {
    _frameQueue =
        dispatch_queue_create("org.webrtc.cameravideocapturer.video", DISPATCH_QUEUE_SERIAL);
    dispatch_set_target_queue(_frameQueue,
                              dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
  }
  return _frameQueue;
}

- (BOOL)setupCaptureSession:(AVCaptureSession *)captureSession {
  NSAssert(_captureSession == nil, @"Setup capture session called twice.");
  _captureSession = captureSession;
#if defined(WEBRTC_IOS)
  _captureSession.sessionPreset = AVCaptureSessionPresetInputPriority;
  _captureSession.usesApplicationAudioSession = NO;
#endif
  [self setupVideoDataOutput];
  // Add the output.
  if (![_captureSession canAddOutput:_videoDataOutput]) {
    RTCLogError(@"Video data output unsupported.");
    return NO;
  }
  [_captureSession addOutput:_videoDataOutput];

  return YES;
}

- (void)setupVideoDataOutput {
  NSAssert(_videoDataOutput == nil, @"Setup video data output called twice.");
  AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

  // `videoDataOutput.availableVideoCVPixelFormatTypes` returns the pixel formats supported by the
  // device with the most efficient output format first. Find the first format that we support.
  NSSet<NSNumber *> *supportedPixelFormats = [RTCCVPixelBuffer supportedPixelFormats];
  NSMutableOrderedSet *availablePixelFormats =
      [NSMutableOrderedSet orderedSetWithArray:videoDataOutput.availableVideoCVPixelFormatTypes];
  [availablePixelFormats intersectSet:supportedPixelFormats];
  NSNumber *pixelFormat = availablePixelFormats.firstObject;
  NSAssert(pixelFormat, @"Output device has no supported formats.");

  _preferredOutputPixelFormat = [pixelFormat unsignedIntValue];
  _outputPixelFormat = _preferredOutputPixelFormat;
//  videoDataOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : pixelFormat};

//  [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
  videoDataOutput.videoSettings = @{
      (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
  };
  videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

  [videoDataOutput setSampleBufferDelegate:self queue:self.frameQueue];
  _videoDataOutput = videoDataOutput;

//  // Set video orientation and mirroring.
//  AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
//  [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//  [connection setVideoMirrored:NO];
}

- (void)updateVideoDataOutputPixelFormat:(AVCaptureDeviceFormat *)format {
//  FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
//  if (![[RTCCVPixelBuffer supportedPixelFormats] containsObject:@(mediaSubType)]) {
//    mediaSubType = _preferredOutputPixelFormat;
//  }
//
//  if (mediaSubType != _outputPixelFormat) {
//    _outputPixelFormat = mediaSubType;
//    _videoDataOutput.videoSettings =
//        @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(mediaSubType) };
//  }
}

#pragma mark - Private, called inside capture queue

- (void)updateDeviceCaptureFormat:(AVCaptureDeviceFormat *)format fps:(NSInteger)fps {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"updateDeviceCaptureFormat must be called on the capture queue.");
  @try {
    _currentDevice.activeFormat = format;
    _currentDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)fps);
  } @catch (NSException *exception) {
    RTCLogError(@"Failed to set active format!\n User info:%@", exception.userInfo);
    return;
  }
}

- (void)reconfigureCaptureSessionInput {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"reconfigureCaptureSessionInput must be called on the capture queue.");
  NSError *error = nil;
  AVCaptureDeviceInput *input =
      [AVCaptureDeviceInput deviceInputWithDevice:_currentDevice error:&error];
  if (!input) {
    RTCLogError(@"Failed to create front camera input: %@", error.localizedDescription);
    return;
  }
  [_captureSession beginConfiguration];
  for (AVCaptureDeviceInput *oldInput in [_captureSession.inputs copy]) {
    [_captureSession removeInput:oldInput];
  }
  if ([_captureSession canAddInput:input]) {
    [_captureSession addInput:input];
  } else {
    RTCLogError(@"Cannot add camera as an input to the session.");
  }
  [_captureSession commitConfiguration];
}

- (void)updateOrientation {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"updateOrientation must be called on the capture queue.");
#if TARGET_OS_IPHONE
  _orientation = [UIDevice currentDevice].orientation;
#endif
}

- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput
{
    self = [super init];
    if (self) {
        self.bSessionPause = YES;

        self.bufferQueue = dispatch_queue_create("STCameraBufferQueue", NULL);

        self.session = [[AVCaptureSession alloc] init];

        self.videoDevice = [self cameraDeviceWithPosition:iDevicePosition];
        _devicePosition = iDevicePosition;
        NSError *error = nil;
        self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice
                                                                 error:&error];
        if (!self.deviceInput || error) {
            NSLog(@"create input error");
            return nil;
        }

        self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
        self.dataOutput.videoSettings = @{
            (id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)
        };
        [self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];

        self.metaOutput = [[AVCaptureMetadataOutput alloc] init];
        [self.metaOutput setMetadataObjectsDelegate:self queue:self.bufferQueue];

        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
        if ([self.stillImageOutput respondsToSelector:@selector(setHighResolutionStillImageOutputEnabled:)]) {
            self.stillImageOutput.highResolutionStillImageOutputEnabled = YES;
        }

        [self.session beginConfiguration];

        if ([self.session canAddInput:self.deviceInput]) {
            [self.session addInput:self.deviceInput];
        } else {
            NSLog(@"Could not add device input to the session");
            return nil;
        }

        if ([self.session canSetSessionPreset:sessionPreset]) {
            [self.session setSessionPreset:sessionPreset];
            _sessionPreset = sessionPreset;
        } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
            [self.session setSessionPreset:AVCaptureSessionPreset1920x1080];
            _sessionPreset = AVCaptureSessionPreset1920x1080;
        } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
            [self.session setSessionPreset:AVCaptureSessionPreset1280x720];
            _sessionPreset = AVCaptureSessionPreset1280x720;
        } else {
            [self.session setSessionPreset:AVCaptureSessionPreset640x480];
            _sessionPreset = AVCaptureSessionPreset640x480;
        }

        if ([self.session canAddOutput:self.dataOutput]) {
            [self.session addOutput:self.dataOutput];
        } else {
            NSLog(@"Could not add video data output to the session");
            return nil;
        }

        if ([self.session canAddOutput:self.metaOutput]) {
            [self.session addOutput:self.metaOutput];
            self.metaOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace].copy;
        }

        if ([self.session canAddOutput:self.stillImageOutput]) {
            [self.session addOutput:self.stillImageOutput];
        } else {
            NSLog(@"Could not add still image output to the session");
        }

        self.videoConnection = [self.dataOutput connectionWithMediaType:AVMediaTypeVideo];

        if ([self.videoConnection isVideoOrientationSupported]) {
            [self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
            self.videoOrientation = AVCaptureVideoOrientationPortrait;
        }

        if ([self.videoConnection isVideoMirroringSupported]) {
            [self.videoConnection setVideoMirrored:YES];
            self.needVideoMirrored = YES;
        }

        if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, iFPS);
            _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, iFPS);
            [_videoDevice unlockForConfiguration];
        }

        [self.session commitConfiguration];

        NSMutableDictionary *tmpSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] mutableCopy];
        if (!EFMachineVersion.isiPhone5sOrLater) {
            NSNumber *tmpSettingValue = tmpSettings[AVVideoHeightKey];
            tmpSettings[AVVideoHeightKey] = tmpSettings[AVVideoWidthKey];
            tmpSettings[AVVideoWidthKey] = tmpSettingValue;
        }
        self.videoCompressingSettings = [tmpSettings copy];

        self.iExpectedFPS = iFPS;

        [self addObservers];
    }

    return self;
}

- (void)rotateCamera {
    if (self.devicePosition == AVCaptureDevicePositionFront) {
        self.devicePosition = AVCaptureDevicePositionBack;
    } else {
        self.devicePosition = AVCaptureDevicePositionFront;
    }
}

- (void)rotateCamera:(BOOL)isUseFrontCamera {
    self.devicePosition = isUseFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
}

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame {
    BOOL isFrontCamera = self.devicePosition == AVCaptureDevicePositionFront;
    // Portrait capture: the view's y axis maps to the device's x axis,
    // and the x axis is mirrored for the back camera.
    float fX = point.y / frame.size.height;
    float fY = isFrontCamera ? point.x / frame.size.width : (1 - point.x / frame.size.width);
    
    [self focusWithMode:self.videoDevice.focusMode exposureMode:self.videoDevice.exposureMode atPoint:CGPointMake(fX, fY)];
}
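The view-to-device coordinate mapping above is easy to get wrong, so here it is isolated as a small C sketch (the `pointOfInterest` name and `Point2D` struct are assumptions of this sketch, not part of the original class). The device's point of interest lives in a normalized [0,1]×[0,1] space rotated 90° relative to the portrait view:

```c
#include <assert.h>

typedef struct { float x, y; } Point2D;

/* Mirrors the math in -setExposurePoint:inPreviewFrame:.
   In portrait capture the view's y axis maps to the device's x axis;
   the other axis is mirrored for the back camera. */
Point2D pointOfInterest(Point2D tap, float viewW, float viewH, int isFrontCamera) {
    Point2D poi;
    poi.x = tap.y / viewH;                       /* view y -> device x */
    poi.y = isFrontCamera ? tap.x / viewW        /* front camera: direct */
                          : 1.0f - tap.x / viewW; /* back camera: mirrored */
    return poi;
}
```

A tap at (100, 200) in a 400×800 preview maps to (0.25, 0.25) on the front camera and (0.25, 0.75) on the back camera.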

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    NSError *error = nil;
    AVCaptureDevice *device = self.videoDevice;
    
    if ([device lockForConfiguration:&error]) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        [self setISOValue:0];
        
        // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
        // Call set(Focus/Exposure)Mode: to apply the new point of interest.
        if (focusMode != AVCaptureFocusModeLocked && device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
            device.focusPointOfInterest = point;
            device.focusMode = focusMode;
        }
        
        if (exposureMode != AVCaptureExposureModeCustom && device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
        }
        
        device.subjectAreaChangeMonitoringEnabled = YES;
        [device unlockForConfiguration];
    }
}

- (void)setWhiteBalance {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {
            [captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        }
    }];
}

- (void)changeDeviceProperty:(void(^)(AVCaptureDevice *))propertyChange {
    AVCaptureDevice *captureDevice = self.videoDevice;
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device property: %@", error.localizedDescription);
    }
}

- (void)setISOValue:(float)value exposeDuration:(int)duration {
    // Clamp the requested ISO into the active format's supported range.
    // (The original clamped against minISO and then discarded that result;
    // the second step must operate on the already-clamped value.)
    float currentISO = (value < self.videoDevice.activeFormat.minISO) ? self.videoDevice.activeFormat.minISO : value;
    currentISO = (currentISO > self.videoDevice.activeFormat.maxISO) ? self.videoDevice.activeFormat.maxISO : currentISO;
    NSError *error;
    if ([self.videoDevice lockForConfiguration:&error]) {
        // NOTE: duration is currently unused; AVCaptureExposureDurationCurrent keeps the existing duration.
        [self.videoDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setISOValue:(float)value {
    NSError *error = nil;
    if ([self.videoDevice lockForConfiguration:&error]) {
        // Despite its name, this adjusts the exposure target bias, not the ISO directly.
        [self.videoDevice setExposureTargetBias:value completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}

- (void)setDevicePosition:(AVCaptureDevicePosition)devicePosition {
    if (_devicePosition != devicePosition && devicePosition != AVCaptureDevicePositionUnspecified) {
        if (_session) {
            AVCaptureDevice *targetDevice = [self cameraDeviceWithPosition:devicePosition];
            
            if (targetDevice && [self judgeCameraAuthorization]) {
                NSError *error = nil;
                AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:targetDevice error:&error];
                
                if (!deviceInput || error) {
                    NSLog(@"Error creating capture device input: %@", error.localizedDescription);
                    return;
                }
                
                _bSessionPause = YES;
                
                [_session beginConfiguration];
                [_session removeInput:_deviceInput];
                
                if ([_session canAddInput:deviceInput]) {
                    [_session addInput:deviceInput];
                    _deviceInput = deviceInput;
                    _videoDevice = targetDevice;
                    _devicePosition = devicePosition;
                }
                
                _videoConnection = [_dataOutput connectionWithMediaType:AVMediaTypeVideo];
                
                if ([_videoConnection isVideoOrientationSupported]) {
                    [_videoConnection setVideoOrientation:_videoOrientation];
                }
                
                if ([_videoConnection isVideoMirroringSupported]) {
                    [_videoConnection setVideoMirrored:devicePosition == AVCaptureDevicePositionFront];
                }
                
                [_session commitConfiguration];
                
                [self setSessionPreset:_sessionPreset];
                
                _bSessionPause = NO;
            }
        }
    }
}

- (void)setSessionPreset:(NSString *)sessionPreset {
    if (_session && _sessionPreset) {
        _bSessionPause = YES;
        
        [_session beginConfiguration];
        
        if ([_session canSetSessionPreset:sessionPreset]) {
            [_session setSessionPreset:sessionPreset];
            _sessionPreset = sessionPreset;
        }
        
        [_session commitConfiguration];
        
        self.videoCompressingSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];
        
        _bSessionPause = NO;
    }
}

- (void)setIExpectedFPS:(int)iExpectedFPS {
    _iExpectedFPS = iExpectedFPS;
    
    if (iExpectedFPS <= 0 || !_dataOutput.videoSettings || !_videoDevice) {
        return;
    }
    
    CGFloat fWidth = [[_dataOutput.videoSettings objectForKey:@"Width"] floatValue];
    CGFloat fHeight = [[_dataOutput.videoSettings objectForKey:@"Height"] floatValue];
    
    // Find the YUV format matching the current dimensions with the highest supported frame rate.
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    
    for (AVCaptureDeviceFormat *format in [_videoDevice formats]) {
        CMFormatDescriptionRef description = format.formatDescription;
        
        if (CMFormatDescriptionGetMediaSubType(description) != kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
            continue;
        }
        
        CMVideoDimensions videoDimension = CMVideoFormatDescriptionGetDimensions(description);
        if ((videoDimension.width == fWidth && videoDimension.height == fHeight)
            || (videoDimension.height == fWidth && videoDimension.width == fHeight)) {
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
                if (range.maxFrameRate >= bestFrameRateRange.maxFrameRate) {
                    bestFormat = format;
                    bestFrameRateRange = range;
                }
            }
        }
    }
    
    if (bestFormat) {
        CMTime minFrameDuration;
        
        if (bestFrameRateRange.minFrameDuration.timescale / bestFrameRateRange.minFrameDuration.value < iExpectedFPS) {
            // The device cannot reach the expected FPS; use its fastest supported duration.
            minFrameDuration = bestFrameRateRange.minFrameDuration;
        } else {
            minFrameDuration = CMTimeMake(1, iExpectedFPS);
        }
        
        if ([_videoDevice lockForConfiguration:NULL]) {
            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = minFrameDuration;
            _videoDevice.activeVideoMaxFrameDuration = minFrameDuration;
            [_videoDevice unlockForConfiguration];
        }
    }
}

- (void)startRunning {
    if (![self judgeCameraAuthorization]) {
        return;
    }
    
    if (!self.dataOutput) {
        return;
    }
    
    if (self.session && ![self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session startRunning];
            });
        }
        self.bSessionPause = NO;
    }
}

- (void)stopRunning {
    if (self.session && [self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session stopRunning];
            });
        }
        self.bSessionPause = YES;
    }
}

- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit {
    CGRect rectRet = rect;
    
    if (self.dataOutput.videoSettings) {
        CGFloat fWidth = [[self.dataOutput.videoSettings objectForKey:@"Width"] floatValue];
        CGFloat fHeight = [[self.dataOutput.videoSettings objectForKey:@"Height"] floatValue];
        
        float fScaleX = fWidth / CGRectGetWidth(rect);
        float fScaleY = fHeight / CGRectGetHeight(rect);
        float fScale = bScaleToFit ? fmaxf(fScaleX, fScaleY) : fminf(fScaleX, fScaleY);
        
        fWidth /= fScale;
        fHeight /= fScale;
        
        CGFloat fX = rect.origin.x - (fWidth - rect.size.width) / 2.0f;
        CGFloat fY = rect.origin.y - (fHeight - rect.size.height) / 2.0f;
        
        rectRet = CGRectMake(fX, fY, fWidth, fHeight);
    }
    
    return rectRet;
}

- (BOOL)judgeCameraAuthorization {
    AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    
    if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
        return NO;
    }
    
    return YES;
}

- (AVCaptureDevice *)cameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDevice *deviceRet = nil;
    
    if (position != AVCaptureDevicePositionUnspecified) {
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        
        for (AVCaptureDevice *device in devices) {
            if ([device position] == position) {
                deviceRet = device;
            }
        }
    }
    
    return deviceRet;
}

- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (!_previewLayer) {
        _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    }
    
    return _previewLayer;
}

- (void)snapStillImageCompletionHandler:(void (^)(CMSampleBufferRef imageDataSampleBuffer, NSError *error))handler {
    if ([self judgeCameraAuthorization]) {
        self.bSessionPause = YES;
        
        NSString *strSessionPreset = [self.sessionPreset mutableCopy];
        self.sessionPreset = AVCaptureSessionPresetPhoto;
        
        // Changing the preset briefly blacks out the preview; wait for it to settle.
        [NSThread sleepForTimeInterval:0.3];
        
        dispatch_async(self.bufferQueue, ^{
            [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                self.bSessionPause = NO;
                self.sessionPreset = strSessionPreset;
                handler(imageDataSampleBuffer, error);
            }];
        });
    }
}

// Returns YES if the exposure model changed since the last call.
BOOL updateExporureModel(STExposureModel model) {
    if (currentExposureMode == model) return NO;
    currentExposureMode = model;
    return YES;
}

- (void)test:(AVCaptureExposureMode)model {
    if ([self.videoDevice lockForConfiguration:nil]) {
        if ([self.videoDevice isExposureModeSupported:model]) {
            [self.videoDevice setExposureMode:model];
        }
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setExposureTime:(CMTime)time {
    if ([self.videoDevice lockForConfiguration:nil]) {
        if (@available(iOS 12.0, *)) {
            self.videoDevice.activeMaxExposureDuration = time;
        } else {
            // Fallback on earlier versions
        }
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setFPS:(float)fps {
    if ([_videoDevice lockForConfiguration:NULL]) {
        _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)fps);
        _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)fps);
        [_videoDevice unlockForConfiguration];
    }
}

- (void)updateExposure:(CMSampleBufferRef)sampleBuffer {
    // Read the EXIF brightness value from the frame's attachments.
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    
    // Map the measured brightness to an ISO/exposure/FPS preset:
    // the darker the scene, the higher the ISO and the longer the exposure.
    if (brightnessValue > 2 && updateExporureModel(STExposureModelPositive2)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > 1 && brightnessValue < 2 && updateExporureModel(STExposureModelPositive1)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > 0 && brightnessValue < 1 && updateExporureModel(STExposureModel0)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > -1 && brightnessValue < 0 && updateExporureModel(STExposureModelNegative1)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:40];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -2 && brightnessValue < -1 && updateExporureModel(STExposureModelNegative2)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:35];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -2.5 && brightnessValue < -2 && updateExporureModel(STExposureModelNegative3)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -3 && brightnessValue < -2.5 && updateExporureModel(STExposureModelNegative4)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:25];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -3.5 && brightnessValue < -3 && updateExporureModel(STExposureModelNegative5)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:20];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -4 && brightnessValue < -3.5 && updateExporureModel(STExposureModelNegative6)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 250 exposeDuration:15];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -5 && brightnessValue < -4 && updateExporureModel(STExposureModelNegative7)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:10];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue < -5 && updateExporureModel(STExposureModelNegative8)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 150 exposeDuration:5];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }
}
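All of these branches feed setISOValue:exposeDuration:, which must keep the requested ISO within the active format's [minISO, maxISO] range. The corrected two-step clamp can be sketched as a standalone C helper (the name `clampISO` is an assumption of this sketch):

```c
#include <assert.h>

/* Two-step clamp, as used by -setISOValue:exposeDuration::
   first raise the value to the minimum, then lower the already-clamped
   result to the maximum supported ISO. */
float clampISO(float value, float minISO, float maxISO) {
    float iso = value < minISO ? minISO : value;
    return iso > maxISO ? maxISO : iso;
}
```

Note that the second comparison must use the result of the first; clamping the raw `value` twice (as the original code did) silently drops the minimum bound.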

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!self.bSessionPause) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
            [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
        }
    }
    // Uncomment to drive the brightness-based exposure adjustment:
    // [self updateExposure:sampleBuffer];
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    AVMetadataFaceObject *faceObject = nil;
    for (AVMetadataObject *object in metadataObjects) {
        if (AVMetadataObjectTypeFace == object.type) {
            faceObject = (AVMetadataFaceObject *)object;
        }
    }
    static BOOL hasFace = NO;
    if (!hasFace && faceObject.faceID) {
        hasFace = YES;
    }
    if (!faceObject.faceID) {
        hasFace = NO;
    }
}

#pragma mark - Notifications
- (void)addObservers {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appWillResignActive) name:UIApplicationWillResignActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appDidBecomeActive) name:UIApplicationDidBecomeActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(dealCaptureSessionRuntimeError:)
                                                 name:AVCaptureSessionRuntimeErrorNotification
                                               object:self.session];
}

- (void)removeObservers {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)appWillResignActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appWillResignActive");
}

- (void)appDidBecomeActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appDidBecomeActive");
    [self startRunning];
}

- (void)dealCaptureSessionRuntimeError:(NSNotification *)notification {
    NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
    RTCLogError(@"SDCustomRTCCameraCapturer dealCaptureSessionRuntimeError error: %@", error);

    // Media services were reset by the system; restart the capture session.
    if (error.code == AVErrorMediaServicesWereReset) {
        [self startRunning];
    }
}

#pragma mark - Dealloc
- (void)dealloc {
    DebugLog(@"SDCustomRTCCameraCapturer dealloc");
    [self removeObservers];

    if (self.session) {
        self.bSessionPause = YES;
        [self.session beginConfiguration];
        [self.session removeOutput:self.dataOutput];
        [self.session removeInput:self.deviceInput];
        [self.session commitConfiguration];
        if ([self.session isRunning]) {
            [self.session stopRunning];
        }
        self.session = nil;
    }
}

@end

3. Implementing the WebRTC video call with ossrs

WebRTC on the iOS side calls ossrs to realize the live video call function, please check:
https://blog.csdn.net/gloryFlow/article/details/132262724

The following places need to be changed.

Use the SDCustomRTCCameraCapturer class in createVideoTrack:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // On the simulator there is no camera, so use a file capturer instead.
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}

To render to the screen, startCaptureLocalVideo adds a render view for the localVideoTrack; the renderer here is an RTCMTLVideoView:

self.localRenderer = [[RTCMTLVideoView alloc] initWithFrame:CGRectZero];
self.localRenderer.delegate = self;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    if (!renderer) {
        return;
    }
    
    if (!self.videoCapturer) {
        return;
    }
    
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
        
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        
        if (!formatNilable) {
            return;
        }
        
        DebugLog(@"formatNilable:%@", formatNilable);
        
        NSInteger fps = [self selectFpsForFormat:formatNilable];
        
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
                
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);
        [cameraVideoCapturer startRunning];
        [self changeResolution:width height:height fps:(int)fps];
    }
    
    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    
    [self.localVideoTrack addRenderer:renderer];
}

The image captured by the camera is delivered through the delegate method - (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

The implementation is as follows:

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    // WebRTC expects the frame timestamp in nanoseconds.
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:RTCVideoRotation_0 timeStampNs:timeStampNs];
    
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}
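The nanosecond conversion deserves care: CMSampleBufferGetPresentationTimeStamp returns a rational CMTime (value/timescale seconds), and going through a double can lose precision for large timestamps. A sketch in C of an integer-only variant (the `RationalTime` struct is an assumption standing in for CMTime):

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in for CMTime: the timestamp is value/timescale seconds. */
typedef struct { int64_t value; int32_t timescale; } RationalTime;

/* Seconds -> nanoseconds, multiplying before dividing so no
   fractional part is lost to intermediate floating point. */
int64_t toNanoseconds(RationalTime t) {
    return t.value * 1000000000LL / t.timescale;
}
```

For example, a CMTime of 3/600 (a common 600 Hz media timescale) is 5 ms, i.e. 5,000,000 ns.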

The complete code of WebRTCClient is as follows

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>
#import "SDCustomRTCCameraCapturer.h"

#define kSelectedResolution @"kSelectedResolutionIndex"
#define kFramerateLimit 30.0

@protocol WebRTCClientDelegate;
@interface WebRTCClient : NSObject

@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;

/**
 Peer connection factory
 */
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;

/**
 Whether this client publishes (push) or plays
 */
@property (nonatomic, assign) BOOL isPublish;

/**
 connect
 */
@property (nonatomic, strong) RTCPeerConnection *peerConnection;

/**
 RTCAudioSession
 */
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;

/**
 DispatchQueue
 */
@property (nonatomic) dispatch_queue_t audioQueue;

/**
 mediaConstrains
 */
@property (nonatomic, strong) NSDictionary *mediaConstrains;

/**
 publishMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;

/**
 playMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *playMediaConstrains;

/**
 optionalConstraints
 */
@property (nonatomic, strong) NSDictionary *optionalConstraints;

/**
 RTCVideoCapturer camera capturer
 */
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;

/**
 Local audio track (localAudioTrack)
 */
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;

/**
 localVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;

/**
 remoteVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;

/**
 RTCVideoRenderer
 */
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;

/**
 localDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *localDataChannel;

/**
 remoteDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;

/**
 RTCVideoSource
 */
@property (nonatomic, strong) RTCVideoSource *localVideoSource;


- (instancetype)initWithPublish:(BOOL)isPublish;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;

- (void)addIceCandidate:(RTCIceCandidate *)candidate;

- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;

- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion;

- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate;

- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps;

- (NSArray<NSString *> *)availableVideoResolutions;

#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer;

#pragma mark - Hide or show video
- (void)hidenVideo;

- (void)showVideo;

#pragma mark - Mute or unmute audio
- (void)muteAudio;

- (void)unmuteAudio;

- (void)speakOff;

- (void)speakOn;

#pragma mark - Set video bitrate (BitrateBps)
- (void)setMaxBitrate:(int)maxBitrate;

- (void)setMinBitrate:(int)minBitrate;

#pragma mark - Set video frame rate
- (void)setMaxFramerate:(int)maxFramerate;

- (void)close;

@end


@protocol WebRTCClientDelegate <NSObject>

// Beauty-filter processing hook: return a processed frame for the captured sample buffer
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer;

- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;

@end

WebRTCClient.m

#import "WebRTCClient.h"


@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate, RTCVideoCapturerDelegate, SDCustomRTCCameraCapturerDelegate>

@property (nonatomic, assign) BOOL usingFrontCamera;

@end

@implementation WebRTCClient

- (instancetype)initWithPublish:(BOOL)isPublish {
    self = [super init];
    if (self) {
        self.isPublish = isPublish;
        self.usingFrontCamera = YES;

        RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.publishMediaConstrains optionalConstraints:self.optionalConstraints];

        RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];
        newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;
        newConfig.continualGatheringPolicy = RTCContinualGatheringPolicyGatherContinually;

        self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];

        [self createMediaSenders];
        [self createMediaReceivers];

        // SRS does not support data channels.
        // [self createDataChannel];
        [self configureAudioSession];

        self.peerConnection.delegate = self;
    }
    return self;
}


- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }

    NSString *streamId = @"stream";

    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;

    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];

    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];

    // Video
    RTCRtpEncodingParameters *encodingParameters = [[RTCRtpEncodingParameters alloc] init];
    encodingParameters.maxBitrateBps = @(6000000);
//    encodingParameters.bitratePriority = 1.0;

    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];

    // Setting sendEncodings here prevents the SRS service from rendering the video, so it stays commented out.
//    videoTrackTransceiver.sendEncodings = @[encodingParameters];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];

    [self setDegradationPreference:RTCDegradationPreferenceBalanced];
    [self setMaxBitrate:6000000];
}

- (void)createMediaReceivers {
    // Receivers are only needed when playing a remote stream, not when publishing.
    if (self.isPublish) {
        return;
    }

    if (self.peerConnection.transceivers.count > 0) {
        RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;
        if (transceiver.mediaType == RTCRtpMediaTypeVideo) {
            RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;
            self.remoteVideoTrack = track;
        }
    }
}

- (void)configureAudioSession {
    [self.rtcAudioSession lockForConfiguration];
    @try {
        RTCAudioSessionConfiguration *configuration = [[RTCAudioSessionConfiguration alloc] init];
        configuration.category = AVAudioSessionCategoryPlayAndRecord;
        configuration.categoryOptions = AVAudioSessionCategoryOptionDefaultToSpeaker;
        configuration.mode = AVAudioSessionModeDefault;

        BOOL hasSucceeded = NO;
        NSError *error = nil;
        if (self.rtcAudioSession.isActive) {
            hasSucceeded = [self.rtcAudioSession setConfiguration:configuration error:&error];
        } else {
            hasSucceeded = [self.rtcAudioSession setConfiguration:configuration
                                                           active:YES
                                                            error:&error];
        }
        if (!hasSucceeded) {
            DebugLog(@"Error setting configuration: %@", error.localizedDescription);
        }
    } @catch (NSException *exception) {
        DebugLog(@"configureAudioSession exception:%@", exception);
    }

    [self.rtcAudioSession unlockForConfiguration];
}

- (RTCAudioTrack *)createAudioTrack {
    /// Enable Google's 3A algorithms (echo cancellation, auto gain control, noise suppression).
    NSDictionary *mandatory = @{
        @"googEchoCancellation": kRTCMediaConstraintsValueTrue,
        @"googAutoGainControl": kRTCMediaConstraintsValueTrue,
        @"googNoiseSuppression": kRTCMediaConstraintsValueTrue,
    };

    RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];
    RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];
    RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];
    return audioTrack;
}

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // The simulator has no camera, so fall back to a file-based capturer there.
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];
    }

    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}

- (void)addIceCandidate:(RTCIceCandidate *)candidate {
    [self.peerConnection addIceCandidate:candidate];
}

- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {
    if (self.isPublish) {
        self.mediaConstrains = self.publishMediaConstrains;
    } else {
        self.mediaConstrains = self.playMediaConstrains;
    }

    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    DebugLog(@"peerConnection:%@", self.peerConnection);
    __weak typeof(self) weakSelf = self;
    [self.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            DebugLog(@"offer offerForConstraints error:%@", error);
        }

        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    DebugLog(@"offer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    __weak typeof(self) weakSelf = self;
    [self.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            DebugLog(@"answer answerForConstraints error:%@", error);
        }

        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    DebugLog(@"answer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion {
    [self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}

- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {
    [self.peerConnection addIceCandidate:remoteCandidate];
}

- (void)setMaxBitrate:(int)maxBitrate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxBitrateBps = @(maxBitrate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

- (void)setDegradationPreference:(RTCDegradationPreference)degradationPreference {
    // e.g. RTCDegradationPreferenceMaintainResolution
    for (RTCRtpSender *sender in self.peerConnection.senders) {
        if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {
            RTCRtpParameters *parameters = sender.parameters;
            parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];
            [sender setParameters:parameters];
        }
    }
}

- (void)setMinBitrate:(int)minBitrate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.minBitrateBps = @(minBitrate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

- (void)setMaxFramerate:(int)maxFramerate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        // maxFramerate is not available in older WebRTC builds; update to a newer version to use it.
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxFramerate = @(maxFramerate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

#pragma mark - Private
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {
    NSArray<AVCaptureDevice *> *captureDevices =
        [SDCustomRTCCameraCapturer captureDevices];
    for (AVCaptureDevice *device in captureDevices) {
        if (device.position == position) {
            return device;
        }
    }
    // Fall back to the first device; firstObject is nil-safe on an empty array.
    return captureDevices.firstObject;
}

- (NSArray<NSString *> *)availableVideoResolutions {
    NSMutableSet<NSArray<NSNumber *> *> *resolutions =
        [[NSMutableSet<NSArray<NSNumber *> *> alloc] init];
    for (AVCaptureDevice *device in [SDCustomRTCCameraCapturer captureDevices]) {
        for (AVCaptureDeviceFormat *format in
             [SDCustomRTCCameraCapturer supportedFormatsForDevice:device]) {
            CMVideoDimensions resolution =
                CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            NSArray<NSNumber *> *resolutionObject = @[ @(resolution.width), @(resolution.height) ];
            [resolutions addObject:resolutionObject];
        }
    }

    NSArray<NSArray<NSNumber *> *> *sortedResolutions =
        [[resolutions allObjects] sortedArrayUsingComparator:^NSComparisonResult(
            NSArray<NSNumber *> *obj1, NSArray<NSNumber *> *obj2) {
            NSComparisonResult cmp = [obj1.firstObject compare:obj2.firstObject];
            if (cmp != NSOrderedSame) {
                return cmp;
            }
            return [obj1.lastObject compare:obj2.lastObject];
        }];

    NSMutableArray<NSString *> *resolutionStrings = [[NSMutableArray<NSString *> alloc] init];
    for (NSArray<NSNumber *> *resolution in sortedResolutions) {
        NSString *resolutionString =
            [NSString stringWithFormat:@"%@x%@", resolution.firstObject, resolution.lastObject];
        [resolutionStrings addObject:resolutionString];
    }

    return [resolutionStrings copy];
}


- (int)videoResolutionComponentAtIndex:(int)index inString:(NSString *)resolution {
    if (index != 0 && index != 1) {
        return 0;
    }
    NSArray<NSString *> *components = [resolution componentsSeparatedByString:@"x"];
    if (components.count != 2) {
        return 0;
    }
    return components[index].intValue;
}

- (AVCaptureDeviceFormat *)selectFormatForDevice:(AVCaptureDevice *)device {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        NSArray *availableVideoResolutions = [self availableVideoResolutions];
        NSString *selectedIndexStr = [[NSUserDefaults standardUserDefaults] objectForKey:kSelectedResolution];
        int selectedIndex = 0;
        if (selectedIndexStr && selectedIndexStr.length > 0) {
            selectedIndex = [selectedIndexStr intValue];
        }
        // Guard against a stale index from NSUserDefaults.
        if (selectedIndex < 0 || selectedIndex >= (int)availableVideoResolutions.count) {
            selectedIndex = 0;
        }

        NSString *videoResolution = [availableVideoResolutions objectAtIndex:selectedIndex];
        DebugLog(@"availableVideoResolutions:%@, videoResolution:%@", availableVideoResolutions, videoResolution);
        NSArray<AVCaptureDeviceFormat *> *formats =
            [SDCustomRTCCameraCapturer supportedFormatsForDevice:device];
        int targetWidth = [self videoResolutionComponentAtIndex:0 inString:videoResolution];
        int targetHeight = [self videoResolutionComponentAtIndex:1 inString:videoResolution];
        AVCaptureDeviceFormat *selectedFormat = nil;
        int currentDiff = INT_MAX;

        for (AVCaptureDeviceFormat *format in formats) {
            CMVideoDimensions dimension = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription);
            int diff = abs(targetWidth - dimension.width) + abs(targetHeight - dimension.height);
            if (diff < currentDiff) {
                selectedFormat = format;
                currentDiff = diff;
            } else if (diff == currentDiff && pixelFormat == [cameraVideoCapturer preferredOutputPixelFormat]) {
                selectedFormat = format;
            }
        }

        return selectedFormat;
    }

    return nil;
}

- (NSInteger)selectFpsForFormat:(AVCaptureDeviceFormat *)format {
    Float64 maxSupportedFramerate = 0;
    for (AVFrameRateRange *fpsRange in format.videoSupportedFrameRateRanges) {
        maxSupportedFramerate = fmax(maxSupportedFramerate, fpsRange.maxFrameRate);
    }
    // Hard-coded to 20 fps here; use fmin(maxSupportedFramerate, kFramerateLimit)
    // to honor the device's supported range instead.
    return 20;
}

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }

    if (!renderer) {
        return;
    }

    if (!self.videoCapturer) {
        return;
    }

    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];

    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];

        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        if (!formatNilable) {
            return;
        }

        DebugLog(@"formatNilable:%@", formatNilable);

        NSInteger fps = [self selectFpsForFormat:formatNilable];

        CMVideoDimensions videoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoDimensions.width;
        float height = videoDimensions.height;
        DebugLog(@"videoDimensions width:%f, height:%f", width, height);

        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }

    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }

    [self.localVideoTrack addRenderer:renderer];
}

- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {
    // Remote rendering only applies when playing, not when publishing.
    if (self.isPublish) {
        return;
    }

    self.remoteRenderView = renderer;
}

- (RTCDataChannel *)createDataChannel {
    RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];
    RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];
    if (!dataChannel) {
        return nil;
    }

    dataChannel.delegate = self;
    self.localDataChannel = dataChannel;
    return dataChannel;
}

- (void)sendData:(NSData *)data {
    RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];
    [self.remoteDataChannel sendData:buffer];
}

#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer {
    self.usingFrontCamera = !self.usingFrontCamera;
    [self startCaptureLocalVideo:renderer];
}

#pragma mark - Change resolution
- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps {
    if (!self.localVideoSource) {
        // No live video source to adapt; adapting a newly created source would have no effect.
        return NO;
    }
    [self.localVideoSource adaptOutputFormatToWidth:width height:height fps:fps];
    return YES;
}

#pragma mark - Hide or show Video
- (void)hidenVideo {
    [self setVideoEnabled:NO];
    self.localVideoTrack.isEnabled = NO;
}

- (void)showVideo {
    [self setVideoEnabled:YES];
    self.localVideoTrack.isEnabled = YES;
}

- (void)setVideoEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}

- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {
    for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {
        // Compare the sender's track class, not the transceiver itself.
        if ([transceiver.sender.track isKindOfClass:track]) {
            transceiver.sender.track.isEnabled = isEnabled;
        }
    }
}


#pragma mark - Mute or unmute Audio
- (void)muteAudio {
    [self setAudioEnabled:NO];
    self.localAudioTrack.isEnabled = NO;
}

- (void)unmuteAudio {
    [self setAudioEnabled:YES];
    self.localAudioTrack.isEnabled = YES;
}

- (void)speakOff {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];

            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];
            DebugLog(@"speakOff error:%@, ooapError:%@", error, ooapError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOff exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)speakOn {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];

            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];

            NSError *activeError;
            [weakSelf.rtcAudioSession setActive:YES error:&activeError];

            DebugLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOn exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)setAudioEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}

- (void)close {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer stopRunning];
    }
    [self.peerConnection close];
    self.peerConnection = nil;
    self.videoCapturer = nil;
}

#pragma mark - RTCPeerConnectionDelegate
/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
didChangeSignalingState:(RTCSignalingState)stateChanged {
    DebugLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}

/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didAddStream");
    if (self.isPublish) {
        return;
    }

    NSArray *videoTracks = stream.videoTracks;
    if (videoTracks && videoTracks.count > 0) {
        RTCVideoTrack *track = videoTracks.firstObject;
        self.remoteVideoTrack = track;
    }

    if (self.remoteVideoTrack && self.remoteRenderView) {
        id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;
        RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;
        [remoteVideoTrack addRenderer:remoteRenderView];
    }

    /**
     if let audioTrack = stream.audioTracks.first {
         print("audio track found")
         audioTrack.source.volume = 8
     }
     */
}

/** Called when a remote peer closes a stream.
 *  This is not called when RTCSdpSemanticsUnifiedPlan is specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didRemoveStream");
}

/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {
    DebugLog(@"peerConnection peerConnectionShouldNegotiate");
}

/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceConnectionState:(RTCIceConnectionState)newState {
    DebugLog(@"peerConnection didChangeIceConnectionState:%ld", (long)newState);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {
        [self.delegate webRTCClient:self didChangeConnectionState:newState];
    }

    if (RTCIceConnectionStateConnected == newState || RTCIceConnectionStateChecking == newState) {
        [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    }
}

/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceGatheringState:(RTCIceGatheringState)newState {
    DebugLog(@"peerConnection didChangeIceGatheringState:%ld", (long)newState);
}

/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didGenerateIceCandidate:(RTCIceCandidate *)candidate {
    DebugLog(@"peerConnection didGenerateIceCandidate:%@", candidate);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {
        [self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];
    }
}

/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {
    DebugLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}

/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didOpenDataChannel:(RTCDataChannel *)dataChannel {
    DebugLog(@"peerConnection didOpenDataChannel:%@", dataChannel);
    self.remoteDataChannel = dataChannel;
}

/** Called when signaling indicates a transceiver will be receiving media from
 *  the remote endpoint.
 *  This is only called with RTCSdpSemanticsUnifiedPlan specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {
    DebugLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}

/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
        didAddReceiver:(RTCRtpReceiver *)rtpReceiver
               streams:(NSArray<RTCMediaStream *> *)mediaStreams {
    DebugLog(@"peerConnection didAddReceiver");
}

/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
     didRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {
    DebugLog(@"peerConnection didRemoveReceiver");
}

#pragma mark - RTCDataChannelDelegate
/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {
    DebugLog(@"dataChannelDidChangeState:%@", dataChannel);
}

/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel
didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {
        [self.delegate webRTCClient:self didReceiveData:buffer.data];
    }
}

#pragma mark - RTCVideoCapturerDelegate
- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
//    DebugLog(@"capturer:%@ didCaptureVideoFrame:%@", capturer, frame);
//    RTCVideoFrame *aFilterVideoFrame;
//    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
//        aFilterVideoFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
//    }
//
//    // When working with Core Video buffers directly, release them manually
//    // (CVPixelBufferRelease), otherwise memory usage grows quickly.
//    // The pixel buffer can be read via ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer.
//
//    if (!aFilterVideoFrame) {
//        aFilterVideoFrame = frame;
//    }
//
//    [self.localVideoSource capturer:capturer didCaptureVideoFrame:aFilterVideoFrame];
}

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }

    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // RTCVideoRotation_90 assumes portrait capture; adjust to match the camera orientation.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_90
                                                             timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("cn.dface.webrtc", NULL);
    }
    return _audioQueue;
}

- (RTCAudioSession *)rtcAudioSession {
    if (!_rtcAudioSession) {
        _rtcAudioSession = [RTCAudioSession sharedInstance];
    }
    return _rtcAudioSession;
}

- (NSDictionary *)mediaConstrains {
    if (!_mediaConstrains) {
        _mediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                            kRTCMediaConstraintsValueTrue, @"IceRestart",
                            nil];
    }
    return _mediaConstrains;
}

- (NSDictionary *)publishMediaConstrains {
    if (!_publishMediaConstrains) {
        _publishMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                                   kRTCMediaConstraintsValueTrue, @"IceRestart",
                                   kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                   kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                 @"2160", kRTCMediaConstraintsMinWidth,
//                                 @"3840", kRTCMediaConstraintsMinHeight,
//                                 @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                 @"1", kRTCMediaConstraintsMaxAspectRatio,
                                   nil];
    }
    return _publishMediaConstrains;
}

- (NSDictionary *)playMediaConstrains {
    if (!_playMediaConstrains) {
        _playMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveAudio,
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveVideo,
                                kRTCMediaConstraintsValueTrue, @"IceRestart",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                              @"2160", kRTCMediaConstraintsMinWidth,
//                              @"3840", kRTCMediaConstraintsMinHeight,
//                              @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                              @"1", kRTCMediaConstraintsMaxAspectRatio,
                                nil];
    }
    return _playMediaConstrains;
}

- (NSDictionary *)optionalConstraints {
    if (!_optionalConstraints) {
        _optionalConstraints = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, @"DtlsSrtpKeyAgreement",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                nil];
    }
    return _optionalConstraints;
}

- (void)dealloc {
    [self.peerConnection close];
    self.peerConnection = nil;
}

@end

This completes the custom RTCVideoCapturer camera for WebRTC video.

Others
Set up the ossrs service: https://blog.csdn.net/gloryFlow/article/details/132257196
iOS calling ossrs for audio and video calls: https://blog.csdn.net/gloryFlow/article/details/132262724
WebRTC audio and video calls not displaying high-resolution video: https://blog.csdn.net/gloryFlow/article/details/132240952
Modifying the bitrate in SDP: https://blog.csdn.net/gloryFlow/article/details/132263021
GPUImage beauty filters for video calls: https://blog.csdn.net/gloryFlow/article/details/132265842
RTC live streaming of local or album video: https://blog.csdn.net/gloryFlow/article/details/132267068

4. Summary

WebRTC audio and video call - WebRTC video custom RTCVideoCapturer camera. The custom capturer obtains the CVPixelBufferRef captured by the camera, wraps the processed CVPixelBufferRef in an RTCVideoFrame, and passes the frame to WebRTC's localVideoSource through the capturer:didCaptureVideoFrame: delegate method. There is a lot of material here, and the description may contain inaccuracies; corrections are welcome.
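
The core data path described above can be sketched as follows: in the AVCaptureVideoDataOutput sample-buffer callback, wrap the camera's CVPixelBufferRef in an RTCCVPixelBuffer, build an RTCVideoFrame, and hand it to the capturer's delegate (the localVideoSource). This is a sketch based on the standard WebRTC Objective-C API; the fixed RTCVideoRotation_90 rotation is an assumption for a portrait back camera:

```objectivec
// In the AVCaptureVideoDataOutput sample-buffer callback of the custom capturer
// (a subclass of RTCVideoCapturer):
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    // Wrap the pixel buffer so WebRTC can consume it.
    RTCCVPixelBuffer *rtcPixelBuffer =
        [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];

    // WebRTC expects the frame timestamp in nanoseconds.
    int64_t timeStampNs =
        CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC;

    RTCVideoFrame *frame =
        [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                     rotation:RTCVideoRotation_90
                                  timeStampNs:timeStampNs];

    // The delegate is the RTCVideoSource (localVideoSource), which implements
    // the RTCVideoCapturerDelegate method capturer:didCaptureVideoFrame:.
    [self.delegate capturer:self didCaptureVideoFrame:frame];
}
```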

https://blog.csdn.net/gloryFlow/article/details/132308673

Learning records, keep improving every day.
