iPadOS External Cameras

The camera must connect over a USB-C interface, and you need to confirm that it supports the USB Video Class (UVC), which defines a standard video input stream for USB devices. It seems that
HDMI sources can also be used.

Both webcams and other external cameras can be used.

AVFoundation

AVFoundation handles the creation, editing, playback, and capture of audio and video media. It is the foundational framework for audio and video functionality.
Its main capabilities:
1. Plays audio and video in a wide range of formats
2. Records audio and video through the microphone and camera
3. Edits video content, adds effects, and so on
4. Reads and writes audio/video file metadata, such as title, author, and description
5. Mixes multiple audio and video files into a single file

AVCapture-prefixed classes

(Figure: diagram of the AVCapture-prefixed capture classes.)

Two camera selection properties

userPreferredCamera (read/write)
The camera the user has chosen to use. It should be set whenever the user selects a camera in the app.

systemPreferredCamera (read-only)
The system decides which camera is most suitable; generally this ends up being the front camera. It consults userPreferredCamera first, so setting that property is also how you steer the recommendation toward, say, the rear camera.
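A tiny sketch of how the two properties interact (chosenCamera is a hypothetical stand-in for whatever camera the user picked in the UI):

// chosenCamera: whichever AVCaptureDevice the user just picked (here a stand-in value).
AVCaptureDevice *chosenCamera = AVCaptureDevice.systemPreferredCamera;
AVCaptureDevice.userPreferredCamera = chosenCamera;            // remember the user's choice (read/write)

// Later, ask for the current recommendation; it takes userPreferredCamera into account.
AVCaptureDevice *recommended = AVCaptureDevice.systemPreferredCamera;  // read-only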

The process of using an external camera

The app represents the camera and microphone as AVCaptureDevice objects, wraps each one in an AVCaptureDeviceInput, and adds those inputs to an AVCaptureSession.

AVCaptureSession

AVCaptureSession is the central control object of the AVCapture graph.
It receives data from the capture devices (such as the camera and microphone) and routes it to the outputs: to the screen for preview, sometimes into a file, or even into a still picture.
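A minimal sketch of that wiring, assuming it runs on the app's session queue (the method name is hypothetical and AVFoundation is assumed to be imported):

// Hypothetical setup method on a capture controller.
- (AVCaptureSession *)makeSessionSketch {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];

    // Represent the camera as an AVCaptureDevice and wrap it in a device input.
    AVCaptureDevice *camera = AVCaptureDevice.systemPreferredCamera;
    NSError *error = nil;
    AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (cameraInput != nil && [session canAddInput:cameraInput]) {
        [session addInput:cameraInput];
    }

    [session commitConfiguration];
    [session startRunning]; // startRunning blocks, so call this off the main queue
    return session;
}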

AVCaptureOutputs

The outputs include MovieFile, Photo, VideoData, AudioData, Metadata, and DepthData (a sketch of attaching outputs follows this list).
AVCaptureMovieFileOutput: records QuickTime movies.
AVCapturePhotoOutput: captures high-quality stills and Live Photos.
Data outputs: deliver video or audio buffers from the camera or microphone to your app.
Metadata and DepthData outputs deliver metadata objects and depth data; the live camera preview uses a special kind of output, the AVCaptureVideoPreviewLayer covered below.
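Attaching outputs is symmetrical with inputs. A short sketch (which outputs you add depends on the app; session is the capture session from the sketch above):

// Still photos and Live Photos.
AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
if ([session canAddOutput:photoOutput]) {
    [session addOutput:photoOutput];
}

// QuickTime movie recording.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}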

AVCaptureVideoPreviewLayer

A subclass of CALayer that displays the live camera preview.
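A short sketch of hooking the preview layer up to a view (assuming session from the sketch above, inside a plain UIViewController):

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the layer, cropping if needed
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];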

AVCaptureVideoOrientation

The built-in camera uses AVCaptureVideoOrientation to select an orientation, because the built-in camera rotates together with the iPad.
An external camera obviously cannot be rotated by turning the iPad,
so this approach is deprecated in iOS 17. The usual pattern was to obtain the UIDeviceOrientation and convert it into an AVCaptureVideoOrientation:

UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;
self.previewView.videoPreviewLayer.connection.videoOrientation = (AVCaptureVideoOrientation)deviceOrientation;
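Note that the plain cast above happens to work for portrait but swaps the two landscape cases, because the landscape raw values differ between the two enums. A safer pre-iOS 17 sketch uses an explicit mapping:

UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;
AVCaptureVideoOrientation videoOrientation;
switch (deviceOrientation) {
    case UIDeviceOrientationPortrait:           videoOrientation = AVCaptureVideoOrientationPortrait; break;
    case UIDeviceOrientationPortraitUpsideDown: videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown; break;
    // The landscape cases are mirrored between the two enums.
    case UIDeviceOrientationLandscapeLeft:      videoOrientation = AVCaptureVideoOrientationLandscapeRight; break;
    case UIDeviceOrientationLandscapeRight:     videoOrientation = AVCaptureVideoOrientationLandscapeLeft; break;
    default:                                    videoOrientation = AVCaptureVideoOrientationPortrait; break;
}
self.previewView.videoPreviewLayer.connection.videoOrientation = videoOrientation;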

So in iOS 17, to keep an external camera's video matched to the iPad's orientation,
this approach is replaced with a new API:

AVCaptureDeviceRotationCoordinator

Initializing this class requires an AVCaptureDevice (used to drive the videoRotationAngleForHorizonLevelCapture property) and an optional CALayer that displays the preview of the camera's video (used to drive videoRotationAngleForHorizonLevelPreview). It has two read-only properties:
videoRotationAngleForHorizonLevelPreview: the rotation angle for a horizon-level preview.
videoRotationAngleForHorizonLevelCapture: a separate angle for horizon-level capture.
"Horizon-level preview" means that no matter whether the device is portrait, landscape, or upside down, the camera's video frames stay vertical relative to gravity; the
AVCaptureDeviceRotationCoordinator instance is responsible for keeping videoRotationAngleForHorizonLevelPreview up to date.

initWithDevice:previewLayer:

Creates a coordinator that provides separate compensation angles for the content of the capture device and for the camera preview used by the app.

self.videoDeviceRotationCoordinator = [[AVCaptureDeviceRotationCoordinator alloc] initWithDevice:self.videoDeviceInput.device previewLayer:self.previewView.videoPreviewLayer];
self.previewView.videoPreviewLayer.connection.videoRotationAngle = self.videoDeviceRotationCoordinator.videoRotationAngleForHorizonLevelPreview;

videoRotationAngleForHorizonLevelPreview

Mainly used to adjust the preview.
Use videoRotationAngleForHorizonLevelPreview to rotate the video frames displayed in the CALayer that was passed to the coordinator's initializer (I don't fully understand this sentence yet). It
describes how many degrees the preview needs to be rotated, relative to the coordinate system of UIKit or SwiftUI.
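In practice the angle is applied once and then kept current by key-value observing the coordinator, roughly like this (a sketch reusing the coordinator and preview view from the snippet above):

// During setup, right after creating the coordinator:
[self.videoDeviceRotationCoordinator addObserver:self
                                      forKeyPath:@"videoRotationAngleForHorizonLevelPreview"
                                         options:NSKeyValueObservingOptionNew
                                         context:nil];

// KVO callback: keep the preview connection's rotation angle in sync.
// (A real app would use a KVO context and forward unrelated key paths to super.)
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"videoRotationAngleForHorizonLevelPreview"]) {
        CGFloat angle = [change[NSKeyValueChangeNewKey] doubleValue];
        AVCaptureConnection *previewConnection = self.previewView.videoPreviewLayer.connection;
        if ([previewConnection isVideoRotationAngleSupported:angle]) {
            previewConnection.videoRotationAngle = angle;
        }
    }
}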

videoRotationAngleForHorizonLevelCapture

Mainly used to adjust photos and movies.
This property describes the physical orientation of the camera.
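A sketch of applying the capture angle right before taking a photo; self.photoOutput is an assumed AVCapturePhotoOutput already attached to the session, and self is assumed to adopt AVCapturePhotoCaptureDelegate:

CGFloat captureAngle = self.videoDeviceRotationCoordinator.videoRotationAngleForHorizonLevelCapture;
AVCaptureConnection *photoConnection = [self.photoOutput connectionWithMediaType:AVMediaTypeVideo];
if ([photoConnection isVideoRotationAngleSupported:captureAngle]) {
    photoConnection.videoRotationAngle = captureAngle;
}
[self.photoOutput capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];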

Each external camera can be regarded as an AVCaptureDevice instance with
three important attributes: media type, device type, and position (a discovery sketch follows the list):
AVMediaType.video
AVCaptureDevice.DeviceType.external
AVCaptureDevice.Position.unspecified
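A sketch of discovering such a device with exactly these three attributes (iOS 17 Objective-C constants):

AVCaptureDeviceDiscoverySession *externalCameraSession =
    [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeExternal]
                                                           mediaType:AVMediaTypeVideo
                                                            position:AVCaptureDevicePositionUnspecified];
AVCaptureDevice *externalCamera = externalCameraSession.devices.firstObject; // nil if nothing is plugged in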

An external camera requires more attention than the built-in camera because the user can connect and disconnect it at any time.

If it is reconnected, the camera is represented by a new AVCaptureDevice instance.
The connection state can be detected through key-value observing of
AVCaptureDevice.isConnected and AVCaptureDevice.DiscoverySession.devices.

When a device's connection status changes, AVCaptureDevice also posts notifications:
AVCaptureDeviceWasConnected
AVCaptureDeviceWasDisconnected
These are delivered on background queues, so
synchronize their handling with the AVCaptureSession and the UI, as sketched below.
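A sketch of subscribing to those notifications; the handler names are hypothetical, and the real work (session reconfiguration, UI updates) is only indicated in comments:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceWasConnected:)
                                             name:AVCaptureDeviceWasConnectedNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceWasDisconnected:)
                                             name:AVCaptureDeviceWasDisconnectedNotification
                                           object:nil];

// notification.object is the AVCaptureDevice whose connection state changed.
- (void)deviceWasConnected:(NSNotification *)notification {
    AVCaptureDevice *device = notification.object;
    NSLog(@"Connected: %@", device.localizedName);
    // Reconfigure the AVCaptureSession on the session queue, then refresh the UI on the main queue.
}

- (void)deviceWasDisconnected:(NSNotification *)notification {
    AVCaptureDevice *device = notification.object;
    NSLog(@"Disconnected: %@", device.localizedName);
    // Remove the matching input and fall back to another camera if needed.
}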

isVideoRotationAngleSupported

It checks whether the connection supports a given rotation angle. Not all connections can rotate; only connections that carry video or depth media data can.

videoRotationAngle

Setting an appropriate value for this property rotates the preview accordingly.

videoRotationAngleForHorizonLevelPreview

Used to display a horizon-level camera preview.

Avoid requesting rotation by setting the angle of the AVCaptureConnection for video data output. Changing the angle of the connection will cause frame delivery to break because the capture rendering pipeline needs to be reconfigured to apply the new rotation angle. Instead, you should rotate the CALayer that displays the camera preview.
If your application uses video data output with an AVAssetWriter for recording a custom movie, avoid using an AVCaptureConnection to rotate the video. Instead, use the transform property of the AVAssetWriterInput instance to set the rotation, which changes the output file's metadata. Using this approach, video applications apply rotation during playback, which is more energy-efficient than using a capture connection to rotate each frame. Your application needs to convert the rotation angle from degrees to radians because the asset writer input uses a CGAffineTransform that applies rotation in radians.
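A sketch of that conversion; self.assetWriterVideoInput is an assumed AVAssetWriterInput for the video track, configured before recording starts:

// Degrees (from the rotation coordinator) -> radians (what CGAffineTransform expects).
CGFloat degrees = self.videoDeviceRotationCoordinator.videoRotationAngleForHorizonLevelCapture;
CGFloat radians = degrees * (M_PI / 180.0);
self.assetWriterVideoInput.transform = CGAffineTransformMakeRotation(radians);
// The rotation is recorded as file metadata and applied at playback time,
// instead of rotating every frame in the capture pipeline.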

Terminology

built-in camera: the iPad's internal camera, as opposed to an external camera.

How to upgrade to iOS 17

Go to the Apple Beta Software Program website, sign in, and enroll for iOS 17; then on the iPad select the iOS 17 public beta under Settings -> General -> Software Update -> Beta Updates.

apple demo

AVCam
Requires iOS 17 and cannot run in the Simulator, because the Simulator has no access to a camera.

Any interaction with the AVCaptureSession is dispatched as an event onto a dedicated serial dispatch queue (sessionQueue, which is not the main queue),
so it does not block other work.


Using the microphone or camera requires the user to grant permission. AVFoundation enumerates the permission states with AVAuthorizationStatus, which identifies whether a device's access is authorized, restricted, denied, or not yet determined.
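A sketch of the usual permission check before configuring the session (the app also needs an NSCameraUsageDescription entry in its Info.plist):

switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusAuthorized:
        // Already allowed; go ahead and configure the session.
        break;
    case AVAuthorizationStatusNotDetermined:
        // First launch: ask the user. The completion handler runs on an arbitrary queue.
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            // Resume session setup if granted.
        }];
        break;
    case AVAuthorizationStatusDenied:
    case AVAuthorizationStatusRestricted:
        // No access; show UI pointing the user to Settings.
        break;
}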

A discovery session is used to list valid devices of the requested types; for example, a broken camera does not count as a valid device.

Each AVCaptureDevice here represents one external camera.

external camera

There are three important attributes:
media type, device type, and position (an external camera's position is unspecified).
Use these three attributes to discover the external camera through the API.

The position is used to determine whether it is an external camera.

AVCaptureDevicePosition currentPosition = currentVideoDevice.position;

typedef NS_ENUM(NSInteger, AVCaptureDevicePosition) {
    
    
    AVCaptureDevicePositionUnspecified = 0,
    AVCaptureDevicePositionBack        = 1,
    AVCaptureDevicePositionFront       = 2,
} API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);

QuickTime movies

Movie files created with QuickTime, a multimedia technology and container format developed by Apple. QuickTime is widely used for audio, video, and animation, and a QuickTime movie is a common file format for storing multimedia content with elements such as audio, video, and text.

MetaData (metadata):

Metadata is data that describes data and provides information about the data such as creation date, author, file size, resolution, etc. In the field of photos, metadata can include the date the photo was taken, camera model, exposure time, focal length and other information. Metadata can help you better organize, manage, and understand your photo collection.

Depth:

In the AVCapture context, depth data refers to per-pixel distance information captured along with the image (a depth map), which enables effects such as portrait-style background blur. This is different from an image's color depth, which describes how many colors or gray levels each pixel can represent; common color depths are 8-bit (256 colors), 16-bit (65,536 colors), and 24-bit (about 16 million colors).

AVCaptureDeviceTypeBuiltInWideAngleCamera:

Indicates a built-in wide-angle camera device, usually a standard front- or rear-facing camera used for general photo and video capture.

AVCaptureDeviceTypeBuiltInDualCamera:

Indicates a built-in dual camera device, usually including a wide-angle camera and a telephoto camera, which can be used to achieve functions such as optical zoom.

AVCaptureDeviceTypeBuiltInTrueDepthCamera:

Indicates the built-in TrueDepth camera device, which is usually used on devices that support Face ID and can be used for facial recognition, face tracking and other functions.

AVCaptureVideoPreviewLayer

Mirrors the external camera by default.

question

To build and run on a real device, you need to switch to the Xcode 15 beta,
downloaded from the official website.

Code logic

(Figure: code logic diagram.)

FaceTime

Always uses the front camera or the external camera,
but when the external camera is connected, the button for switching cameras is hidden.

process

configureSession // entry point

AVCaptureDevice *videoDevice = AVCaptureDevice.systemPreferredCamera; // get the systemPreferredCamera
// check whether the initial userPreferredCamera has already been set
    NSUserDefaults *userDefaults = NSUserDefaults.standardUserDefaults;
    [userDefaults boolForKey:@"setInitialUserPreferredCamera"]


// if neither one is found, fall back to the back camera
AVCaptureDeviceDiscoverySession *backVideoDeviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInDualCamera, AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
videoDevice = backVideoDeviceDiscoverySession.devices.firstObject;
        
AVCaptureDevice.userPreferredCamera = videoDevice;
        
[userDefaults setBool:YES forKey:@"setInitialUserPreferredCamera"];

// set up the device input
    AVCaptureDeviceInput* videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

// add an observer for systemPreferredCamera
    [AVCaptureDevice addObserver:self forKeyPath:@"systemPreferredCamera" options:NSKeyValueObservingOptionNew context:SystemPreferredCameraContext];
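The matching KVO callback might look roughly like this (a sketch; it would be another branch of the observeValueForKeyPath: handler shown earlier):

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == SystemPreferredCameraContext) {
        AVCaptureDevice *newCamera = change[NSKeyValueChangeNewKey];
        if ([newCamera isKindOfClass:[AVCaptureDevice class]] &&
            ![newCamera isEqual:self.videoDeviceInput.device]) {
            // Switch the session over to the recommended camera (on the session queue).
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}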


createDeviceRotationCoordinator // not yet sure what this does

function

discoverySessionWithDeviceTypes

+ (instancetype)discoverySessionWithDeviceTypes:(NSArray<AVCaptureDeviceType> *)deviceTypes
                                      mediaType:(AVMediaType)mediaType
                                       position:(AVCaptureDevicePosition)position;

Used to discover microphone and camera devices.
deviceTypes specifies the device types to search for, such as AVCaptureDeviceTypeBuiltInWideAngleCamera or AVCaptureDeviceTypeBuiltInMicrophone.

mediaType specifies the media type to search for, for example AVMediaTypeVideo or AVMediaTypeAudio.
position is a constant of type AVCaptureDevicePosition specifying the camera position to search for: front (AVCaptureDevicePositionFront), back (AVCaptureDevicePositionBack), or unspecified (AVCaptureDevicePositionUnspecified).

format

NSArray<AVCaptureDeviceFormat *> *formats
represents the capture formats the camera supports. Different formats have different resolutions, frame rates, and other capture parameters.
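A sketch of listing the formats and selecting one; changing activeFormat requires locking the device for configuration:

AVCaptureDevice *device = self.videoDeviceInput.device;
for (AVCaptureDeviceFormat *format in device.formats) {
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    NSLog(@"%d x %d, up to %.0f fps", dims.width, dims.height,
          format.videoSupportedFrameRateRanges.firstObject.maxFrameRate);
}

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    device.activeFormat = device.formats.firstObject; // pick a suitable format here, not just the first one
    [device unlockForConfiguration];
}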

code scanner

Uses the back camera at startup and does not allow the user to select the camera.

changeCamera module

AVCam uses the systemPreferredCamera by default. If none is specified, it checks the current camera: if the current camera is the external camera or the front camera, it switches to the back camera; if it is the back camera, it tries to switch to the external camera first and otherwise to the front camera. So if an external camera is connected, it seems I can never switch to the front camera?

Is the back camera selected by default?

Orientation

There are several kinds of orientation.

window Orientation

self.view.window.windowScene.interfaceOrientation;
This is the orientation of the window in which the user interface is displayed, i.e. the interface the user is operating. The system updates this value automatically.
UIInterfaceOrientation includes:

UIInterfaceOrientationPortrait           = UIDeviceOrientationPortrait,
UIInterfaceOrientationPortraitUpsideDown = UIDeviceOrientationPortraitUpsideDown,
UIInterfaceOrientationLandscapeLeft      = UIDeviceOrientationLandscapeRight,
UIInterfaceOrientationLandscapeRight     = UIDeviceOrientationLandscapeLeft

For example, videoOrientation is generally initialized to AVCaptureVideoOrientationPortrait first (start vertical),
then check whether the window orientation is unknown.
If it is not unknown, set the current orientation to the window's orientation,
and also set previewView.videoPreviewLayer.connection.videoOrientation to that orientation, as sketched below.
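A sketch of that initialization (pre-iOS 17 pattern; unlike UIDeviceOrientation, the UIInterfaceOrientation raw values happen to line up with AVCaptureVideoOrientation, so a cast is safe here):

AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
UIInterfaceOrientation windowOrientation = self.view.window.windowScene.interfaceOrientation;
if (windowOrientation != UIInterfaceOrientationUnknown) {
    initialVideoOrientation = (AVCaptureVideoOrientation)windowOrientation;
}
self.previewView.videoPreviewLayer.connection.videoOrientation = initialVideoOrientation;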

device orientation

UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;

It is always updated automatically by the system. The old approach applied it to the preview directly:
previewView.videoPreviewLayer.connection.videoOrientation = (AVCaptureVideoOrientation)[UIDevice currentDevice].orientation; // see the landscape-mapping caveat above

rotation problem

The original approach was to set previewView.videoPreviewLayer.connection.videoOrientation = [UIDevice currentDevice].orientation;
so that the preview matched the rotation of the device.
This used to work because when the device was inverted, the image captured by the built-in camera really was inverted, so flipping the preview at that point made it look normal.
However, with an external camera you will find that when the device is inverted, the camera may not be inverted, so the captured frames are not inverted; if the preview is flipped anyway, the displayed image ends up inverted, as if no rotation compensation had been applied at all.
So the current problem is that we can no longer just monitor the rotation of the device; we should instead observe whether the camera has rotated. Because the preview is automatically brought back to vertical, as long as the camera is upright the displayed image will be correct.

question

Q: Assume no extra code logic is written and the built-in camera is used. Two hypothetical situations:
1. The camera is upright and the device is upside down. Will an inverted image be displayed on the device?
2. The camera is upside down and the device is upright. Will an inverted image appear?
Now turn the device upside down, so the device and the camera are upside down at the same time. The image captured by the camera is upside down and the device is upside down, so shouldn't an upright image be displayed?

A: Case 1 should not display an inverted image but an upright one, because the relevant view is rotated automatically. So when both are inverted at the same time, the image provided by the camera is inverted, but because the device is inverted the relevant view is rotated back upright, and what is displayed at that point is therefore the inverted image.

When I use the iPad's built-in camera to shoot and rotate the iPad 90 degrees clockwise, the underlying behavior is: first, Auto Layout re-lays out the preview and rotates it 90 degrees counterclockwise to keep the preview upright relative to the horizon; the system used to do this automatically. Because the built-in camera also rotates 90 degrees clockwise with the device, the picture seen on the iPad at this point is actually an image rotated 90 degrees clockwise, and I need to rotate the preview 90 degrees clockwise in code to correct the image.

But I still don't quite understand how videoRotationAngleForHorizonLevelPreview is used in AVCam to adjust the preview's angle when that angle changes, and how this differs from adjusting for the iPad's angle.


Origin blog.csdn.net/qq_43535469/article/details/132313404