Director of Google AR Optical Engineering: Vision Pro simplifies the definition of AR/VR/MR, making it easier for consumers to choose

After the debut of Apple Vision Pro, industry attention to XR and spatial computing has increased, and the moves of other companies, such as Samsung, Google, Meta, and Microsoft, have also drawn notice. As an early promoter of AR glasses, Google has explored AR and VR technologies for many years. However, according to recent reports, the company has stopped its Iris AR glasses project and shifted its focus to XR software, such as developing an XR headset operating system together with Samsung.

Recently, Bernard Kress, chair of the SPIE AR|VR|MR conference and Director of AR Optical Engineering at Google, confirmed that the Google AR glasses project has been discontinued. He also shared his views on Apple Vision Pro and VST (video see-through).

Spatial Computing Simplifies XR Marketing

According to Kress, one of the highlights of Apple Vision Pro is simplicity, reflected not only in the design but also in how the product is defined and named. For example, using the concept of spatial computing to cover AR, VR, MR, and other sub-technologies simplifies the product definition and makes it easier for consumers to choose. He also thinks Apple Vision Pro is a great name, one that is easier to remember over the long term than abbreviations such as VR and XR.

Before Apple released Vision Pro, definitions of XR hardware in the immersive-reality ecosystem were ambiguous, with many overlapping subdivisions: smart glasses, AR headsets, MR headsets, VST headsets, VR headsets, and so on. For consumers, these classifications are hard to tell apart.

AR/VR hardware technology classification

Vision Pro changes this situation by dividing XR into two categories, spatial computing headsets and smart glasses, making the product terms easier for consumers to understand (without having to consider the technical difference between VST and OST behind them). It is worth noting that Apple appears to have postponed its AR glasses plans, Meta's AR glasses no longer seem to be targeting a 2024 launch, and the Google AR glasses project has stopped. Meanwhile, the industry is focusing on VST solutions, such as Apple Vision Pro, Meta Quest Pro, and the headset being developed jointly by Qualcomm, Google, and Samsung.

At the same time, this groups other hardware forms such as MR headsets into a single category: B2B solutions and R&D prototypes. In other words, consumers no longer need to consider the technical details of XR hardware; those details matter more to, and are more easily understood by, corporate, industrial, and government customers.

In fact, the concept of spatial computing was first proposed by MIT Media Lab researcher Simon Greenwold in 2003, who defined it as human interaction with a machine in which the machine retains and manipulates information about real objects and spaces. The benefit is that machines can provide more comprehensive and more useful assistance for everyday work and entertainment. Immersive, realistic spatial audio is also an important part of spatial computing, and Vision Pro can customize the spatial audio experience for each user through 3D room scanning and ear scanning.
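For a sense of what per-user spatial audio involves, the sketch below renders a mono source binaurally by convolving it with a pair of head-related impulse responses (HRIRs); in a personalized pipeline the HRIRs would be chosen or synthesized from a head and ear scan. The function name, the placeholder HRIRs, and the scan-to-HRIR step are illustrative assumptions, not a description of Apple's actual audio pipeline.

```python
import numpy as np

def render_binaural(mono: np.ndarray,
                    hrir_left: np.ndarray,
                    hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono signal with per-ear impulse responses.

    In a personalized setup, hrir_left / hrir_right would be picked or
    synthesized from a head and ear scan; here they are plain inputs.
    """
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])  # shape: (2, n_samples)

# Toy usage: a 1 kHz tone with two made-up, very short "HRIRs".
sr = 48_000
t = np.arange(sr) / sr
tone = 0.1 * np.sin(2 * np.pi * 1000 * t)
stereo = render_binaural(tone,
                         np.array([1.0, 0.3, 0.1]),
                         np.array([0.6, 0.2, 0.05]))
```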

VST is not a new concept either. In 2015, Intel announced a VST headset prototype, Project Alloy, which was cancelled in 2017. Kress pointed out that early VST technology faced challenges on many fronts, including the viewpoint offset between the human eye and the see-through cameras, motion-to-photon latency (ideally well below 10 milliseconds), wearing comfort, visual comfort, and social comfort (whether the appearance fits mainstream aesthetics).
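To see why "well below 10 milliseconds" is a demanding target for VST, here is a back-of-the-envelope motion-to-photon budget. The stage names and millisecond figures are assumptions chosen for illustration, not measurements of any shipping headset.

```python
# Illustrative motion-to-photon budget for a video-see-through pipeline.
# Every figure below is an assumption made up for this example.
budget_ms = {
    "camera exposure + readout":  4.0,
    "ISP / image processing":     2.0,
    "reprojection + compositing": 3.0,
    "display scan-out":           2.0,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<28}{ms:>5.1f} ms")
print(f"{'total':<28}{total:>5.1f} ms")  # 11.0 ms: already over a <10 ms target
```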

Opinions on Vision Pro

Based on testers' estimates, Vision Pro weighs roughly 500 to 600 g, while Quest Pro weighs 720 g. The two headsets distribute that weight differently. Quest Pro uses front-rear counterweighting, moving components such as the battery to the back of the head. Vision Pro uses a split design with an external battery; its headband is mostly fabric with almost no electronics, so the headset's weight is concentrated on the nose bridge and face. Kress pointed out that HoloLens 2 weighs 570 g, similar to Vision Pro, but it uses a front-rear counterweight design that places the device's center of gravity roughly at the center of gravity of the wearer's head. By contrast, Vision Pro's weight is concentrated on the face, and wearing it for long periods may cause facial discomfort; how it holds up in extended use remains to be seen.
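Weight placement matters as much as total weight because of the moment the neck has to resist. The sketch below compares a front-loaded headset with a counterweighted one; the masses and the forward offsets of the center of mass are rough assumptions for illustration, not measured values for any product.

```python
# Moment about the neck ~ mass * g * horizontal offset of the center of mass.
# Masses (kg) and forward offsets (m) below are rough, illustrative guesses.
G = 9.81  # m/s^2

def neck_moment(mass_kg: float, forward_offset_m: float) -> float:
    """Moment (N*m) a head-mounted load applies about the neck."""
    return mass_kg * G * forward_offset_m

front_loaded = neck_moment(0.55, 0.09)  # ~550 g concentrated on the face
balanced     = neck_moment(0.57, 0.02)  # ~570 g counterweighted near the head's CoG

print(f"front-loaded: {front_loaded:.2f} N*m")  # ~0.49 N*m
print(f"balanced:     {balanced:.2f} N*m")      # ~0.11 N*m
```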

On the EyeSight external display: although the idea of showing the user's face on the outside of the headset sounds crazy, it is visually attractive. It not only displays the user's facial expression but also evokes the look of HoloLens 2's iridescent visor.

Kress believes the Zeiss optical inserts for Vision Pro may not be particularly expensive, and the price should come in below the expected $600. Because Vision Pro's stereoscopic display has a fixed focal distance, there is no need for multifocal or progressive lenses, and single-vision lenses should not cost $600.

As for the optics, Apple did not specify Vision Pro's exact optical solution. The official description is a custom catadioptric lens, suggesting it may be some kind of pancake lens.

Not long ago, Apple also acquired the AR headset maker Mira. Kress said that although he does not know the specific purpose of the acquisition, he guesses Apple could use the AR headset enclosure as a test prototype to evaluate UX, FOV, and so on.

Does Vision Pro Solve the Difficulties of VST?

Compared with Alloy from years ago, does Apple Vision Pro solve the difficulties of VST/digital passthrough VR?

Reportedly, Vision Pro's VST passthrough latency is 12 ms, and its motion-to-photon latency should also be very low. Kress said he thinks the latency of Lynx R1 may be lower than Vision Pro's, mainly because of camera parallax.

In addition, unlike Lynx R1, Vision Pro's VST passthrough cameras are not directly in front of the eyes but at the bottom of the headset. Apple can use digital remapping to compensate in real time for the difference between the cameras' viewpoint and the natural viewpoint of the human eye.
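As a rough sketch of what such compensation involves, the code below unprojects a camera pixel using its depth, applies an assumed camera-to-eye rigid transform, and reprojects it into the eye's virtual view. The pinhole intrinsics, the 4 cm offset, and the function itself are illustrative assumptions; Apple has not disclosed its actual passthrough pipeline.

```python
import numpy as np

def reproject_pixel(uv, depth_m, K_cam, K_eye, T_cam_to_eye):
    """Map one passthrough-camera pixel to the eye's virtual viewpoint.

    uv            -- (u, v) pixel coordinates in the camera image
    depth_m       -- scene depth at that pixel, in meters
    K_cam, K_eye  -- 3x3 pinhole intrinsics for camera and virtual eye view
    T_cam_to_eye  -- 4x4 rigid transform from camera frame to eye frame
    """
    u, v = uv
    # Unproject to a 3D point in the camera frame.
    p_cam = depth_m * np.linalg.inv(K_cam) @ np.array([u, v, 1.0])
    # Express the point in the eye frame.
    p_eye = (T_cam_to_eye @ np.append(p_cam, 1.0))[:3]
    # Project into the eye's virtual image.
    q = K_eye @ p_eye
    return q[:2] / q[2]

# Toy usage: made-up intrinsics and a camera sitting ~4 cm below the eye.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 480.0],
              [   0.0,    0.0,   1.0]])
T = np.eye(4)
T[1, 3] = 0.04  # assumed vertical offset (image y points down)
print(reproject_pixel((640, 480), 1.5, K, K, T))  # pixel shifts down in the eye view
```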

Reference: SPIE