[Metaverse] Hardware, the primary entry point to the Metaverse

For many, the most exciting aspect of the Metaverse is the new devices we can use to access, render, and manipulate it. The devices that come to mind most readily are ultra-powerful yet lightweight AR glasses and immersive VR headsets. These devices aren't strictly necessary for the Metaverse, but they're often considered the best or most natural way to experience its many virtual worlds. Executives at major tech companies seem to agree, even though presumed consumer demand for the devices has yet to translate into actual sales.

In addition to the tech giants, many mid-sized social technology companies are investing in proprietary AR/VR hardware, even though they have rarely produced consumer electronics before, let alone distributed and serviced them. Even in the face of continued rejection from consumers and developers, the scale of investment in these devices has not dwindled. That persistence stems from a belief that history is about to repeat itself: every time there is a massive shift in computing and networking, new devices emerge that are better adapted to the performance gains that come with the transition. In turn, the company that builds these devices first has a better chance of tipping the balance of technological power, rather than merely jumping on the bandwagon and adding a new line of business. Early signs support the belief that AR and VR are the next big device technologies.

The signs that mixed-reality devices will lead the way are tempered, however, by the many technical shortcomings identified in VR and AR headsets that could hinder mass adoption. Viewed in this light, some argue that current devices are to the Metaverse what tablets were to the smartphone era.

The greatest technological challenge of our time

We are, in effect, trying to fit a supercomputer into an eyeglass frame. As we saw when we discussed computing, gaming devices don't just "display" previously created frames the way a TV does; they have to render those frames themselves. And just as latency runs up against the laws of physics, breakthroughs in AR and VR headsets face real physical limits. Increasing the number of pixels rendered per frame and the number of frames rendered per second requires more processing power, and that capability has to be built into a device worn comfortably on the head, not into a laptop or a palm-held device in the living room. Crucially, we also need AR and VR processors to do more than just render more pixels.
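To make the rendering burden concrete, here is a rough back-of-the-envelope comparison as a minimal Python sketch. The resolutions, refresh rate, and supersampling factor are illustrative assumptions, not the specifications of any particular headset or TV.

```python
# Rough, illustrative comparison of rendering load (pixels per second).
# All figures below are assumptions for the sake of arithmetic, not the
# specs of any specific headset or TV.

def pixels_per_second(width, height, fps, views=1, supersample=1.0):
    """Pixels that must be produced each second for one display stream."""
    return width * height * fps * views * supersample

# A 4K TV only has to *display* frames that were rendered elsewhere.
tv = pixels_per_second(3840, 2160, 60)

# A hypothetical VR headset must *render* two views (one per eye),
# at a high refresh rate, often with supersampling to reduce aliasing.
headset = pixels_per_second(2000, 2000, 90, views=2, supersample=1.4)

print(f"TV display load:     {tv / 1e9:.2f} billion pixels/s")
print(f"Headset render load: {headset / 1e9:.2f} billion pixels/s")
print(f"Ratio:               {headset / tv:.1f}x")
```

Even this simple ratio understates the gap, because the headset has to render every one of those pixels itself, under a tight latency budget, on a battery-powered chip strapped to the user's head.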

We also need AR and VR devices to perform tasks we don't normally ask of consoles or PCs. For example, a headset includes a pair of external cameras that can warn users before they bump into an object or a wall. At the same time, those cameras must be able to track the user's hands, either to reproduce them inside a virtual world or to use them as a controller, with specific actions or gestures taking the place of button presses. While this approach won't completely replace physical controllers, it spares the person wearing a VR or AR headset from having to hold them at all times. Cameras mounted inside the headset can also scan and track the user's face and eyes, allowing the device to drive the user's avatar from facial expressions and eye movements alone. However, these additional cameras add weight and bulk to the headset, demand more computing power and more battery power, and, of course, increase costs.
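As a rough illustration of the hand-tracking point above, the sketch below registers a controller-free "pinch" when the tracked thumb tip and index fingertip come close together. The landmark format, joint names, and distance threshold are hypothetical assumptions for illustration, not any vendor's actual hand-tracking API.

```python
import math

# A "pinch" stands in for a controller button press: it triggers when the
# tracked thumb tip and index fingertip are close enough together.
PINCH_THRESHOLD_M = 0.02  # 2 cm, an assumed trigger distance

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in meters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(hand_landmarks):
    """Return True if the thumb tip and index fingertip are pinched together.

    `hand_landmarks` is assumed to be a dict mapping joint names to
    (x, y, z) positions produced by the headset's hand-tracking cameras.
    """
    return distance(hand_landmarks["thumb_tip"],
                    hand_landmarks["index_tip"]) < PINCH_THRESHOLD_M

# Example frame of tracking data (made up for illustration).
frame = {"thumb_tip": (0.10, 1.20, 0.30), "index_tip": (0.11, 1.21, 0.30)}
print(is_pinching(frame))  # True: fingertips are about 1.4 cm apart
```

A real pipeline would also smooth the landmark positions and add hysteresis so the gesture doesn't flicker on noisy tracking data, which is part of why this workload differs from what consoles and PCs are asked to do.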

Industrial AR headsets can be somewhat bigger, but the people who wear them often must also wear a helmet and still need to avoid neck strain. Within those constraints, AR performance must improve several times over.

Given the enormous technical challenges facing this kind of "supercomputer in a pair of glasses", it is not hard to see why technology companies spend hundreds of millions of dollars every year trying to solve the problem. Yet investment on that scale does not mean breakthroughs will arrive suddenly. Rather, progress will be a process of continuous improvement, reducing the price and size of AR and VR devices while increasing their computing power and functionality. Even when a particular hardware platform or component supplier breaks through a key barrier, it usually takes the rest of the market two to three years to follow. A key factor that sets a platform apart is the unique experience it brings to consumers.

We can already see shadows of the "smartphone wars" in the VR and AR race. As devices, AR and VR headsets appear to be an even greater challenge than smartphones, and adapting a 2D touch interface to an intangible 3D space makes interface design harder still. What will "pinch to zoom" or "slide to unlock" look like in AR and VR? What capabilities will users have, and when will they be useful?
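One speculative answer to the "pinch to zoom" question, sketched under purely illustrative assumptions: with two tracked hands, the change in the gap between them while a pinch is held could drive the scale of a grabbed virtual object, much as two fingers drive zoom on a touchscreen.

```python
# Speculative sketch of "pinch to zoom" in 3D: scale a grabbed virtual
# object by the ratio of the current gap between the user's two pinching
# hands to the gap when the gesture began. Everything here (the gesture,
# the parameters) is an illustrative assumption, not an existing API.

def zoom_factor(start_hand_gap_m: float, current_hand_gap_m: float,
                min_scale: float = 0.25, max_scale: float = 4.0) -> float:
    """Scale factor for a grabbed object while a two-handed pinch is held."""
    if start_hand_gap_m <= 0:
        return 1.0
    factor = current_hand_gap_m / start_hand_gap_m
    return max(min_scale, min(max_scale, factor))

# Hands started 0.2 m apart and are now 0.5 m apart -> zoom in 2.5x.
print(zoom_factor(0.20, 0.50))  # 2.5
```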

Origin: blog.csdn.net/daidai2022/article/details/132421834