【Metaverse】Long Live the Smartphone

It's fun to imagine a scenario in which great new equipment could soon put us in the Metaverse. But, at least through the 2020s, it's likely that most of the devices used in the Metaverse will be devices we're already using.

Not only do AR and VR devices face significant technical, financial, and experiential hurdles, they have also met with lukewarm receptions once they reach the market. Behind the rapid growth of the smartphone lies a simple fact: although the personal computer is one of the most important inventions in human history, more than 30 years after its debut less than one-sixth of the world's population owns one. And for those lucky few, their PCs are large and immobile. For most people, AR and VR devices won't be their first computing devices, or even their first portable devices. These headsets are struggling to become people's third or even fourth portable devices, and they are also likely to remain among the least powerful devices you own for a long time to come.

But AR and VR will likely replace most of the devices we use today, though the journey may be long. Even if a total of one billion VR and AR headsets (two distinct device types) were in use by 2030, four times the predicted 250 million, that would still be less than one-sixth of the number of smartphone users. But that's okay: by 2022, hundreds of millions of people were already spending hours every day in real-time rendered virtual worlds on smartphones and tablets, and those devices are improving rapidly.

Many new smartphones are also equipped with a new generation of ultra-wideband (UWB) chips that can emit as many as a billion radar pulses per second, along with receivers that process the returning signals. As a result, a smartphone can build omnidirectional radar maps of its user's home and office, and know exactly where the user is within those maps (or any other map), relative to other users and devices. Unlike GPS, UWB can pinpoint location to within centimeters. Your front door can unlock automatically when you arrive home, but not when you're indoors tidying the shoe rack. With real-time radar maps, you can move through most of your home without taking off your VR headset: the device can warn you where a collision is likely, or render the potential obstacle inside the headset so you can step around it. Remarkably, all of this is possible with standard consumer-grade hardware, and these capabilities will play an ever larger role in our daily lives.
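To make the front-door example concrete, here is a minimal sketch in Python. The map frame, coordinates, and thresholds are all hypothetical, and no real UWB vendor API is used; the point is only that centimeter-level positions make "arriving at the door from outside" and "standing near it indoors" trivially distinguishable.

```python
from dataclasses import dataclass

# All names, coordinates, and thresholds below are hypothetical and for
# illustration only; real UWB stacks expose ranging via vendor-specific APIs.


@dataclass
class Position:
    x: float  # meters, in the home's map frame
    y: float


# In this toy map the front door sits on the line y = 0;
# y < 0 is outside the home, y >= 0 is inside.
DOOR = Position(0.0, 0.0)
UNLOCK_RADIUS_M = 1.5   # how close the user must be to the door
OUTSIDE_MARGIN_M = 0.2  # ignore centimeter-level jitter right at the threshold


def distance(a: Position, b: Position) -> float:
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5


def should_unlock(user: Position) -> bool:
    """Unlock only when the user is near the door AND on the outside of it."""
    near_door = distance(user, DOOR) <= UNLOCK_RADIUS_M
    outside = user.y < -OUTSIDE_MARGIN_M
    return near_door and outside


if __name__ == "__main__":
    arriving_home = Position(0.3, -1.0)   # on the porch, about a meter outside
    tidying_shoes = Position(0.4, 0.8)    # in the entryway, indoors
    print(should_unlock(arriving_home))   # True
    print(should_unlock(tidying_shoes))   # False
```

With positioning accurate only to several meters, the inside/outside call at a doorway would be unreliable; with centimeter-level ranging it reduces to a simple threshold check like the one above.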

Some believe that the future role of the smartphone includes acting as an "edge computer" or "edge server" for its user, supplying connectivity and computing power to the world around us. This model is already practiced in several ways. For example, most smartwatches sold today have no cellular chip of their own and instead connect to the owner's phone over Bluetooth. The approach has limitations: the smartwatch can't make calls when it's too far from the phone it's tethered to, and so on.

Given the scarcity, importance, and cost of computing resources, it makes more sense to concentrate improvements in a single device than to spread investment across many others, especially devices with tighter physical, thermal, and cost constraints. A computer on your wrist or head is no match for the computer in your pocket. The same logic holds in other respects. Personal credentials may be the most important consideration of all. We probably don't want our data collected by, stored on, or sent to an entire network of devices; most of us would rather have those devices send that data to the single device we trust most (and carry with us), and let that most trusted device decide which other devices can access which parts of our online history and information, and with what corresponding rights.
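To make that "most trusted device as gatekeeper" idea concrete, here is a minimal sketch in Python. Every name, scope, and value below is hypothetical, not any real platform's permission API; the peripherals hold no personal data themselves and must ask the trusted device for each scope they need.

```python
from __future__ import annotations

from dataclasses import dataclass, field

# Hypothetical sketch: data and access grants live only on the trusted device
# (e.g. your phone); peripherals request one scope at a time.


@dataclass
class TrustedDevice:
    # Which data scopes each known peripheral is allowed to read.
    grants: dict[str, set[str]] = field(default_factory=dict)
    # The data itself, keyed by scope, stored only here.
    vault: dict[str, str] = field(default_factory=dict)

    def allow(self, device_id: str, scope: str) -> None:
        """Record that a peripheral may read one named scope."""
        self.grants.setdefault(device_id, set()).add(scope)

    def request(self, device_id: str, scope: str) -> str | None:
        """Return the requested data only if this peripheral was granted it."""
        if scope in self.grants.get(device_id, set()):
            return self.vault.get(scope)
        return None


if __name__ == "__main__":
    phone = TrustedDevice(vault={"payment_token": "tok_demo", "history": "..."})
    phone.allow("vr_headset", "payment_token")
    print(phone.request("vr_headset", "payment_token"))  # "tok_demo"
    print(phone.request("smartwatch", "history"))         # None (never granted)
```

The design point being illustrated is simply that grants are stored on, and evaluated by, the one device the user trusts most, rather than being replicated across every gadget in the network.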

The devices we need to realize the Metaverse, or at least the ones we can envision today, fall into three categories.

● First, "primary computing devices": smartphones for most consumers, though possibly AR or immersive VR devices at some point in the future.

● Next, "secondary" or "supporting" computing devices: PCs or PlayStations, for example, and perhaps AR and VR headsets. These devices do not necessarily depend on the primary device, and may even treat it as the supplementary one, but they are used less often than the primary device and for more specific purposes.

● Finally, other devices, such as smartwatches or tracking cameras, which enrich or extend the Metaverse experience but which we rarely operate directly.

Devices in every category and subcategory will increase engagement time and total spending in the Metaverse, and give manufacturers opportunities to build new lines of business. Huge sums have already been invested in developing these devices for a wide range of purposes, although many of them will take years to come to fruition.

The Metaverse is a largely intangible experience: a persistent network of virtual worlds, data, and supporting systems. Physical devices, however, are the gateway through which those experiences are accessed and created. Without them, the forest, so to speak, cannot be heard, smelled, touched, or seen, let alone known to exist. This fact gives device manufacturers and operators considerable power over both hardware and software. Manufacturers and carriers decide which GPUs and CPUs to use, which wireless chipsets and standards to deploy, which sensors to include, and more. Although these intermediate technologies are critical to any given experience, developers and end users rarely interact with them directly. Instead, they are accessed through the operating system, which governs how, when, and why developers may use these capabilities, shapes the experiences they can offer users, and determines whether, and to what extent, a commission must be paid to the device's manufacturer.

In other words, hardware not only determines what the Metaverse can offer and when, it also shapes how the Metaverse operates, and it lets hardware manufacturers capture as large a share of the Metaverse's economic activity as possible. The more important a device is, and the more other devices it connects to, the more control the company that makes it has. To understand what this really means, we need to take a closer look at payments.

The enormous challenge of rendering is only one of the hurdles Metaverse hardware must clear

We need AR and VR processors to do more than just render more pixels.

1. We need AR and VR devices to perform tasks we don't normally ask of consoles or gaming PCs, such as tracking the surrounding environment and the user with onboard cameras.

2. These cameras must be able to track the user's hands so that the hands can be recreated inside a given virtual world, or used as controllers in their own right, driven by specific movements or gestures rather than button presses (a minimal sketch of this kind of gesture input follows this list). While this approach won't completely replace the controller, it spares the wearer of a VR or AR headset from having to carry one.

3. Those extra cameras also add weight and bulk to the headset, demand more computing power, and drain more battery. Naturally, they also raise costs. As always, the magic comes from the software itself, but realizing that magic depends on hardware being widely adopted.
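As a rough illustration of the gesture-as-controller idea in point 2, the sketch below (Python again; the fingertip inputs, thresholds, and frame loop are hypothetical, not any headset SDK) treats a "pinch" as a button press by thresholding the distance between the tracked thumb and index fingertips, with hysteresis so the virtual button doesn't flicker.

```python
from dataclasses import dataclass

# Hypothetical illustration: a real headset's hand-tracking SDK would supply
# fingertip positions every frame; here they are fed in directly.


@dataclass
class Point3D:
    x: float
    y: float
    z: float


def dist(a: Point3D, b: Point3D) -> float:
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5


class PinchDetector:
    """Turns thumb/index fingertip positions into a pressed/released state.

    Two thresholds (hysteresis) keep the state from flickering when the
    measured gap hovers around a single cutoff.
    """

    def __init__(self, press_at_m: float = 0.02, release_at_m: float = 0.04):
        self.press_at_m = press_at_m
        self.release_at_m = release_at_m
        self.pressed = False

    def update(self, thumb_tip: Point3D, index_tip: Point3D) -> bool:
        gap = dist(thumb_tip, index_tip)
        if not self.pressed and gap < self.press_at_m:
            self.pressed = True    # fingertips touched: "button down"
        elif self.pressed and gap > self.release_at_m:
            self.pressed = False   # fingertips separated: "button up"
        return self.pressed


if __name__ == "__main__":
    detector = PinchDetector()
    thumb = Point3D(0.0, 0.0, 0.0)
    # Simulated per-frame gaps (meters): approach, pinch, hold, release.
    for gap in (0.08, 0.03, 0.015, 0.03, 0.05):
        index = Point3D(gap, 0.0, 0.0)
        print(gap, detector.update(thumb, index))
```

Running the demo prints `False, False, True, True, False`: the pinch registers only once the fingertips nearly touch and releases only once they clearly separate, which is the kind of logic that lets tracked hands stand in for a controller button.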

Reprinted from: blog.csdn.net/daidai2022/article/details/132561702