A new audio-visual experience that is “at your fingertips”—tactile feedback ecology, standards and new developments


Editor's note: With the audio and video industry booming, how can the sense of touch enrich audio-visual content and bring users a more immersive experience? LiveVideoStack 2023 Shanghai Station invited Lu Qiming to share the new audio-visual experience that is "at your fingertips". His talk covers the physiological basis of touch, the current state of the haptic ecosystem, how to design good haptics, platform support for haptic application development, and international standards, and closes with the integrated RichTap solution to the two major pain points of the haptic ecosystem.

Text/Lu Qiming

Transcript/LiveVideoStack

The theme of my talk today is a new audio-visual experience that is "at your fingertips": the current haptic feedback ecosystem, the relevant international standards, and the latest developments.

First, a brief self-introduction. I have worked in audio and video for more than 20 years. Last year I joined AAC Technologies and entered a new field: haptic feedback. (If you are interested, you can click the link to read more about my career.)


This picture captures what I am sharing today: audio-visual plus touch creates a chemical reaction that delivers a more immersive user experience. The mobile phones we hold in our hands, the controllers we use with games and VR headsets, and the haptic gloves and vests I have seen from some overseas manufacturers can all produce tactile feedback. In short, audio-visual plus touch yields a more immersive experience; to pursue true realism in the metaverse, audio-visual alone is not enough.


Today's content is divided into six parts: the physiological basis of touch, the current state of the haptic ecosystem, how to design good haptics, platform support for haptic application development, international standards, and finally RichTap's overall solution.

-01-

Physiological basis of touch


As this cross-section of the brain shows, the brain has four lobes (occipital, parietal, temporal and frontal); the occipital lobe handles vision, the parietal lobe touch, and the temporal lobe hearing. The parietal lobe is specialized in processing touch. Judging by cortical area and neuron firings per second, the share of processing devoted to touch is very high, second only to vision. Research has also found that touch has the fastest response, about five times faster than vision. As an innate human ability to perceive and experience the world, touch deserves more attention. In the metaverse, things often feel illusory precisely because we cannot touch them. If the metaverse is to develop further, the way the senses are reproduced there, touch among them, needs far more attention.

-02-

The development status of tactile ecology


Next, let's look at the current state of the haptic ecosystem. Taking mobile phones as an example, haptics on phones is relatively mature and has entered a 3.0 era. In the first stage, phone vibration only signaled incoming calls and text messages, and was produced by a rotor (eccentric rotating mass) motor. Most people's understanding is still stuck at this stage: they think vibration just means the phone buzzing, unaware of the variety of applications vibration can serve.

In the second stage, linear motors generate the vibration, and the vibration hardware keeps improving. System applications such as input methods, UI controls and ringtones all use vibration, and the haptic ecosystem begins to take shape.

The ecosystem has now entered the third stage, and more and more applications are using the phone's vibration capabilities. Many phones already combine a high-end X-axis linear motor, a high-voltage driver chip and the RichTap overall haptic solution. Games, social media, audio and video applications, interactive advertising: ever more applications use the phone's native vibration capabilities to express various effects. With the software and hardware infrastructure constantly improving, the haptic ecosystem is about to get a real opportunity to flourish.


This diagram shows the evolution of haptic hardware. The first stage is the rotor motor, in which an unbalanced mass rotates around an axis. The vibration it produces is best described by an onomatopoeia: "buzzing". It responds slowly and the vibration feels harsh. The second stage is the Z-axis linear motor, whose travel is limited to the thickness of the phone, within which it oscillates back and forth; that limited space also limits its expressiveness. The most high-end is the X-axis linear motor that RichTap promotes, which oscillates linearly in the plane of the phone screen to generate vibration. This kind of motor starts and stops quickly and produces a strong vibration; its bandwidth is relatively wide, roughly 50 Hz to 500 Hz, so it can express rich vibrations, and it consumes little power. The upper right corner shows a cross-section of the motor, and the lower right corner is a comparison chart: a motor tuned with RichTap's overall haptic solution has a wider frequency range and larger vibration amplitude, can express more vibration effects, and can simulate the feel of many real-world scenes.


More than 60% of new phones each year now ship with X-axis linear motors and high-quality haptic solutions, including domestic brands such as Huawei, Xiaomi, OPPO, vivo and Lenovo.

In the content ecosystem shown on the right, you can see dynamic videos from bilibili, haptic highlights for iQiyi TV series, and rhythm effects for variety shows. Douyin once launched a haptic advertisement; QQ Music and NetEase Cloud Music have features such as "Rhythm Lab" and "Hi-Dynamic Mode" that vibrate in sync with the music; the phone vibrates when WeChat's "bomb" emoticon is sent; input methods add vibration to typing to better simulate the feel of a physical keyboard. The bottom category is games, which have the greatest demand for vibration because in-game scenes place the highest requirements on the experience, including Honor of Kings, Peace Elite, Xiaoxiaole and Ace Racer.


Here are two typical applications of the haptic ecosystem. In "China Rap Peak Showdown 2022", which RichTap produced with iQiyi last year, users can feel vibration effects while watching the singers perform on stage. With iQiyi's video vibration function turned on, the phone vibrates in sync with the song, giving users a stronger sense of presence. Beyond variety shows there are TV series: the recent hit "Hurry Up" used heartbeat effects to heighten the tension of the plot in episodes 9, 11, 12 and 26, and earlier hits such as "Genius Basics" and "Heart Residence" also gained iQiyi's vibration function. iQiyi currently relies mainly on heartbeat effects, but many other scenes in film and television, such as gunfire and explosions, could use vibration as well.

Another typical application is games. "Peace Elite" has produced more than 200 vibration effects, such as footsteps, gunshots, breaking glass and characters being hit, covering the entire game. A settings menu in "Peace Elite" lets players choose between high-quality and basic vibration, and through compatibility strategies each phone delivers the best vibration effect its motor hardware can support.
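A compatibility strategy of this kind can be sketched as follows; the tiers, field names and fallback rules below are hypothetical, not the actual logic used by "Peace Elite".

```python
# Sketch of a capability-fallback strategy (hypothetical tiers and fields;
# the real strategy used by "Peace Elite" is not public).

def select_effect(designed_effect: dict, device_tier: str) -> dict:
    """Pick the richest representation the device's motor can render."""
    if device_tier == "hd":           # X-axis linear motor + haptic chip
        return designed_effect        # full intensity/frequency detail
    if device_tier == "basic":        # Z-axis linear motor
        # Drop frequency detail, keep intensity and timing.
        return {"intensity": designed_effect["intensity"],
                "duration_ms": designed_effect["duration_ms"]}
    # Rotor motor: reduce to a plain on/off buzz of the same length.
    return {"duration_ms": designed_effect["duration_ms"]}

gunshot = {"intensity": 0.9, "frequency_hz": 170, "duration_ms": 40}
print(select_effect(gunshot, "rotor"))  # {'duration_ms': 40}
```

The same designed effect degrades gracefully: rich motors get the full envelope, while a rotor motor still buzzes for the right length of time.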

-03-

Design and expression of touch


Let's take a closer look at the basic principles of haptic design. If you wanted to create such a vibration yourself, how would you go about it? From a design perspective there are three layers, the lowest being the basic attributes: the three parameters of a vibration are intensity, frequency and duration.

Built on these three basic attributes, the second layer consists of attributes humans can perceive, such as the hardness, roughness, speed, texture, elasticity and rhythm of the touched object.

The top layer is applicability. For example, different materials feel different: stepping on grass and on cement produce different vibrations. Like audio and video, vibration can also express a character's mood intuitively: happiness, unhappiness, even surprise. Vibration can likewise express force feedback, the resistance you meet when pressing hard. Today's motor control is powerful enough to express effects for a great many scenarios.
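The three basic attributes can be modeled as a small data type, from which second-layer attributes like rhythm emerge by sequencing events; the value ranges below are illustrative, not a RichTap format.

```python
from dataclasses import dataclass

# Minimal model of the three basic attributes of one vibration event.
@dataclass
class HapticEvent:
    intensity: float      # 0.0-1.0, perceived strength (illustrative range)
    frequency_hz: float   # e.g. 50-500 Hz for an X-axis linear motor
    duration_ms: int      # how long the event lasts

    def __post_init__(self):
        if not 0.0 <= self.intensity <= 1.0:
            raise ValueError("intensity must be within 0..1")

# Rhythm, a second-layer perceptual attribute, emerges from sequencing
# basic events: e.g. a strong beat followed by a weaker one.
heartbeat = [HapticEvent(0.8, 150, 60), HapticEvent(0.4, 120, 40)]
total = sum(e.duration_ms for e in heartbeat)
print(total)  # 100
```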


These are the six expressive dimensions of tactile design. Each dimension can be designed and expressed to varying degrees. The overall effect can be very rich.


How is haptics actually implemented? The haptic ecosystem has two pain points. The first is the design and implementation of effects, which depends on both hardware and software. On the hardware side, RichTap has cooperated extensively with phone manufacturers: 60% of new phones carry high-end X-axis linear motors, and the system layer integrates algorithms to control them. On the software side, design tools are lacking; for example, I have seen the vibration effects of an Xbox controller written directly in code. RichTap offers a tool for designing vibration effects visually.

The second is application integration: once designed, vibration effects must be implemented on each platform, and every platform supports them differently, while haptic standards lag behind. International standards are being drafted but are not yet mature, and platforms such as Apple, Android, Meta and Sony each support vibration differently.

RichTap has corresponding solutions for these two pain points.

-04-

Platform support for haptic application development


Take a look at how each platform implements vibration:

Apple's haptics are the best in the industry; its software and hardware are benchmarks, and it provides a wealth of control interfaces. The earliest is haptic feedback in UIKit, i.e. feedback on UI interactions, with several predefined styles such as light, medium, heavy, soft and rigid. This UI feedback is very convenient and shows up throughout Apple's own applications.

Later, in iOS 13, Apple introduced a core technology: Core Haptics. If you have ever seen an iPhone teardown, you will have noticed a component called the Taptic Engine, which is Apple's vibration motor.

At the software level, Apple abstracts two concepts, intensity and sharpness (roughly corresponding to frequency), to express vibration effects and modes, and Core Haptics then renders them fully on the phone. Effects can also be saved in a dedicated file format, the .ahap file. Some companies have since created their own formats, essentially modeled on Apple's.
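An .ahap file is JSON; a minimal transient "tap" might look like the sketch below. The structure follows Apple's published Core Haptics pattern format, but treat the details as illustrative and consult Apple's documentation for the full schema.

```python
import json

# Minimal sketch of an .ahap-style pattern: one transient tap described by
# intensity and sharpness. Structure based on Apple's published JSON format
# for Core Haptics patterns; details here are illustrative.
ahap = {
    "Version": 1.0,
    "Pattern": [
        {"Event": {
            "Time": 0.0,
            "EventType": "HapticTransient",
            "EventParameters": [
                {"ParameterID": "HapticIntensity", "ParameterValue": 0.8},
                {"ParameterID": "HapticSharpness", "ParameterValue": 0.5},
            ],
        }}
    ],
}
serialized = json.dumps(ahap)  # what would be written to a .ahap file
```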


Android's earliest vibration interface was Vibrator, which could only "vibrate for N milliseconds"; at the time a phone only needed reminders, which a few hundred milliseconds of motor buzz could provide. That interface was deprecated in Android 8.0, along with its sibling that took an array of on/off durations to vibrate in a pattern. The recommended replacement offers stronger orchestration: it can repeat vibrations at intervals and adjust the amplitude of each segment. Android 10 added predefined effects such as tick, click, heavy click and double click. If you just want UI feedback in an Android application, you can call the system interface with a predefined effect ID; but for richer haptic effects, the presets are not enough.
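The data consumed by the orchestration interface (in Java/Kotlin, VibrationEffect.createWaveform(timings, amplitudes, repeat)) can be modeled and validated as a sketch; the helper below is illustrative, not part of any SDK.

```python
# Model of the data Android's VibrationEffect.createWaveform() consumes:
# per-segment durations in ms paired with per-segment amplitudes (0-255,
# 0 meaning "off"); repeat=-1 plays the pattern once.

def waveform(timings_ms, amplitudes, repeat=-1):
    if len(timings_ms) != len(amplitudes):
        raise ValueError("timings and amplitudes must pair up")
    if any(not 0 <= a <= 255 for a in amplitudes):
        raise ValueError("amplitude out of range")
    return {"timings": timings_ms, "amplitudes": amplitudes, "repeat": repeat}

# "vibrate 100 ms at full strength, pause 50 ms, vibrate 100 ms at half"
w = waveform([100, 50, 100], [255, 0, 128])
print(sum(w["timings"]))  # 250
```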


Now look at Windows PCs. A PC has no vibration motor, so how do you vibrate on a PC? By connecting an Xbox controller. (The trackpads of some Windows laptops contain a motor, but applications rarely use it.) The typical scenario is plugging a controller into the computer to play PC games. The earliest high-performance multimedia on PCs used DirectX, whose DirectInput component handled peripheral input, but DirectInput does not support the Xbox controller's vibration. Microsoft later introduced XInput, which supports up to four connected controllers and can drive their left and right rumble motors; however, the only adjustable parameter is motor speed, which can only express fairly coarse vibration. What Microsoft recommends now is GameInput, a set of interfaces in the Xbox GDK with more complete functionality: the two triggers on the front of the Xbox controller can only be vibrated through GameInput, and it also supports racing-wheel-style devices, which have vibration feedback of their own.


How can a web page vibrate? The W3C defines a vibration interface much like Android's: simply set how long to vibrate, or, slightly more elaborately, program a pattern of vibrating for a while and pausing for a while. However, because some pages abused this interface early on, Apple removed it entirely, so web pages cannot vibrate in Apple's browser. A web page can also drive a gamepad's vibration, but that part of the Gamepad API is still a draft and many browsers do not support it.

WeChat mini programs can vibrate too. WeChat provides two interfaces, for a short vibration (15 ms) and a long one (capped at 400 ms). The control is very simple; RichTap has used these two interfaces to build demo scenes showing the experience boost vibration brings, but overall the control features are very limited.


Now look at game engines, taking Unity as an example. Unity's classic Input Manager does not support haptic feedback. There is also the XR plugin, since Unity is used for VR games: a VR controller carries two motors, and vibration can be triggered through InputDevice, with adjustable intensity and duration but not frequency. Unity's newer Input System exposes a Gamepad interface that can drive controller vibration, but the only adjustable parameter is motor speed, so one can guess it simply wraps Microsoft's XInput. It offers simple controls such as setting motor speeds, pausing haptics, resuming playback and resetting.


iQiyi's VR headsets provide Unity XR and Unreal SDKs, through which vibration can be controlled via the game engine plugins. They also provide a Native SDK with a fairly simple interface: start and stop vibration, choose which of the two controllers vibrates, and set intensity and duration. PICO provides Unity, Unreal and Native SDKs, but its Native SDK was retired in January 2023 in favor of an OpenXR Mobile SDK. OpenXR, shown in the figure, is an international standard that defines control interfaces for XR peripherals, including vibration. PICO's vibration control is slightly stronger in that it defines its own vibration format. Meta's Quest and Sony's PlayStation VR2 likewise provide control interfaces for Unity, Unreal, Native and WebXR.

-05-

International standards & latest developments


This is the international-standard support for vibration. The OpenXR 1.0 specification has two haptic interfaces, one to start vibration and one to stop it, with adjustable amplitude, frequency and duration. Vibration controlled this way can only be a simple effect hard-coded in the application, which is not very flexible. Meta has extended OpenXR 1.0: its extension can describe relatively complex effects through an envelope formed by a set of amplitude values, or express them through audio PCM data, so that long, complex effects can be delivered in buffers. Today's OpenXR still supports only fairly simple vibration and fairly simple peripherals; we are working with a haptics industry association to push OpenXR's evolution toward supporting more complex peripheral devices.
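The PCM-to-envelope idea behind such extensions can be sketched as a windowed RMS reduction; the window size and normalization below are illustrative choices, not the extension's actual algorithm.

```python
import math

# Sketch: reduce PCM audio samples (floats in -1..1) to a coarse amplitude
# envelope of the kind an envelope-based haptic interface could consume.
# Windowed RMS, normalized to 0-1; parameters are illustrative.

def pcm_to_envelope(samples, window=4):
    env = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        env.append(rms)
    peak = max(env) or 1.0   # avoid dividing by zero on silence
    return [v / peak for v in env]

env = pcm_to_envelope([0.0, 0.0, 0.0, 0.0, 0.5, -0.5, 0.5, -0.5])
print(env)  # [0.0, 1.0]
```

Each envelope value would then drive motor amplitude for one window's worth of time.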


MPEG also supports haptics. In 2020, haptics was submitted to the MPEG File Format working group as a first-level media type; "first level" means it sits at the same level as audio and video. In January 2022 the proposal was incorporated into the 7th edition of the standard, so haptics is already part of the MPEG standard.

Another company worth mentioning is Immersion, which is also fighting for haptics. Many MIME types travel across the Internet today: application, text, image, audio. In the future, touch will be as important as audio and video, becoming a major media and file type. An MP4 file may then carry a haptic data stream alongside its audio and video streams, or even contain nothing but vibration data, so decoders will need to consider broader compatibility. Apple's .ahap format may also become part of the standard, distributed over the Internet and played back directly on terminals.


Another standard is IEEE P2861.3, developed by Tencent; it is currently in the final approval stage and the text itself is not public. Its basic concept can be captured in one diagram: effects are abstracted into different combinations of long signals and short signals. A short signal has adjustable intensity and frequency and is characterized by being very brief, like a rapid impulse. A long signal has adjustable intensity, frequency, duration and anchor points; there can be multiple anchor points along the signal, each with its own intensity and frequency, making the long signal dynamic, variable in speed, and sustained. A single long signal can take different shapes: steady vibration with no change, like a rectangle; fade in and fade out, like a trapezoid; or sharper profiles that rise and fall quickly at will, like mountains or slopes. All of these shapes are designed by placing anchor points, and by combining and layering signals, the rich vibration effects of real scenes can be expressed.
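Since the standard text is not public, the anchor-point idea can only be illustrated. A minimal sketch: the intensity of a long signal at any moment is linearly interpolated between the surrounding anchors (the anchor values here are made up).

```python
# Sketch of a "long signal" shaped by anchor points: intensity at time t
# is linearly interpolated between the two surrounding anchors.
# Anchor placement and values are illustrative, not from IEEE P2861.3.

def intensity_at(anchors, t_ms):
    """anchors: list of (time_ms, intensity) pairs, sorted by time."""
    if t_ms <= anchors[0][0]:
        return anchors[0][1]
    for (t0, v0), (t1, v1) in zip(anchors, anchors[1:]):
        if t0 <= t_ms <= t1:
            return v0 + (v1 - v0) * (t_ms - t0) / (t1 - t0)
    return anchors[-1][1]

# Trapezoid shape: fade in over 100 ms, hold, fade out over 100 ms.
trapezoid = [(0, 0.0), (100, 1.0), (300, 1.0), (400, 0.0)]
print(intensity_at(trapezoid, 50))   # 0.5
print(intensity_at(trapezoid, 200))  # 1.0
```

Swapping the anchor list gives the rectangle, mountain or slope shapes described above without changing the evaluation code.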

-06-

RichTap overall solution


RichTap is AAC Technologies' haptic solution sub-brand. For content providers, RichTap first offers a PC haptic design tool, in which long and short signals can be combined into vibration effects. Because a PC has no vibration motor, the tool can connect to a phone or controller so you can feel the effect instantly. Once the design is done, it exports a cross-platform haptic description file (a .he file) that can be integrated into applications: whether audio-video software or a game, integrating RichTap's SDK and adding the description file reproduces the vibration in the application.

On the device, the RichTap SDK calls the Core Engine's underlying control algorithms, which convert the control signal via the driver chip into an analog signal sent to the actuator, i.e. the vibration motor, so that the device in our hands ultimately vibrates. The left side of the picture is RichTap's software solution; the right side is the part done in cooperation with hardware manufacturers. As time goes by, the share of hardware that supports this solution and carries RichTap motors will keep growing.

The motor hardware is by now quite mature and capable, and integrating the software solution lets that capability shine, making applications more immersive and realistic. RichTap has built two demo apps (scan the QR code above to install). One, RichTap Creator, contains many vibration effects applicable to various scenarios. The other demonstrates the Muse algorithm: besides hand-designing signals, the design tool can import a video or audio file, automatically add vibration signals to it, and export the vibration files.

Two links are provided here: one is GitHub, where RichTap offers sample code along with a free SDK that bundles about 50 vibration effects; for a simple application, integrating the free SDK is enough. For more information, visit the official website.


This is a more comprehensive summary of the RichTap solution.

First, unified interfaces and independent debugging. From an integration perspective, both Android and iOS devices are supported, and the interfaces are very similar, making integration easy.

Second, after a design is finished you can connect a phone for real-time debugging. Many parameters are controllable, including static frequency, intensity and duration, all expressible in the .he file; during playback, intensity and frequency can also be adjusted in real time, giving very fine control.
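Real-time adjustment can be pictured as applying a gain to the designed parameters and clamping them to the motor's valid ranges. This is a conceptual sketch, not RichTap's actual SDK API; the ranges (0-1 intensity, 50-500 Hz) are illustrative.

```python
# Conceptual sketch of runtime intensity/frequency adjustment: scale the
# designed event, then clamp to illustrative motor limits.

def apply_runtime_gain(event, intensity_gain=1.0, freq_shift_hz=0.0):
    return {
        "intensity": min(1.0, max(0.0, event["intensity"] * intensity_gain)),
        "frequency_hz": min(500.0, max(50.0, event["frequency_hz"] + freq_shift_hz)),
        "duration_ms": event["duration_ms"],
    }

e = {"intensity": 0.6, "frequency_hz": 480.0, "duration_ms": 120}
out = apply_runtime_gain(e, intensity_gain=2.0, freq_shift_hz=100.0)
print(out)  # {'intensity': 1.0, 'frequency_hz': 500.0, 'duration_ms': 120}
```

Clamping matters: a global "stronger haptics" slider must not push the motor past what it can physically render.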

Third, the RichTap solution decouples effect design from programming. With the visual tool suite, designers can work independently and focus on tuning vibration effects. Cross-device consistency matters greatly from a creative perspective: when designers create a vibration effect, they want it to feel the same on every device, and with RichTap's SDK we strive to keep the rendered effect consistent and faithful to the design.

Fourth, this solution provides design integration tools and game engine plug-ins to facilitate integration.

Fifth, it provides a powerful template library. Having done many game and application projects, RichTap has accumulated substantial haptic design experience; you can modify effects from the library to speed up your own designs.

Sixth, RichTap has a sound-to-vibration algorithm that makes haptic design faster and more convenient. In an online music variety show, for example, it is impractical to hand-author vibration for every song; instead the effects are pre-produced on the server with the algorithm tools, and the vibration files are delivered to the client on request, much like subtitles: the client simply plays the vibration effects against the timeline.
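Serving haptics "like subtitles" can be sketched as timestamped events queried against the playback position; the track layout and function below are hypothetical, not RichTap's file format.

```python
# Sketch of subtitle-style haptic delivery: a pre-produced track is a list
# of timestamped effects, and playback fetches the events that fall inside
# the current timeline window. Layout is hypothetical.

def events_in_window(events, pos_ms, window_ms=50):
    """events: list of (time_ms, effect_id) pairs, sorted by time."""
    return [eid for t, eid in events
            if pos_ms <= t < pos_ms + window_ms]

track = [(0, "kick"), (480, "snare"), (960, "kick")]
print(events_in_window(track, 950))  # ['kick']
```

The same query works at double speed: the player just advances pos_ms twice as fast, keeping vibration in sync with audio and video.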

Seventh, for audio and video playback scenarios, RichTap provides synchronized playback of haptics with audio and video, including variable-speed playback.

Eighth, RichTap addresses Android compatibility. Android is fragmented: different devices carry different motors, and some do not use RichTap's control scheme. RichTap's SDK handles this well: on devices with RichTap motors and control schemes, the vibration effect is expressed to the fullest, while on lower-end models the effect is approximated as well as possible through Android's native interfaces.

-07-

Summary


Finally, to summarize: the haptic ecosystem is genuinely new, and everyone in the audio and video industry has a stake in it. Adding touch to audio-visual makes the experience more immersive and opens a new dimension. Haptics is increasingly mature on the hardware side, software applications are just taking off, and international standards are in the works, though still lagging slightly; being standards, they deserve everyone's attention.

AAC supports the haptic ecosystem with integrated software and hardware solutions. We hope you will use the tool suite AAC provides to design rich haptic effects, and through AAC's SDK or game plugins quickly integrate vibration into your applications or game projects, making them more vivid, real and immersive.

The above is what I want to share, thank you.




