H5 Live Streaming Starter (Theory)


Foreword

This write-up actually comes from a tech-sharing session I prepared about a week of my time last year, when the company was aggressively gearing up its live-broadcast service. Privately I thought the task might be entrusted to me, so I trawled a lot of forums and put together a simple share. In the end the live service went to another colleague, and I never got the chance to put it into practice — a pity. Anyway, enough digression; let's begin with the theory ~

Technical background

As you can see, live streaming has moved from the PC to mobile: more and more live-streaming apps are launching, and mobile live streaming has entered an unprecedented boom. Most mobile broadcasting is still implemented mainly in native clients, but the H5 side also plays an irreplaceable role: H5 pages spread fast and are easy to publish, and, most importantly, H5 can also play live video.

2016 is called the first year of live streaming: first, the major broadband providers, bowing to public opinion, raised speeds and lowered prices; second, large amounts of capital flowed into the live-streaming sector, driving rapid iteration of the technology. In the market, the most commonly used protocol is Apple's HLS (which H5 supports natively for playback); RTMP, HTTP-FLV, RTP and others are also in use.

Live video file formats and protocols

Video file formats

What we usually call video file formats are really container formats — the everyday "formats": flv, mp4, ogg, and so on. A container can be understood as a box that holds a particular bitstream in a certain order. So does it matter which kind of box we use to pack the video?

The answer is: no problem, as long as you know how to open the box and find the matching decoder. By that logic, mp4, ogv, webm and the rest all play fine, provided I have the corresponding decoders and a player. But if something goes wrong in some section while the bitstream is being packed into the box, the final file may be completely unusable, because the box itself is broken.

However, the above has a blind spot: I treated the video purely as a static stream. Imagine a video that must keep playing as it arrives — a live broadcast, for instance. Here, let's use TS/PS streams to explain.

  • PS (Program Stream): Static file stream

  • TS (Transport Stream): Dynamic file stream

Both container formats above hold a video bitstream; they just package it differently:

  • PS: the complete video bitstream goes into a single box, producing one fixed file.
  • TS: the incoming video is split across many boxes, so the final output consists of multiple boxes.

The consequence: if one or more boxes are damaged, a PS file becomes unwatchable, while TS merely skips some frames or shows a brief mosaic. The practical difference is this: when higher fault tolerance is required, choose TS; when fault tolerance matters less, PS is acceptable.

The HLS live protocol

HTTP Live Streaming (HLS for short) is an HTTP-based video streaming protocol proposed by Apple. Currently, iOS and higher versions of Android support HLS. So what does HLS consist of? Its two main parts are the .m3u8 index file and the .ts media files.

Since the HLS protocol is based on HTTP, a server providing HLS needs to do two things:

Encoding: encode the picture as H.264 and the sound as MP3 or HE-AAC, and finally pack them into an MPEG-2 TS (Transport Stream) container;

Segmentation: cut the encoded TS file into equal-length small files with the .ts suffix, and generate a plain-text index file, the .m3u8;

The browser consumes the m3u8 file. m3u8 is much like the m3u audio playlist format; you can simply think of an m3u8 as a playlist containing several ts files. The player plays them one by one in order; when they are all played, it requests the m3u8 again to obtain the playlist containing the newest ts files, and keeps playing, over and over. The whole live session therefore relies on one constantly updated m3u8 and a pile of small ts files; the m3u8 must be refreshed dynamically, while the ts files can be served from a CDN.
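To make that refresh loop concrete, here is a minimal sketch of the index-parsing step in plain JavaScript. It only pulls out the media sequence number and the segment list; every HLS tag other than `#EXT-X-MEDIA-SEQUENCE` and `#EXTINF` is deliberately ignored, so this is an illustration, not a real player.

```javascript
// Minimal m3u8 parser sketch: extract the media sequence number and the
// list of ts segments from a live playlist. Real players handle many
// more tags (target duration, discontinuities, encryption keys, ...).
function parseM3U8(text) {
  const lines = text.split('\n').map(l => l.trim()).filter(Boolean);
  const playlist = { mediaSequence: 0, segments: [] };
  let pendingDuration = null; // duration announced by the preceding #EXTINF
  for (const line of lines) {
    if (line.startsWith('#EXT-X-MEDIA-SEQUENCE:')) {
      playlist.mediaSequence = parseInt(line.split(':')[1], 10);
    } else if (line.startsWith('#EXTINF:')) {
      pendingDuration = parseFloat(line.split(':')[1]);
    } else if (!line.startsWith('#')) {
      // A non-tag line is a segment URI; pair it with its duration.
      playlist.segments.push({ uri: line, duration: pendingDuration });
      pendingDuration = null;
    }
  }
  return playlist;
}
```

The player would compare `mediaSequence` across refreshes to know which segments are new and only fetch those.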

Here, let's focus on the client-side process. First of all, since this is a live broadcast, the content is updated in real time. How does HLS accomplish that?

With HLS, we can embed a live stream directly with a video tag:

<video autoplay controls>
    <source src="xxx.m3u8" type="application/vnd.apple.mpegurl" />
    <p class="warning">Your browser doesn't support video</p>
</video>

As described above, the tag actually requests an index file, the .m3u8. That file contains descriptions of the related .ts files, for example:
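A minimal live index looks roughly like this (segment names and durations are illustrative). Note there is no `#EXT-X-ENDLIST` tag: the stream has no end yet, which is precisely what tells the player to keep refreshing the index.

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:176
#EXTINF:5.000,
live176.ts
#EXTINF:5.000,
live177.ts
#EXTINF:5.000,
live178.ts
```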

However, this is only the simplest case; it involves none of the live-specific features. In fact, the whole HLS architecture can be divided into the master playlist, the media playlists, and the ts segments.

What the master playlist mainly does is let the player decide, according to the user's current bandwidth, resolution, decoder capabilities and other conditions, which stream to use. In other words, the master playlist exists for the sake of a better user experience.

When the URL you fill in points at a master playlist, the user downloads the master playlist only once. The player then decides which media playlist (that is, which sub-m3u8 file) to use based on the current environment. If the user's playback conditions change during playback, the player switches to the corresponding media playlist.
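A master playlist might look roughly like this (the bandwidth figures and paths are illustrative); each `#EXT-X-STREAM-INF` entry points at one media playlist:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
live/360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
live/720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
live/1080p/index.m3u8
```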

Of course, HLS supports more than segmented playback (which is what suits live streaming); it also includes other useful features:

  • Encryption of ts files, delivered over HTTPS

  • Fast-forward / rewind

  • Ad insertion

  • Switching between videos of different resolutions

As you can see, the HLS protocol is essentially still ordinary HTTP request/response, so it adapts well and is not blocked by firewalls. But it has a fatal weakness: very noticeable latency. If ts segments are cut every 5 seconds and an m3u8 lists 6 ts entries, that alone introduces at least 30 seconds of delay. Shortening each ts and listing fewer entries in the m3u8 does reduce the latency, but it causes more frequent buffering and multiplies the request pressure on the server. You can only find a compromise that fits your actual situation.
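The arithmetic above is simply segment length times the number of entries in the index; a tiny helper makes the trade-off easy to tabulate:

```javascript
// Worst-case startup delay of an HLS live stream is roughly the
// segment duration multiplied by the number of segments the m3u8 lists.
function hlsMinDelaySeconds(segmentSeconds, segmentsPerPlaylist) {
  return segmentSeconds * segmentsPerPlaylist;
}
```

So 5-second segments with 6 entries give a 30-second floor, while 2-second segments with 3 entries bring it down to 6 seconds — at the cost of far more frequent index refreshes and segment requests.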

Note: on the PC side, only the Safari browser supports HLS; browsers like Chrome cannot play the m3u8 format through the HTML5 video tag. You can directly use one of the more mature solutions available online, such as sewise-player, MediaElement, videojs-contrib-hls, or jwplayer.

The RTMP live protocol

Real-Time Messaging Protocol (RTMP for short) is a video streaming protocol developed by Macromedia, now owned by Adobe. Like HLS, it can be used for live video; the difference is that RTMP playback relies on Flash and therefore cannot play in the iOS browser, but its real-time performance is better than HLS's. For that reason, this protocol is generally used for uploading the video stream, i.e. pushing the stream to the server.

The following is a rough comparison of HLS and RTMP:

  • Transport: HLS runs over plain HTTP short connections; RTMP keeps a long TCP connection.
  • Latency: HLS is typically tens of seconds; RTMP is on the order of a few seconds.
  • Playback support: HLS plays natively on iOS/Safari; RTMP playback requires Flash.
  • Distribution: HLS segments cache on ordinary CDNs like static files; RTMP needs streaming-capable nodes.

The HTTP-FLV live protocol

HTTP-FLV is similar to RTMP: both distribute the live stream in the FLV container format. But the two differ significantly. HTTP-FLV:

  • opens a single long HTTP connection and downloads a continuously growing FLV file
  • has simple header information

At the moment, HTTP-FLV is the more commonly used playback option on the market. However, mobile browsers do not support it, so HTTP-FLV is also a sore point for H5. Nowadays, though, flv.js lets higher-version browsers parse it through MediaSource. HTTP-FLV is also very simple to use; like HLS, you just add a link:

<object type="application/x-shockwave-flash" data="xxx.flv"></object>

The basic live architecture

Today's more mature live products are generally built from an H5 side, a server side, and native (Android, iOS) clients. The routine is basically: the recording end pushes a stream to the video server, and the server distributes it to the various playback ends.

A complete live pipeline can be divided into the following pieces:

  • Video recording end: generally the audio/video input devices — the camera and microphone on a computer or phone; at present, mobile-phone video dominates.

  • Video playback end: players on the computer, native players on the phone, plus the H5 video tag; on the phone, the native player is still the main one.

  • Video server: generally an nginx server; it accepts the video stream provided by the recording end and provides the streaming service to the playback end.

H5 live pages of this kind involve few technical difficulties. By implementation, they can be divided into: ① the full-screen live video in the background, played with the video tag; ② the follow and comment modules, which use WebSocket to send and receive new messages in real time and render them with DOM and CSS3; ③ likes, implemented with CSS3 animations.

The danmaku (the scrolling bullet comments) are slightly more complicated; you may need to pay attention to the following points:

  • Real-time delivery: use WebSocket to send and receive new danmaku in real time and render them.

  • For browsers that do not support WebSocket, degrade to long polling or timer-driven requests on the front end to fetch new danmaku.

  • When rendering danmaku, handle the animation, collision detection (i.e. danmaku must not overlap), and so on.
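The collision-detection point is usually handled by dividing the screen into horizontal lanes and only launching a danmaku into a lane that is currently free. A hedged sketch, with all names illustrative and times in seconds:

```javascript
// Assign an incoming danmaku to the first free lane so messages never
// overlap vertically. laneFreeAt[i] holds the time at which lane i
// becomes available again.
function assignLane(laneFreeAt, now, occupySeconds) {
  for (let i = 0; i < laneFreeAt.length; i++) {
    if (laneFreeAt[i] <= now) {
      laneFreeAt[i] = now + occupySeconds; // reserve the lane
      return i;
    }
  }
  return -1; // all lanes busy: drop or queue the message
}
```

A real implementation would derive `occupySeconds` from the message width and scroll speed, so a long message blocks its lane longer than a short one.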

H5 live playback solutions

Doing live playback with flv.js

  • Brief introduction

flv.js is an open-source project from Bilibili. It parses the FLV file or stream and feeds the audio/video data to a native HTML5 video tag for playback, making FLV playable in the browser without Flash.

  • Advantages

Because the browser's native video tag uses hardware acceleration, performance is good and high definition is supported. flv.js supports both recorded video and live streams, and it removes the dependence on Flash.

  • Browser requirements

flv.js depends on the following browser features (see its compatibility list):

1. HTML5 Video

2. Media Source Extensions

3. WebSocket

4. HTTP FLV: fetch or stream

  • Principle

flv.js does only one thing: after obtaining audio/video data in FLV format, it converts the FLV data with pure native JavaScript and feeds it to the native HTML5 video tag through the Media Source Extensions API. (Natively, HTML5 video only plays the mp4/webm formats and does not support FLV.)
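A minimal usage sketch built on the flv.js API (`flvjs.isSupported`, `createPlayer`, `attachMediaElement`); the script path and stream URL are placeholders:

```html
<video id="live-video" controls></video>
<script src="flv.min.js"></script>
<script>
  if (flvjs.isSupported()) {
    var player = flvjs.createPlayer({
      type: 'flv',
      isLive: true,
      url: 'https://example.com/live/stream.flv' // placeholder stream URL
    });
    player.attachMediaElement(document.getElementById('live-video'));
    player.load();
    player.play();
  }
</script>
```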

Why does flv.js take this roundabout path — fetching FLV from the server, converting it, and only then feeding it to the video tag? For the following reasons:

1. Compatibility with current live pipelines: most live audio/video services use the FLV container format to transfer audio and video data.

2. The FLV container format is simpler than MP4, so it is faster and more convenient to parse.

  • Compatibility solutions

PC side

1. Prefer HTTP-FLV: the latency is small and performance is good; 1080P plays very smoothly.

2. Where flv.js is not supported, use a Flash-based player for the RTMP live stream. Flash compatibility is very good, but its performance is poor and many browsers disable it by default.

3. If you would rather not use Flash, HLS is also an option, but on the PC side only Safari supports HLS.

Mobile side

1. Prefer HTTP-FLV: the latency is small, and any device able to run flv.js has enough performance for HTTP-FLV.

2. Where flv.js is not supported, use HLS, though HLS latency is very large.

3. If HLS is not supported either, live playback is simply not possible, because the mobile side has no Flash.
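The fallback order for the PC and mobile sides can be sketched as one plain function. The flags are assumptions the page must detect itself (for example, `flvjsSupported` would come from `flvjs.isSupported()`):

```javascript
// Sketch of the compatibility cascade described above.
function pickLiveTech({ isMobile, flvjsSupported, hlsSupported, flashEnabled }) {
  if (flvjsSupported) return 'http-flv';             // first choice: low latency
  if (!isMobile && flashEnabled) return 'rtmp-flash'; // PC only: Flash RTMP player
  if (hlsSupported) return 'hls';                     // works, but large delay
  return 'none'; // mobile without HLS: no Flash, so no live playback
}
```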

Well, this is after all an introductory theory article. If I get to put it into practice later, I will post a follow-up. Thanks for reading ~


Origin www.cnblogs.com/qianduanwriter/p/11789635.html