Audio and Video Development: Live Streaming Protocols and Push Streaming | RTMP / RTSP

In recent years, live streaming has become one of the hottest topics in the Internet industry. Live quiz shows, game streaming, sports broadcasts, Douyin live rooms, and live online education have emerged one after another, and live streaming has long since become a familiar technology. Its rise is driven not only by people's desire to express themselves in the new era, but also by faster bandwidth and the development of CDN technology. As CDNs mature, it becomes easier and easier for enterprises to deploy cloud servers for live streaming.

This article is part of an introductory series on live streaming; it mainly covers live streaming protocols, push streaming, and related technical topics.

1. Live streaming protocols

Streaming media falls into two categories: live and on-demand. Generally speaking, on-demand uses the HTTP protocol, while live streaming mainly uses RTMP, HLS, HTTP-FLV, and so on. In recent years new live protocols such as DASH have also appeared, but they are still in their infancy. The difference between live and on-demand protocols is rooted in the difference between the two businesses.

On-demand is commonly used for playing recorded media such as TV series and movies on video sites like Youku and iQiyi. An on-demand video is pre-recorded: a thousand people watch the same video, and no matter when a viewer clicks in, the media data they receive is identical. Live streaming is not like that: what you see when you click in differs depending on when you join.
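The live/on-demand difference shows up directly in the protocols. In HLS, for example, a live playlist is an ever-sliding window of short segments (the media sequence number keeps advancing as old segments drop off), while an on-demand playlist is complete from the start and ends with an `#EXT-X-ENDLIST` tag. A sketch of a live playlist, with placeholder segment names:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:1347
#EXTINF:9.8,
segment1347.ts
#EXTINF:9.9,
segment1348.ts
```

The player re-fetches this playlist every few seconds to pick up new segments; if it ever sees `#EXT-X-ENDLIST`, it knows the stream has ended and treats the rest as on-demand content.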

Generally speaking, live and on-demand do not mix, but in recent years some platforms have innovated with a "live time-shift" mode that combines the two. The live stream is recorded into small on-demand segments, so users can access any content at any position, from any device. For example, while watching a live football match, if a great play happens and you want to see it again immediately, you can drag the progress bar back and replay it, then return to the live stream with one click.

Current live stream distribution mainly has the following characteristics:

1. FLV dominates; TS is used much less. The main reason is that the TS standard is too complex: the open FLV specification is 11 pages, while the TS specification runs to 174 pages. For ordinary live streaming, FLV basically meets the demand, so TS sees fewer applications. Of course, FFmpeg can mux almost any container you can think of (and some you can't), but for a focused live pipeline that generality is more than is needed.

2. RTMP and HLS coexist. Generally speaking, RTMP is used on the PC side and played back with Flash, while HLS serves phones and tablets.

3. Real-time push streaming generally uses RTMP. RTMP can achieve a latency of 1 to 3 seconds, the lowest among common live protocols apart from RTSP. It can be played directly on the PC, and the mobile side can decode and play it with FFmpeg. Besides RTMP, are there other protocols suitable for real-time streaming playback?

In fact, HTTP-FLV is even better suited to real-time streaming than RTMP. The two have comparable latency and both can be played directly on the PC side (mobile still needs FFmpeg), but HTTP-FLV has the extra advantage of penetrating firewalls, since it rides on plain HTTP. However, most CDNs do not support HTTP-FLV live streaming, because ordinary web servers cannot serve it; it requires a streaming-capable server.


2. Live streaming servers

The transmission of streaming media data in a live system depends mainly on the server. Current open-source streaming servers include RED5, CRTMPD, NGINX-RTMP, and SRS.

RED5: the oldest open-source streaming server, built around Flash streaming services. It is written in Java, uses RTMP as its transport protocol, and is fully compatible with FMS. It can stream FLV and MP3 files, record client streams to FLV in real time, and supports shared objects, live video playback, and remoting. However, its technology is relatively dated, and new live platforms have abandoned it.

CRTMPD: written in C++, it is a streaming server that supports multiple RTMP protocol variants, IPTV-related network protocols, and mobile devices. Its single-threaded asynchronous socket design was leading-edge at the time, but once NGINX appeared it gradually faded from public view.

NGINX-RTMP: a streaming server written in C as an NGINX module, and the most widely deployed streaming server on the market. With the expansion of the CDN business in 2012, demand for live services skyrocketed. Because NGINX-RTMP lets live and on-demand share a single set of servers, and because users were already familiar with and trusted NGINX, NGINX-RTMP gradually came to dominate the industry.
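A minimal nginx-rtmp configuration sketch; the application name and HLS path are placeholders, and the `rtmp` block lives at the top level of nginx.conf alongside the usual `http` block:

```
rtmp {
    server {
        listen 1935;              # standard RTMP port
        chunk_size 4096;

        application live {
            live on;              # accept streams pushed to rtmp://host/live/<key>
            hls on;               # also repackage to HLS for mobile players
            hls_path /tmp/hls;
            hls_fragment 3s;
        }
    }
}
```

This is exactly the dual-protocol setup described in section 1: the same pushed stream is relayed over RTMP for low-latency PC playback and sliced into HLS segments for phones and tablets.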

SRS (Simple RTMP Server) is a streaming server developed in China. It is positioned as a carrier-grade Internet live server cluster, pursuing conceptual integrity and the simplest possible code. According to its official site, its efficiency is very high, reportedly up to 3 times that of NGINX-RTMP, and it ships with both Chinese and English documentation, which suits the development environment of Chinese programmers well.

3. Live push streaming

The overall flow of live push streaming is as follows:

As the figure above shows, after the raw data is captured from the camera and microphone, some preprocessing is usually applied first, such as denoising, beauty filters, and voice changing. The audio and video are then encoded, encapsulated with an appropriate streaming protocol, and finally the bitrate is adapted before the stream is delivered to the target site for display.

However, the way you push a stream differs by language and platform:

If you are an iOS or Android developer, RTMP push streaming is relatively easy: find a push-streaming library, hand it the video parameters and the destination RTMP address, and it will emit a standard RTMP stream.

If you are a C++ programmer, it is considerably more work. You must master at least three steps: capture, encoding, and stream writing. Each step has libraries you can call, but even so, assuming you use the FFmpeg libraries, completing the above takes roughly 100 lines of code, because the main flow must include opening the audio and video devices, creating the codecs, setting the encoding parameters, initializing the network stream handle, writing the protocol header, and then looping to capture raw data, decode it, re-encode it, mux it into the container format, and write it to the network stream.

Of course, you can also complete the push with a single FFmpeg command line, but that is limited to testing or simple demos; it is not applicable in a real engineering environment, because many required features simply cannot be expressed through such a simple command.

4. Summary

In short, the difficulty of building live streaming depends mainly on the features you want. If you just plan to test things yourself, download an open-source server, compile and run it, push a stream with a one-line FFmpeg command, then play it back with a player, and you are done. But if you want to commercialize and meet users' varied needs, such as echo cancellation, co-streaming with guests, and beauty filters, the complexity of the problem grows exponentially.


Origin: blog.csdn.net/Linuxhus/article/details/115176762