Develop and build a live broadcast platform from scratch: tutorial overview

Before you start

I have previously shared many scattered technical articles covering audio and video, image processing, and streaming media services. However, each of them was written to implement a specific feature rather than a complete system, which has left many readers confused: you needed some prior understanding of streaming media architecture just to get started.

For developers this is manageable, but for non-technical readers it can be outright discouraging. This series will therefore be split into two tracks: a "non-technical" track, which contains no code at all and strives to be as easy to understand as possible, and a "technical" track for developers, which will include plenty of code, development walkthroughs, and optimization tutorials.

This series uses concrete application scenarios as hands-on tutorials, systematically explaining how the architecture of an entire live broadcast platform is designed and how to develop each subsystem and module to build a complete platform.

Note: this tutorial also applies to scenarios such as video conferencing and video surveillance. Co-streaming ("lianmai") in video conferencing works on the same principle as in live broadcasting. Video surveillance differs from a live broadcast platform in that there is no active push (publishing) end; instead there are more pull or relay ends.

Live broadcast platform structure

To avoid the problem of the blind men and the elephant, we will first look at the structure of the entire live broadcast platform as a whole, and then decompose it into individual subsystems and specific modules, so that we clearly understand what each module under the platform does and how it works.

The simplest live broadcast platform structure:

[Diagram: the simplest live broadcast platform architecture]

In fact, once we have set up a streaming media service, we have already completed the simplest possible live broadcast platform. All that remains is for a streamer (UP host) to push a stream, and viewers can watch it. Simple, isn't it?

Of course, this is not enough. Viewers have no way to discover and play your live stream address, and streamers do not know where to push their streams. The live broadcast platform's job is to display streamers' rooms so that viewers can browse, filter, and watch them, and to let viewers interact with the streamer who is live. As for publishing, the streamer still needs to know the push address and use a tool such as OBS to push the stream before going live.

CDN distribution:
Why use CDN distribution? Because as more users watch streams on the platform, a major problem arises: insufficient bandwidth.

For example: suppose you have just one streamer pushing a 1080p HD stream at a bitrate of 4 Mbps (512 KB per second), and 1,000 users are watching at the same time, each pulling that same 4 Mbps. Those thousand viewers together require 4,000 Mbps of bandwidth (500 MB per second of real-time traffic), which means fewer than 3,000 concurrent viewers (2,500, to be exact) will saturate a 10-gigabit link.
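The arithmetic above can be sketched as a quick back-of-the-envelope calculation (the function name is illustrative, not from any library):

```python
def uplink_bandwidth_mbps(bitrate_mbps: float, viewers: int) -> float:
    """Total outbound bandwidth needed when every viewer pulls the full bitrate."""
    return bitrate_mbps * viewers

# One streamer pushing 1080p at 4 Mbps, 1,000 concurrent viewers:
total = uplink_bandwidth_mbps(4, 1000)
print(total)        # -> 4000 (Mbps of outbound bandwidth)
print(total / 8)    # -> 500.0 (MB per second of real-time traffic)
print(10_000 / 4)   # -> 2500.0 (viewers that saturate a 10 Gbps link)
```

This is exactly why origin bandwidth alone does not scale, and why the next section offloads viewer traffic to a CDN.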

What to do? Clearly it is unrealistic to rely on raw bandwidth alone, and domestic bandwidth is not cheap. This is where a CDN comes in, to relieve the bandwidth pressure on the central streaming media server. CDN distribution is generally billed by traffic; there is no free lunch.
At this point, the structure of the live broadcast platform becomes:

[Diagram: live broadcast platform architecture with CDN distribution]

Push end

The software most commonly used on the push end is OBS. It is widely used, so we will not go into detail here.
For developers, FFmpeg and OpenCV are definitely needed: FFmpeg handles audio/video encoding and push streaming, while OpenCV is typically used to implement beauty filters.
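As a sketch of what push streaming with FFmpeg looks like, the helper below assembles a typical ffmpeg RTMP push command; the helper name and the RTMP URL are illustrative, but the flags are standard ffmpeg CLI options:

```python
import subprocess

def build_push_command(input_file: str, rtmp_url: str) -> list[str]:
    """Assemble an ffmpeg command that encodes a source to H.264/AAC
    and pushes it to an RTMP ingest URL as an FLV-muxed stream."""
    return [
        "ffmpeg",
        "-re",                  # read input at its native frame rate (live pacing)
        "-i", input_file,
        "-c:v", "libx264",      # H.264 video, widely supported by RTMP servers
        "-preset", "veryfast",
        "-c:a", "aac",          # AAC audio
        "-f", "flv",            # RTMP carries FLV-muxed streams
        rtmp_url,
    ]

cmd = build_push_command("demo.mp4", "rtmp://localhost/live/room1")
print(" ".join(cmd))
# subprocess.run(cmd)  # uncomment to actually push (requires ffmpeg installed)
```

OBS does essentially the same thing internally: encode with x264/AAC and publish the FLV-muxed result over RTMP to the ingest address.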

Streaming service

A streaming media service is a relay station for real-time video.
Commonly used streaming media servers include nginx (with the nginx-rtmp module), SRS, Red5, and others. The follow-up tutorials in this series will mainly cover nginx and SRS.
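To give a taste of what the follow-up nginx tutorial covers, a minimal RTMP ingest configuration looks roughly like this (a sketch, assuming nginx is built with the nginx-rtmp-module; the application name `live` is illustrative):

```nginx
rtmp {
    server {
        listen 1935;              # default RTMP port
        chunk_size 4096;

        application live {        # push to rtmp://host/live/<stream-key>
            live on;              # enable live streaming mode
            record off;           # do not write incoming streams to disk
        }
    }
}
```

With this in place, a streamer pushes to `rtmp://host/live/<stream-key>` and viewers pull the same URL with a player such as VLC.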

Pull stream player

PC: VLC
Web: video.js, flv.js, hls.js, ckplayer, etc.
Mobile: ijkplayer
WeChat Mini Program: it is best to use Tencent Cloud's player, otherwise compatibility issues arise; in my own testing, only Tencent's own web player is compatible with QQ, WeChat Mini Programs, and Official Accounts at the same time.

The follow-up tutorials in this series will mainly use VLC to test stream playback; although VLC has high latency, its compatibility is the best. Web-side testing will mainly use video.js, flv.js, and the Tencent player.

CDN distribution

Many CDN providers support distribution of streaming media protocols such as RTMP, HTTP-FLV, and HLS. Since naming them would involve specific vendors and amount to advertising, they are not listed here.

Next chapter

Develop and build a live broadcast platform from scratch: streaming media service setup (1) - nginx streaming media service
Develop and build a live broadcast platform from scratch: streaming media service setup (2) - SRS streaming media service

JavaCV series

Building a streaming media service (written previously)

Thank you for supporting eguid's original articles. You are welcome to reprint them, but please credit the source; creating content is not easy. Thank you very much!


Origin www.cnblogs.com/eguid/p/12741787.html