How Live Streaming Works

Whatever form it takes, a live stream ultimately comes down to the interaction between the broadcasting terminal and the viewing terminal, mediated by a server.

The basic workflow:

(1) the camera and microphone (or other audio/video input devices) on a computer or mobile terminal capture audio and video streams in real time;
(2) the captured audio and video streams are then encoded;
(3) the encoded data is packetized and sent to the server in real time over a push-streaming protocol;
(4) the server distributes the data packets in real time over a streaming protocol;
(5) the viewing terminal requests the packets over the live-streaming protocol, then decodes and plays them.
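The five steps above can be sketched end to end as a toy pipeline. All of the function names and the packet format here are illustrative stand-ins, not a real capture or encoder API:

```python
# A minimal sketch of the five-step live-streaming pipeline.
# Function names and packet shapes are illustrative, not a real API.

def capture_frames():
    """Step 1: pretend to capture raw frames from a camera/microphone."""
    return [{"ts": i * 40, "raw": f"frame-{i}"} for i in range(3)]

def encode(frame):
    """Step 2: stand-in for a real encoder such as x264 or faac."""
    return {"ts": frame["ts"], "data": frame["raw"].encode()}

def packetize(encoded):
    """Step 3: wrap encoded data into protocol packets for pushing."""
    return {"ts": encoded["ts"], "payload": encoded["data"]}

def server_distribute(packets, viewers):
    """Step 4: the server fans each packet out to every connected viewer."""
    for pkt in packets:
        for v in viewers:
            v.append(pkt)

def play(viewer_buffer):
    """Step 5: a viewer 'decodes' and plays packets in timestamp order."""
    return [pkt["payload"].decode()
            for pkt in sorted(viewer_buffer, key=lambda p: p["ts"])]

packets = [packetize(encode(f)) for f in capture_frames()]
viewer = []
server_distribute(packets, [viewer])
print(play(viewer))  # → ['frame-0', 'frame-1', 'frame-2']
```

In a real system each step is a separate component (encoder, protocol stack, CDN, player), but the data flow is exactly this chain.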

Live Streaming Architecture

A live-streaming system mainly consists of three parts:

(1) Capturing, processing, and pushing the audio/video data
The main work in this part: capture the audio and video data, process the video (e.g., beautification filters), encode the audio and video, and finally send the data packets to the server over a push-streaming protocol, completing the push process.
Common technologies for this part: GPUImage for beautification, faac for audio encoding, x264 for video encoding, and librtmp as the push-streaming framework.
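When pushing over RTMP (as librtmp does), encoded frames are carried in FLV-style tags. Below is a sketch of that tag layout; the field sizes follow the FLV container format, while the payload bytes are dummy data:

```python
import struct

# Sketch of the FLV tag framing used for RTMP-based pushing.
# Field sizes follow the FLV container format; the payload is dummy data.

AUDIO_TAG, VIDEO_TAG = 8, 9  # FLV tag types (18 = script data)

def flv_tag(tag_type: int, timestamp_ms: int, payload: bytes) -> bytes:
    """Build one FLV tag: 11-byte header + payload + 4-byte PreviousTagSize."""
    size = len(payload)
    header = bytes([
        tag_type,
        (size >> 16) & 0xFF, (size >> 8) & 0xFF, size & 0xFF,  # DataSize (24-bit)
        (timestamp_ms >> 16) & 0xFF,                            # Timestamp (24-bit)
        (timestamp_ms >> 8) & 0xFF, timestamp_ms & 0xFF,
        (timestamp_ms >> 24) & 0xFF,                            # TimestampExtended
        0, 0, 0,                                                # StreamID, always 0
    ])
    prev_tag_size = struct.pack(">I", 11 + size)  # header + payload length
    return header + payload + prev_tag_size

tag = flv_tag(VIDEO_TAG, 40, b"\x17\x00encoded-nalu")
print(len(tag))  # 11-byte header + 14-byte payload + 4 = 29
```

The pusher produces a sequence of such tags (video and audio interleaved by timestamp) and hands them to the RTMP layer for delivery to the server.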

(2) Relaying, storing, and distributing the audio/video data on the server
The server's main work: relaying, storing, and distributing the content data (i.e., acting as a CDN). The server can also take screenshots of the stream to show the broadcaster's preview image, add watermarks to the audio and video, and transcode in real time. It can record the video as well. Common server technologies: SRS and Nginx.
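The server's relay-and-distribute role can be modeled as a simple fan-out: one publisher's packets are copied to every current subscriber, with a recording buffer on the side. The class and method names below are illustrative, not SRS or Nginx APIs:

```python
# Toy model of a relay server (the core of a CDN edge): one publisher's
# packets are fanned out to all subscribers and also recorded.
# Names are illustrative, not SRS/Nginx APIs.

class RelayServer:
    def __init__(self):
        self.subscribers = []  # one buffer (list) per connected viewer
        self.recording = []    # the server can also record the stream

    def subscribe(self):
        """A viewer connects; it only receives packets published afterwards."""
        buf = []
        self.subscribers.append(buf)
        return buf

    def publish(self, packet):
        """The broadcaster pushes a packet: store it, then distribute it."""
        self.recording.append(packet)
        for buf in self.subscribers:
            buf.append(packet)

server = RelayServer()
a, b = server.subscribe(), server.subscribe()
for pkt in ("pkt-0", "pkt-1"):
    server.publish(pkt)
print(a == b == ["pkt-0", "pkt-1"])  # → True: every viewer sees the same stream
```

A real CDN adds many layers (origin/edge hierarchy, caching, transcoding), but distribution is fundamentally this publish/subscribe fan-out.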

(3) Pulling, decoding, and playing the audio/video stream
The player first demuxes the stream, then decodes and plays the audio and video. Live rooms usually also include gift effects and interactive chat. Common technologies: the FFmpeg decoding library, VLC, ijkplayer, and MediaCodec / MediaPlayer.
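The demux step mentioned above splits the interleaved stream into separate audio and video queues, each of which then goes to its own decoder in timestamp order. A minimal sketch, with an illustrative packet shape rather than a real container format:

```python
# Sketch of the player's demux step: split an interleaved packet stream
# into separate audio and video queues for their respective decoders.
# The packet shape here is illustrative.

def demux(stream):
    """Route each packet to the audio or video queue by its type."""
    audio, video = [], []
    for pkt in stream:
        (audio if pkt["type"] == "audio" else video).append(pkt)
    return audio, video

stream = [
    {"type": "video", "ts": 0},
    {"type": "audio", "ts": 0},
    {"type": "video", "ts": 40},
    {"type": "audio", "ts": 23},
]
audio_q, video_q = demux(stream)
print([p["ts"] for p in video_q])  # → [0, 40]
```

After demuxing, the player decodes each queue and synchronizes playback using the packet timestamps (typically audio drives the clock).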

Thanks for reading! More exciting content to come!


Source: blog.csdn.net/weixin_40763897/article/details/95243413