Building a simple iOS live video feature (real-time audio/video capture and encoding + RTMP transmission + real-time stream decoding and playback)

Push end : Developing it all yourself is very difficult; my live video stuttered and the audio fell out of sync. In the end I used an open-source framework from GitHub.

               The basic flow for native development: AVFoundation captures the video stream, yielding unencoded CMSampleBuffers, which then have to be compressed into H.264 (MPEG-4 AVC). Encoding can be done in software or in hardware; since VideoToolBox became a public API in iOS 8, I chose it for hardware encoding. The downside is that it has almost no documentation. There is an Objective-C wrapper on GitHub called VideoToolBoxPlus, address: https://github.com/McZonk/VideoToolboxPlus , which got me past the hard-encoding hurdle.

               For pushing the stream I chose RTMP, using the third-party library libRTMP, which compiled successfully; the build steps are in a previous note. The server is built with Nginx plus the RTMP module, address: http://www.henishuo.com/mac_nginx_rtmp_server/?utm_source=tuicool&utm_medium=referral .

               Where things finally died was the RTMP transmission itself. RTMP carries video as H.264, and the encoded CMSampleBuffer has to be repackaged for it: extract the SPS and PPS parameter sets and the video NALUs from the CMSampleBuffer, then send the NALUs to the server over RTMP (a rough sketch of this encode-and-extract flow follows below). The end result: the server received the data but could not parse the video format correctly. My analysis is that either the NALU assembly was wrong or a wrong value was passed in the RTMP calls. That was the end of native development, after more than a week.
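
A minimal sketch of that hard-encode + extract flow (not my original code; it assumes iOS 8+ VideoToolbox, and the actual libRTMP send is left as comments):

```objc
#import <VideoToolbox/VideoToolbox.h>
#import <CoreMedia/CoreMedia.h>
#include <string.h>

static VTCompressionSessionRef gSession;

// Called by VideoToolbox with each encoded frame (AVCC-format H.264).
static void EncodeCallback(void *refCon, void *frameRefCon, OSStatus status,
                           VTEncodeInfoFlags flags, CMSampleBufferRef sampleBuffer) {
    if (status != noErr || sampleBuffer == NULL) return;

    // SPS/PPS live in the format description, not in the data buffer.
    CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
    const uint8_t *sps = NULL, *pps = NULL;
    size_t spsSize = 0, ppsSize = 0;
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(fmt, 0, &sps, &spsSize, NULL, NULL);
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(fmt, 1, &pps, &ppsSize, NULL, NULL);
    // -> package sps/pps as the stream's AVC sequence header and send it first.

    // The block buffer holds NALUs, each prefixed with a 4-byte big-endian length.
    CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sampleBuffer);
    char *data = NULL;
    size_t total = 0;
    CMBlockBufferGetDataPointer(block, 0, NULL, &total, &data);
    size_t offset = 0;
    while (offset + 4 < total) {
        uint32_t nalLen = 0;
        memcpy(&nalLen, data + offset, 4);
        nalLen = CFSwapInt32BigToHost(nalLen);   // AVCC lengths are big-endian
        // data + offset + 4 is one NALU of nalLen bytes -> hand it to the RTMP layer.
        offset += 4 + nalLen;
    }
}

static void SetupEncoder(int32_t width, int32_t height) {
    VTCompressionSessionCreate(kCFAllocatorDefault, width, height,
                               kCMVideoCodecType_H264, NULL, NULL, NULL,
                               EncodeCallback, NULL, &gSession);
    VTSessionSetProperty(gSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(gSession);
}

// Feed each CMSampleBuffer from the AVCaptureVideoDataOutput delegate like so:
static void EncodeFrame(CMSampleBufferRef captured) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(captured);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(captured);
    VTCompressionSessionEncodeFrame(gSession, pixelBuffer, pts,
                                    kCMTimeInvalid, NULL, NULL, NULL);
}
```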

               VideoCore: address: https://github.com/jgh-/VideoCore . After much searching I finally found this framework, but once it came down via CocoaPods it would not compile, and its last update was six months ago. Sina maintains a fork, but I could not get it working either. Address: https://github.com/sinacloud/VideoCore . There is another fork that appears to be from Tencent; judging by the commit messages some bugs have been fixed, and the last commit was only a month ago, so it is still being maintained. Worth a try: https://github.com/goodow/VideoCore .

               GDLiveStreaming : This is the third-party framework I settled on. Address: https://github.com/goodow/GDLiveStreaming . It is said to come from Tencent Live's audio/video capture and RTMP push code; it exposes a pure Objective-C interface and is easy to use. Tutorial: http://www.jianshu.com/p/83da490c0f95 . Testing against the local server built above, the live-broadcast latency is about 4 s, and it is genuinely open source, so you can study how it works yourself. Plenty of "live SDKs" on the Internet get posted to GitHub, but look closer and they are nothing but precompiled .a libraries!!! A real trap. This one is quite conscientious, and good for learning too.


Pull end : The FFmpeg + KxMovie setup introduced in previous notes works fine. I also found that Bilibili has an open-source project called ijkplayer, which looks very good. Address: https://github.com/Bilibili/ijkplayer .
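
For reference, a minimal sketch of pulling and playing the stream with ijkplayer, assuming IJKMediaFramework has been built and linked per its README; the RTMP URL below is illustrative (the local test server):

```objc
#import <UIKit/UIKit.h>
#import <IJKMediaFramework/IJKMediaFramework.h>

@interface PlayerViewController : UIViewController
@property (nonatomic, strong) IJKFFMoviePlayerController *player;
@end

@implementation PlayerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Illustrative stream address on the local Nginx+RTMP server.
    NSURL *url = [NSURL URLWithString:@"rtmp://192.168.1.100/live/demo"];
    IJKFFOptions *options = [IJKFFOptions optionsByDefault];
    self.player = [[IJKFFMoviePlayerController alloc] initWithContentURL:url
                                                             withOptions:options];
    self.player.view.frame = self.view.bounds;
    self.player.shouldAutoplay = YES;
    [self.view addSubview:self.player.view];
    [self.player prepareToPlay];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.player shutdown];   // release the decoder and network resources
}

@end
```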

Final result : Using GDLiveStreaming + (Nginx+RTMP) + (FFmpeg+KxMovie), a simple live video feature works end to end: one phone pushes the stream, the server relays it, and another phone pulls the stream to watch the broadcast.
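
The server side of this pipeline is just a small RTMP block in nginx.conf (a minimal sketch, assuming the nginx-rtmp-module built per the tutorial linked above; the application name "live" is illustrative):

```nginx
rtmp {
    server {
        listen 1935;              # default RTMP port
        chunk_size 4096;

        application live {        # push to rtmp://<server-ip>/live/<stream-name>
            live on;              # enable one-to-many live broadcasting
            record off;           # do not save pushed streams to disk
        }
    }
}
```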
