A brief summary of my recent look at video playback

Because I want to build a watch-together feature similar to Apple's SharePlay, I have done some shallow learning on audio and video recently, and I will briefly summarize and record it here.

My requirement is the simplest possible solution for two people to play a video together: progress and controls stay synchronized with essentially no delay, WebVTT subtitles can be displayed, and preferably no extra server is needed.

Video streaming solution

The main front-end video streaming libraries are hls.js, flv.js, and dash.js. I ended up using hls.js.

hls.js

hls.js is mostly used for on-demand online video playback. The principle is to slice a large video into small segments of a few seconds, so the player only needs to keep loading small segments to play the whole video.

Use ffmpeg to generate the HLS playlist and segments:

ffmpeg -i video.mkv -b:v:0 12M -c:v h264_videotoolbox -c:s webvtt -c:a mp3 -map 0:v -map 0:a:0 -map 0:s:34 -f hls -var_stream_map "v:0,a:0,s:0,sgroup:subtitle,language:zh" -master_pl_name master.m3u8 -hls_time 6 -hls_playlist_type vod -muxdelay 0 video.m3u8 
  • -b:v:0 12M: 12M video bitrate
  • h264_videotoolbox: hardware-accelerated H.264 encoding on macOS. H.264 is still the mainstream codec in browsers, although HEVC support has recently started to appear.
  • -map 0:s:34: take subtitle stream index 34 (the -map selectors are 0-based) and convert it to WebVTT format
  • -var_stream_map "v:0,a:0,s:0,sgroup:subtitle,language:zh": group the video, audio, and subtitle streams
  • -hls_time 6: slice into 6-second segments
  • -muxdelay 0: fixes subtitles being out of sync

Finally, load master.m3u8 and it is ready to play.
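A minimal playback setup might look like the sketch below. The element id and the manifest URL are placeholders, not anything from hls.js itself; the library's real API calls used are Hls.isSupported(), loadSource(), and attachMedia().

```javascript
// Safari can play HLS natively via <video src>; other browsers need hls.js (MSE).
function pickStrategy(canPlayNativeHls, mseSupported) {
  if (canPlayNativeHls) return 'native';
  if (mseSupported) return 'hls.js';
  return 'unsupported';
}

// Browser wiring (guarded so the logic above stays testable outside a browser).
if (typeof document !== 'undefined' && typeof Hls !== 'undefined') {
  const video = document.getElementById('player');   // <video id="player"> (placeholder)
  const src = 'http://localhost:8080/master.m3u8';   // hypothetical URL
  const strategy = pickStrategy(
    video.canPlayType('application/vnd.apple.mpegurl') !== '',
    Hls.isSupported()
  );
  if (strategy === 'native') {
    video.src = src;
  } else if (strategy === 'hls.js') {
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
  }
}
```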

flv.js

flv.js is open-sourced by bilibili. It can be used for live video streaming: the stream is pushed to a server over RTMP and played back in the browser over HTTP-FLV. bilibili itself does not use this anymore, though.

Use ffmpeg to push the stream:

ffmpeg -re -i video.mkv -c:v h264_videotoolbox -f flv rtmp://localhost/live/livestream 

First push the video to a streaming media server, then have the browser read the video from that server.

rtmp://localhost/live/livestream is the address on the streaming media server. A simple option is Node-Media-Server; a more full-featured one is SRS, which can also convert RTMP streams into WebRTC streams.
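A minimal Node-Media-Server setup is roughly the config below. The port numbers are the library's documented defaults; the commented-out lines show how it is started (left as comments so the config stands alone).

```javascript
// Sketch of a local RTMP server config for Node-Media-Server (assumed defaults).
const config = {
  rtmp: {
    port: 1935,        // ffmpeg pushes to rtmp://localhost/live/livestream
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,        // flv.js would pull http://localhost:8000/live/livestream.flv
    allow_origin: '*',
  },
};

// To run it:
// const NodeMediaServer = require('node-media-server');
// new NodeMediaServer(config).run();
```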

dash.js

dash.js also plays sliced video (MPEG-DASH), for both live and on-demand. Many major companies use it, including bilibili.

WebRTC

WebRTC can also be used for video playback, but the resolution is controlled by the browser, and WebRTC is peer-to-peer. Although no media server is required, a signaling server is still needed to exchange connection information: it is as if one person creates a room number and the other joins the room, and that room number still has to be passed through a server.

peer.js offers a free hosted signaling plan, so I used that.
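Pairing two browsers with PeerJS might look like the sketch below. The room-id format is my own convention, not part of PeerJS; the real API calls used are new Peer(), peer.on('open'/'connection'), and connect().

```javascript
// Generate a short human-readable room id (my own convention, not PeerJS's).
function makeRoomId(len = 6) {
  const chars = 'abcdefghjkmnpqrstuvwxyz23456789'; // avoids look-alike characters
  let id = '';
  for (let i = 0; i < len; i++) id += chars[Math.floor(Math.random() * chars.length)];
  return id;
}

// Browser wiring (guarded so the helper stays testable outside a browser).
if (typeof Peer !== 'undefined') {
  // Host: create a peer under a known id and wait for the partner to connect.
  const peer = new Peer(makeRoomId());
  peer.on('open', id => console.log('share this room id:', id));
  peer.on('connection', conn => conn.on('data', msg => console.log('got', msg)));

  // Guest side would be:
  // const conn = new Peer().connect('<room id>');
  // conn.on('open', () => conn.send('hi'));
}
```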

My thinking

Initially, my idea was that the browser loads the local video directly and transmits it to the other side through WebRTC. The subtitles could also go over WebRTC, so no extra steps would be needed, and WebRTC latency is low enough.

But one thing I had not considered: the videos I download are basically HEVC HDR, so they need transcoding, and the subtitles need converting to WebVTT. Transcoding takes time. The browser can run Wasm, and there is a project called ffmpeg.wasm which in principle can decode HEVC in the browser. In practice, a video of several hundred MB just throws an error and cannot be decoded at all.

Since the browser cannot decode it, the next idea was to decode and push an RTMP stream locally, play it with flv.js, and then forward the video stream from the browser through WebRTC. However, the video tag playing the flv.js stream had no usable captureStream method, so the stream could not be handed to WebRTC. Fine, think again: play the video through a canvas instead and call captureStream on the canvas. Then the next problem appears: canvas does not support HDR, so the picture comes out wrong. Quite frustrating.
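The canvas workaround described above would be wired up roughly as below. The element id is a placeholder, and as noted, the canvas path drops HDR; the real APIs used are drawImage(), requestAnimationFrame(), and canvas.captureStream().

```javascript
// Scale down to at most maxW wide, preserving aspect ratio (never upscale).
function fitSize(srcW, srcH, maxW) {
  const scale = Math.min(1, maxW / srcW);
  return { w: Math.round(srcW * scale), h: Math.round(srcH * scale) };
}

// Browser wiring (guarded so the helper stays testable outside a browser).
if (typeof document !== 'undefined') {
  const video = document.getElementById('player'); // placeholder id
  const { w, h } = fitSize(3840, 2160, 1920);      // e.g. downscale 4K to 1080p
  const canvas = document.createElement('canvas');
  canvas.width = w;
  canvas.height = h;
  const ctx = canvas.getContext('2d');

  // Redraw the current video frame on every animation frame while playing.
  function draw() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(draw);
  }
  video.addEventListener('play', draw);

  // 30 fps MediaStream; could be handed to RTCPeerConnection.addTrack().
  const stream = canvas.captureStream(30);
}
```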

The cleanest WebRTC route is to use SRS to convert RTMP to WebRTC and then play it in the browser over WebRTC.

After some thought, I still chose the hls.js solution. On my M2, transcoding a 1-hour 1080p video takes about 6 minutes, which is acceptable. The main point is that my machine has a dynamic public IP, so my partner can access my machine directly and play with hls.js, which is perfectly smooth, and WebRTC is then used only to transmit playback operations for synchronization.
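Syncing playback operations over a data channel can be sketched as below. The message shape is my own convention, not from any library; the wiring comments assume `conn` is an open PeerJS DataConnection.

```javascript
// Encode a play/pause/seek operation as a small JSON message.
function encodeControl(type, currentTime) {
  return JSON.stringify({ type, t: currentTime });
}

// Apply a received control message to the local video element.
function applyControl(video, raw) {
  const msg = JSON.parse(raw);
  // Resync position only if we have drifted more than half a second.
  if (Math.abs(video.currentTime - msg.t) > 0.5) video.currentTime = msg.t;
  if (msg.type === 'play') video.play();
  if (msg.type === 'pause') video.pause();
}

// Wiring (browser side, with `conn` an open PeerJS DataConnection):
// video.addEventListener('play',   () => conn.send(encodeControl('play',  video.currentTime)));
// video.addEventListener('pause',  () => conn.send(encodeControl('pause', video.currentTime)));
// video.addEventListener('seeked', () => conn.send(encodeControl('seek',  video.currentTime)));
// conn.on('data', raw => applyControl(video, raw));
```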

So, always learning and growing while tinkering.



Origin blog.csdn.net/web2022050901/article/details/129307575