Because I wanted to build a watch-together feature similar to Apple's SharePlay, I have done some shallow learning on audio and video recently, and here is a brief summary.

My requirement: the simplest possible way for two people to play a video with synchronized progress and controls (so essentially no delay), with WebVTT subtitles displayed at the same time, and preferably without needing an extra server.
Video streaming solution
The main front-end video-streaming libraries are hls.js, flv.js, and dash.js. I ended up using hls.js.
hls.js
hls.js is mostly used for on-demand online video playback. The principle: slice a large video into small segments of a few seconds each, so the player only needs to keep loading small segments to play the whole video.
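The "small segments" live in an m3u8 playlist. As a minimal illustration (this is my own sketch, not hls.js internals), here is how segment durations and URIs can be read out of a media playlist like the one ffmpeg emits:

```javascript
// Minimal sketch: parse #EXTINF durations and segment URIs from an
// HLS media playlist. hls.js does this (and much more) internally.
function parseMediaPlaylist(text) {
  const lines = text.split('\n').map((l) => l.trim()).filter(Boolean);
  const segments = [];
  let pendingDuration = null;
  for (const line of lines) {
    if (line.startsWith('#EXTINF:')) {
      // "#EXTINF:6.000," -> 6
      pendingDuration = parseFloat(line.slice('#EXTINF:'.length));
    } else if (!line.startsWith('#') && pendingDuration !== null) {
      segments.push({ uri: line, duration: pendingDuration });
      pendingDuration = null;
    }
  }
  return segments;
}

// Example playlist, shaped like ffmpeg's output with -hls_time 6
const playlist = `#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.000,
video0.ts
#EXTINF:6.000,
video1.ts
#EXTINF:4.500,
video2.ts
#EXT-X-ENDLIST`;

const segments = parseMediaPlaylist(playlist);
// Total duration is simply the sum of the segment durations.
const total = segments.reduce((sum, seg) => sum + seg.duration, 0);
```

The player fetches the playlist, then requests `video0.ts`, `video1.ts`, … one by one as playback advances.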
Use ffmpeg to generate the HLS output:

```shell
ffmpeg -i video.mkv -b:v:0 12M -c:v h264_videotoolbox -c:s webvtt -c:a mp3 -map 0:v -map 0:a:0 -map 0:s:34 -f hls -var_stream_map "v:0,a:0,s:0,sgroup:subtitle,language:zh" -master_pl_name master.m3u8 -hls_time 6 -hls_playlist_type vod -muxdelay 0 video.m3u8
```
- `-b:v:0 12M` — 12 Mbps video bitrate.
- `-c:v h264_videotoolbox` — hardware-accelerated H.264 encoding on macOS. H.264 is still the mainstream codec for browsers, although HEVC support has recently started to appear.
- `-map 0:s:34` — select subtitle stream 34, which `-c:s webvtt` converts to WebVTT.
- `-var_stream_map "v:0,a:0,s:0,sgroup:subtitle,language:zh"` — group the video, audio, and subtitle streams together.
- `-hls_time 6` — slice into 6-second segments.
- `-muxdelay 0` — fixes subtitles drifting out of sync.
Finally, load master.m3u8 and it is ready to play.
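Wiring this up in the browser can be sketched as follows. The decision helper is my own; the commented calls (`Hls.isSupported()`, `loadSource`, `attachMedia`) are hls.js's public API, and the URL is a placeholder:

```javascript
// Sketch of the load step. A pure helper decides which playback path to
// take; the browser-only wiring is shown in the comment below.
function choosePlayback(hlsJsSupported, nativeHlsSupported) {
  if (hlsJsSupported) return 'hls.js';     // MSE-based playback via hls.js
  if (nativeHlsSupported) return 'native'; // e.g. Safari plays m3u8 directly
  return 'unsupported';
}

/* In the browser:
const video = document.querySelector('video');
const mode = choosePlayback(
  Hls.isSupported(),
  video.canPlayType('application/vnd.apple.mpegurl') !== ''
);
if (mode === 'hls.js') {
  const hls = new Hls();
  hls.loadSource('http://my-host/master.m3u8'); // placeholder URL
  hls.attachMedia(video);
} else if (mode === 'native') {
  video.src = 'http://my-host/master.m3u8';
}
*/
```

The native fallback matters because Safari has built-in HLS support but does not implement MSE the way hls.js needs on iOS.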
flv.js
flv.js is open-sourced by bilibili. It can be used for live video streaming, with the stream pushed via RTMP, although bilibili itself does not use it.
Use ffmpeg to push the stream:

```shell
ffmpeg -re -i video.mkv -c:v h264_videotoolbox -f flv rtmp://localhost/live/livestream
```
First push the video to a streaming-media server, then have the browser read it back from that server. `rtmp://localhost/live/livestream` is the address of that server; Node-Media-Server is the simple option, and srs is the more complex one, which can also convert RTMP streams into WebRTC streams.
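For a quick local test, Node-Media-Server can be started with roughly its README configuration. This is a sketch; the ports and options below are the library's documented defaults, not something from my setup:

```javascript
// Sketch: a minimal Node-Media-Server setup that accepts the RTMP push
// above on port 1935 and serves HTTP-FLV on port 8000 for flv.js.
// (Options follow the library's README; adjust ports as needed.)
const NodeMediaServer = require('node-media-server');

const config = {
  rtmp: {
    port: 1935,        // where `ffmpeg ... -f flv rtmp://localhost/live/livestream` pushes
    chunk_size: 60000,
    gop_cache: true,   // cache a GOP so new viewers can start playback faster
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,        // flv.js then plays http://localhost:8000/live/livestream.flv
    allow_origin: '*',
  },
};

new NodeMediaServer(config).run();
```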
dash.js
dash.js also slices video, for both live and on-demand playback. Many large companies use it, including bilibili.
WebRTC
WebRTC can also be used for video playback, but the resolution is controlled by the browser, and WebRTC is peer-to-peer. Although no media server is required, a signaling server is still needed to exchange connection information — roughly, one side creates a room number and the other person joins it, and that room number still has to be relayed through a server. I found a free option: peer.js.
My thinking
Initially I considered having the browser load the local video directly and send it to the other side over WebRTC; the subtitles could also be sent over WebRTC, so no extra steps would be needed, and WebRTC latency is low enough.

But I overlooked one thing: the videos I download are mostly HEVC HDR, so they need to be transcoded, and the subtitles need converting to WebVTT — and transcoding takes time. The browser can run Wasm, and there is a project called ffmpeg.wasm, so decoding HEVC in the browser seemed feasible. In practice, a video of several hundred MB just throws errors and cannot be decoded at all.
Since the browser cannot decode it, my next idea was to decode locally and push an RTMP stream, play it with flv.js, and then forward the video stream from the browser over WebRTC. However, the video tag has no captureStream method here, so the video stream cannot be handed to WebRTC. Good guy, think again: draw the video onto a canvas and then call captureStream on the canvas. Another problem appears: canvas does not support HDR. The whole thing got quite messy.
The cleanest WebRTC solution is to use SRS to convert RTMP to WebRTC, and then play it in the browser over WebRTC.
After some thought, I still chose the hls.js solution. On my M2, transcoding a 1-hour 1080p video takes about 6 minutes, which is acceptable. The main reason is that my machine has a dynamic public IP, so my partner can simply access my machine directly and play the video with hls.js, which is perfectly smooth; the playback operations are then transmitted over WebRTC to keep the two sides in sync.
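The sync part can be sketched as a tiny message protocol over the WebRTC data channel (the names here are my own, not from any library; with peer.js the serialized message would go through `conn.send(...)` on a DataConnection):

```javascript
// Minimal sketch of the playback-sync protocol: serialize local player
// actions, apply remote ones, and ignore echoes by tagging each message
// with a sender id.
function makeSyncMessage(senderId, type, time) {
  // type: 'play' | 'pause' | 'seek'
  return JSON.stringify({ senderId, type, time });
}

function applySyncMessage(player, selfId, raw) {
  const msg = JSON.parse(raw);
  if (msg.senderId === selfId) return false; // our own echo, ignore
  if (msg.type === 'seek' || msg.type === 'play') player.currentTime = msg.time;
  if (msg.type === 'play') player.playing = true;
  if (msg.type === 'pause') player.playing = false;
  return true;
}

// A plain object stands in for the <video> element here; in the browser
// the same idea maps onto HTMLVideoElement (currentTime, play()/pause()).
const player = { currentTime: 0, playing: false };
applySyncMessage(player, 'me', makeSyncMessage('peer', 'play', 42.5));
```

Sending the current time along with 'play' lets the receiver seek and resume in one step, which keeps the two players from drifting apart after a pause.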
So I keep learning and growing while tinkering.