JSMpeg live video demo practice

Technology stack
● JSMpeg
● Node server
● Node package: node-rtsp-stream
● macOS environment: install ffmpeg (brew install ffmpeg)
1. JSMpeg

  1. What is JSMpeg?

JSMpeg: MPEG1 Video and MP2 Audio Codec in JavaScript
JSMpeg is a video player written in JavaScript. It consists of an MPEG-TS demultiplexer, MPEG1 video and MP2 audio decoders, WebGL and Canvas2D renderers, and Web Audio sound output. JSMpeg can load static videos via Ajax and allows low-latency streaming (~50ms) over WebSockets.

JSMpeg can decode 720p video at 30fps on an iPhone 5S, works in any modern browser (Chrome, Firefox, Safari, Edge), and comes in at just 20KB gzipped.

<script src="jsmpeg.min.js"></script>
<div class="jsmpeg" data-url="video.ts"></div>
  2. Usage
    A JSMpeg video player can be created in HTML by giving the container the CSS class jsmpeg:
<div class="jsmpeg" data-url="<url>"></div>
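Per the JSMpeg README, the player options can also be passed as data attributes on the container (camelCased option names become hyphenated); a sketch:

```html
<div class="jsmpeg" data-url="video.ts" data-loop="true" data-autoplay="true"></div>
```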

Or call the JSMpeg.Player() constructor directly in JavaScript:
var player = new JSMpeg.Player(url [, options]);
Note that using the HTML element (internally JSMpeg.VideoElement) provides some features on top of JSMpeg.Player, namely an SVG pause/play button and the ability to unlock audio on iOS devices.

The url parameter accepts the URL of an MPEG .ts file or of a WebSocket server (ws://…). The options parameter supports the following properties:
● canvas - HTML canvas element for video rendering. If not given, the renderer will create its own Canvas element.
● loop - whether to loop the video (static files only). Default false
● autoplay - whether to start playing immediately (only for static files). Default false
● audio - whether to decode audio. Default is true
● video - whether to decode video. Defaults to true
● poster - URL of an image to display as a poster before the video plays.
● pauseWhenHidden - Whether to pause playback when the tab is inactive, defaults to true. Note that browsers usually restrict JS in inactive tabs
● disableGl - Whether to disable WebGL and always use the Canvas2D renderer. The default is false.
● disableWebAssembly - whether to disable WebAssembly and always use the JavaScript decoder. Default false
● preserveDrawingBuffer - whether to use preserveDrawingBuffer to create the WebGL context - required for "screenshots" via canvas.toDataURL(). The default is false.
● progressive - whether to load data in chunks (static files only). When enabled, playback can start before the entire source has fully loaded. Defaults to true.
● throttled - when using progressive, whether to delay loading chunks when they don't need to be played. Defaults to true.
● chunkSize - when using progressive, the chunk size (in bytes) to load at one time. The default is 1024*1024 (1MB).
● decodeFirstFrame - Whether to decode and display the first frame of the video. Used to set the canvas size and use the frame as a "poster" image. This has no effect when using autoplay or streaming sources. Defaults to true.
● maxAudioLag – When streaming, the maximum length of audio to queue (in seconds).
● videoBufferSize – when streaming, the size (in bytes) of the video decoding buffer. The default is 512*1024 (512KB). For very high bitrates, you may have to increase this value.
● audioBufferSize – the size in bytes of the audio decoding buffer when streaming. The default is 128*1024 (128kb). For very high bitrates, you may have to increase this value.
● onVideoDecode(decoder, time) – callback called after every decoded and rendered video frame
● onAudioDecode(decoder, time) – callback called after every decoded audio frame
● onPlay(player) – callback called when playback starts
● onPause(player) – callback called when playback is paused (for example, when .pause() is called or the source ends)
● onEnded(player) – callback called when playback reaches the end of the source (only called if loop is false)
● onStalled(player) – callback called when there is not enough data to play
● onSourceEstablished(source) – callback called when the source receives data for the first time
● onSourceCompleted(source) - a callback called when the source has received all data
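A minimal sketch of the constructor with explicit options: the option names come from the list above, while the websocket URL and the canvas id "video" are assumptions for this example.

```javascript
// Options object for JSMpeg.Player (property names from the list above)
const playerOptions = {
  autoplay: true,
  videoBufferSize: 1024 * 1024, // double the 512KB default for a high-bitrate stream
  onPlay: (player) => console.log("playback started"),
  onStalled: (player) => console.log("not enough data, waiting"),
};

// In the browser, with jsmpeg.min.js loaded:
if (typeof window !== "undefined" && window.JSMpeg) {
  const player = new window.JSMpeg.Player("ws://localhost:9998/", {
    canvas: document.getElementById("video"), // render into an existing canvas
    ...playerOptions,
  });
}
```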

  3. Demo practice
    1. HTML file
// In the index.html file
<script type="text/javascript" src="https://jsmpeg.com/jsmpeg.min.js"></script>
    2. Implementation
// In the index.jsx file
import React, { useEffect } from "react";

import styles from "./index.module.less";

const JSMpegCom = () => {
  useEffect(() => {
    const canvas = document.getElementById("video");
    const url = "ws://" + document.location.hostname + ":9998/";
    const player = new window.JSMpeg.Player(url, {
      canvas,
      disableWebAssembly: true,
      disableGl: true,
      autoplay: true,
      loop: true
    });
    player.play();
    // Tear the player down when the component unmounts
    return () => player.destroy();
  }, []);

  return (
    <div className={styles["containers"]}>
      <h1>Live Video</h1>
      <canvas id="video" className={styles["video"]}>
        Canvas is not supported in this browser.
      </canvas>
    </div>
  );
};

export default JSMpegCom;
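The preserveDrawingBuffer option listed earlier keeps the WebGL drawing buffer readable, which is what makes "screenshots" via canvas.toDataURL() possible. The helper below is a hypothetical example, not part of the JSMpeg API; the websocket URL and canvas id are assumptions.

```javascript
// Capture the currently rendered frame as a PNG data URL.
// Works only if the player was created with preserveDrawingBuffer: true.
function captureFrame(canvas) {
  return canvas.toDataURL("image/png");
}

// Browser usage sketch, with jsmpeg.min.js loaded and a <canvas id="video"> in the page:
if (typeof window !== "undefined" && window.JSMpeg) {
  const canvas = document.getElementById("video");
  const player = new window.JSMpeg.Player("ws://localhost:9998/", {
    canvas,
    preserveDrawingBuffer: true, // required for toDataURL "screenshots"
  });
  // later, e.g. from a button handler: const png = captureFrame(canvas);
}
```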

2. The node server
node-rtsp-stream relays any RTSP stream and outputs it over a websocket for JSMpeg to play as HTML5 streaming video (requires ffmpeg).

// server.js


const Stream = require("node-rtsp-stream");

// RTSP stream address
const rtsp_url =
  "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov";

const stream = new Stream({
  name: "sockets",
  streamUrl: rtsp_url,
  wsPort: 9998,
  ffmpegOptions: {
    // ffmpeg flags
    "-stats": "", // an option without a value uses an empty string
    "-r": 30, // an option with a required value puts the value after the key
    "-s": "1920x1080" // e.g. the output resolution
  }
});
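To illustrate the ffmpegOptions convention used above (this is an illustration only, not node-rtsp-stream's actual code): a key with an empty-string value becomes a bare flag, and any other value is appended after its flag.

```javascript
// Flatten an ffmpegOptions-style object into a command-line argument list.
function toFfmpegArgs(options) {
  const args = [];
  for (const [flag, value] of Object.entries(options)) {
    args.push(flag); // the flag itself, e.g. "-stats" or "-r"
    if (value !== "") args.push(String(value)); // its value, if any
  }
  return args;
}

// toFfmpegArgs({ "-stats": "", "-r": 30 }) yields ["-stats", "-r", "30"]
```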

Run the server:

node server.js

3. Demo effect

The demo repository is on GitHub: https://github.com/coco723/blog/issues/12


Origin: blog.csdn.net/gkf6104/article/details/122642245