Building a simple live video system in practice
Implemented with ffmpeg + Nginx + the nginx-rtmp module + VLC
1. Push stream (host side)
FFmpeg is a powerful open-source multimedia framework for processing audio, video, and images. It can also be used as a streaming tool to push local video or audio streams to a remote server.
1. Camera streaming
1.1. Obtain the camera list
To stream from a camera with ffmpeg, you first need the camera's device name, which you can obtain with the following commands:
- Windows
ffmpeg -list_devices true -f dshow -i dummy
This command lists all audio and video devices (including webcams) available on Windows. Here, -list_devices true tells ffmpeg to list the available devices, -f dshow selects the DirectShow framework to access devices, and -i dummy supplies a dummy input (ffmpeg requires an input argument even when only listing devices).
- macOS
ffmpeg -f avfoundation -list_devices true -i ""
This command lists all audio and video devices (including cameras) available on macOS. Here, -f avfoundation selects the AVFoundation framework to access devices, -list_devices true tells ffmpeg to list the available devices, and -i "" supplies an empty dummy input.
1.2. Camera streaming
After getting the name of the camera device, you can use ffmpeg to push the stream. The following is an example command for camera streaming using ffmpeg:
ffmpeg -f dshow -i video="USB Video Device" -vcodec libx264 -preset ultrafast -tune zerolatency -f flv rtmp://server/live/stream_key
In the above command, -f dshow selects the DirectShow framework to access the device (on macOS, use -f avfoundation instead). -i video="USB Video Device" specifies the camera device name. -vcodec libx264 encodes the video with the H.264 encoder, -preset ultrafast selects the fastest encoding preset, -tune zerolatency tunes the encoder for minimal latency, -f flv sets the output format to FLV, and rtmp://server/live/stream_key is the destination for the stream.
In rtmp://server/live/stream_key, server is the address of the streaming server, live is the application name, and stream_key is the stream key. Before using this command, a streaming media server that supports the RTMP protocol, such as Nginx with nginx-rtmp or Wowza Streaming Engine, must be installed on the server.
After executing this command, ffmpeg captures the video stream from the camera and pushes it to the specified server. Viewers can then watch the live content through the streaming server's playback address, for example in a browser-based HLS player.
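On macOS, the equivalent push command uses the AVFoundation framework. The sketch below is illustrative: the device index "0", the frame rate, and the server/stream key are assumptions and should be replaced with values from your own device listing and server setup.

```shell
# Capture from AVFoundation video device 0 (replace with your device index or name)
# and push to the RTMP server; server and stream_key are placeholders.
ffmpeg -f avfoundation -framerate 30 -i "0" \
  -vcodec libx264 -preset ultrafast -tune zerolatency \
  -f flv rtmp://server/live/stream_key
```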
2. Video file streaming
To stream a video file with ffmpeg, use the following command:
ffmpeg -re -i input.mp4 -c:v copy -c:a copy -f flv rtmp://streaming_server_address/stream_key
Here, input.mp4 is the video file to stream, streaming_server_address is the address of the target streaming media server, and stream_key is the stream key.
Each parameter in the command means the following:
- -re: read the input at its native frame rate, simulating real-time streaming;
- -i input.mp4: the input video file to stream;
- -c:v copy: copy the video stream without re-encoding;
- -c:a copy: copy the audio stream without re-encoding;
- -f flv: set the output format to FLV;
- rtmp://streaming_server_address/stream_key: the streaming server address and key.
Please note that this is just a simple example command and needs to be modified according to the actual situation. For example, video and audio encoding parameters may need to be adjusted to suit the streaming server requirements.
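For example, if the file's codecs are not compatible with RTMP/FLV delivery (which generally expects H.264 video and AAC audio), the streams can be re-encoded instead of copied. This is a sketch; the bitrates and destination URL are assumptions to adjust for your server:

```shell
# Re-encode to H.264 video and AAC audio while streaming at the native frame rate.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -b:v 2500k \
  -c:a aac -b:a 128k \
  -f flv rtmp://streaming_server_address/stream_key
```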
Besides the FFmpeg command-line tool, graphical tools such as OBS Studio can also be used for streaming.
2. Server side
Server program: Nginx + nginx-rtmp module
1. Compile and deploy nginx-rtmp
This deployment method is applicable to Linux-based system deployment.
To deploy the live streaming server with nginx-rtmp, follow the steps below:
- Install Nginx
First, you need to install Nginx, which can be installed on Ubuntu with the following command:
sudo apt-get update
sudo apt-get install nginx
- Download and compile nginx-rtmp module
Next, download and compile the nginx-rtmp module with the following commands:
sudo apt-get install build-essential libpcre3 libpcre3-dev libssl-dev
sudo apt-get install zlib1g-dev
cd /usr/src
sudo git clone https://github.com/arut/nginx-rtmp-module.git
sudo wget http://nginx.org/download/nginx-1.18.0.tar.gz
sudo tar -zxvf nginx-1.18.0.tar.gz
cd nginx-1.18.0
sudo ./configure --with-http_ssl_module --add-module=/usr/src/nginx-rtmp-module
sudo make
sudo make install
These commands download the nginx-rtmp module and the Nginx source code, then compile Nginx with the module built in and install it.
- Configure Nginx
Next, Nginx needs to be configured to use the nginx-rtmp module. The default Nginx configuration file can be edited with the following command:
sudo nano /usr/local/nginx/conf/nginx.conf
Add the rtmp block below at the top level of the configuration file (as a sibling of the http block, not inside it), and add the server shown to the http block:
rtmp {
    server {
        listen 1935;        # RTMP listening port
        chunk_size 4096;

        application live {
            live on;
            record off;
            allow publish all;
            allow play all;
            push rtmp://localhost:1935/hls;
        }
    }
}

http {
    server {
        listen 8080;

        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /var/www/html;
            add_header Cache-Control no-cache;
            add_header Access-Control-Allow-Origin *;
        }
    }
}
These settings enable the nginx-rtmp module and accept RTMP streams on the default RTMP port (1935). An application named live is defined with live streaming enabled and recording disabled. See the example nginx.conf in the nginx-rtmp-module repository for detailed configuration options.
In this configuration, Nginx uses the RTMP module and the HTTP module together to accept the video stream and distribute it to viewers: the RTMP module accepts the pushed stream, and the HTTP module serves HLS (HTTP Live Streaming) content for viewers to watch.
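Note that the push rtmp://localhost:1935/hls; directive above forwards the stream to an application named hls that is not shown. A minimal sketch of such an application, assuming hls_path sits under the HTTP root that serves the /hls location (the fragment and playlist lengths are illustrative defaults, not values from the original configuration):

```
application hls {
    live on;
    hls on;                       # generate HLS segments from the incoming stream
    hls_path /var/www/html/hls;   # must match the root serving the /hls location
    hls_fragment 3s;
    hls_playlist_length 60s;
}
```

This block belongs inside the rtmp server block, alongside the live application.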
- Restart Nginx
After editing the configuration file, you need to restart Nginx for the changes to take effect:
sudo /usr/local/nginx/sbin/nginx -s stop
sudo /usr/local/nginx/sbin/nginx
With these steps, the live streaming server is deployed using Nginx and the nginx-rtmp module. Note that the paths and parameters in these commands may need to be adjusted to your environment.
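Before and after restarting, it can help to validate the configuration and confirm the server is listening. The port numbers below assume the configuration from the previous step (1935 for RTMP, 8080 for HLS):

```shell
# Test the configuration file without starting the server,
# then check that the RTMP and HTTP ports are listening.
sudo /usr/local/nginx/sbin/nginx -t
ss -tln | grep -E ':1935|:8080'
```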
2. Docker container deployment nginx-rtmp
This deployment method is applicable to Windows-based system deployment.
docker-compose.yml
The content of the file configuration is as follows:
version: "3.5"
services:
nginx-rtmp:
container_name: nginx-rtmp
image: tiangolo/nginx-rtmp
ports:
- "1935:1935"
restart: always
Start the container by running docker-compose from the directory containing the docker-compose.yml file:
- Normal start:
docker-compose up
- Background start:
docker-compose up -d
The Nginx configuration inside the container can follow the same settings used in the compile-and-deploy section above.
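To apply a custom nginx.conf to the container, the file can be mounted over the image's default configuration. This is a sketch: the container-side config path and the extra HLS port mapping are assumptions for the tiangolo/nginx-rtmp image and should be verified against that image's documentation.

```yaml
version: "3.5"
services:
  nginx-rtmp:
    container_name: nginx-rtmp
    image: tiangolo/nginx-rtmp
    ports:
      - "1935:1935"
      - "8080:8080"          # expose the HLS HTTP port as well
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro   # container path is an assumption
    restart: always
```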
3. Streaming (viewer side)
Web-based
You can use an HLS player (such as hls.js, Video.js, JWPlayer, or Plyr) to watch the live video in a browser, provided the server exposes the stream over the HLS (HTTP Live Streaming) protocol.
Desktop-based
Desktop players such as VLC can be used to watch the stream.
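VLC can open the stream directly from the command line. The URLs below assume the server setup from the previous section (RTMP on port 1935, HLS on port 8080) and a hypothetical stream key; replace both with your own values:

```shell
# Watch via RTMP (lower latency) or via HLS (the browser-compatible protocol)
vlc rtmp://server/live/stream_key
vlc http://server:8080/hls/stream_key.m3u8
```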