Table of contents
Playing the Hikvision camera feed on the front end
Method 1: HTMLVideoElement (obsolete)
Method 2: HTMLCanvasElement (obsolete)
Method 3: JavaCV
Requirements
The front end receives the HLS stream converted from RTSP by webrtc-streamer
Clicking the Start button starts recording for a period of time
Clicking the Stop button stops recording
Clicking Upload saves the video to the server
Implementation
Playing the Hikvision camera feed on the front end
Most surveillance equipment transmits video over RTSP, which HTML5 cannot play natively, so webrtc-streamer is used to convert and play the stream.
Record video
Method 1: HTMLVideoElement (obsolete)
1. Use the captureStream method of HTMLVideoElement to capture the media stream of the video element that is playing
2. Push the recorded data chunks into a list
3. Convert the list into a Blob
4. Use axios to send the Blob to the backend
5. The backend receives the Blob, stores it on disk, and saves the path in the database
Disadvantages: This works in plain HTML/JavaScript, but the TypeScript DOM typings used with Vue 3 do not yet declare the HTMLVideoElement.captureStream method
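A minimal sketch of the steps above, assuming a webm recording; the `as any` cast is exactly the workaround the drawback describes, since `captureStream` is missing from the standard `HTMLVideoElement` typings (function names here are illustrative, not from the original project):

```typescript
// Sketch of Method 1: capture the stream of a playing <video> and record it.

// Steps 2-3: merge the recorded chunks into a single Blob
export function chunksToBlob(chunks: Blob[]): Blob {
  return new Blob(chunks, { type: "video/webm" });
}

// Step 1: captureStream is not declared on HTMLVideoElement in lib.dom.d.ts,
// so a cast is needed to call it from TypeScript/Vue 3
export function startRecording(
  video: HTMLVideoElement
): { recorder: MediaRecorder; chunks: Blob[] } {
  const stream: MediaStream = (video as any).captureStream();
  const chunks: Blob[] = [];
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) chunks.push(e.data); // step 2: collect chunks
  };
  recorder.start(1000); // emit a chunk roughly every second
  return { recorder, chunks };
}
```

Step 4 would then be an `axios.post` of `chunksToBlob(chunks)` (or of a FormData wrapping it), which the backend receives as an ordinary file upload.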
Method 2: HTMLCanvasElement (obsolete)
1. Play the video with the video component
2. Draw the video onto a canvas
3. Capture the canvas's media stream
4. Use axios to send the Blob to the backend
5. The backend receives the Blob, stores it on disk, and saves the path in the database
Disadvantages: Drawing the video onto a canvas and then capturing the canvas's media stream is only practical for short, seconds-long recordings
Method 3: JavaCV
1. The front end obtains the timestamp and task information and sends them to the backend
2. The backend uses JavaCV to pull the RTSP stream, read frames, and record the frames for the corresponding time range
Advantages: Suitable for storing long recordings
Code
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.nio.file.Paths;

@Service
public class SceneService {

    private volatile boolean stopFlag = false;
    private volatile String videoPath;
    private final SceneMapper sceneMapper;
    private FFmpegFrameGrabber grabber;
    private FFmpegFrameRecorder videoWriter;
    private Thread videoThread;

    @Autowired
    public SceneService(SceneMapper sceneMapper) {
        this.sceneMapper = sceneMapper;
    }

    public void recordVideo(String taskName) {
        // RTSP stream address
        String rtspUrl = "rtsp://your-username:your-password@127.0.0.1:8554/stream";
        // Where the recording is saved
        String rootDir = "E:\\ProgramSoftware\\java\\AIDetectCloudPlatform\\recordVideo";
        String fileName = taskName + ".mp4";
        this.videoPath = Paths.get(rootDir, fileName).toString();
        // Pull the RTSP stream with FFmpegFrameGrabber
        try {
            grabber = FFmpegFrameGrabber.createDefault(rtspUrl);
            // Use TCP transport for RTSP to avoid UDP packet loss
            grabber.setOption("rtsp_transport", "tcp");
            grabber.start();
            System.out.println("Started pulling the stream");
            // Set up the recorder with the source dimensions
            int width = grabber.getImageWidth();
            int height = grabber.getImageHeight();
            videoWriter = new FFmpegFrameRecorder(this.videoPath, width, height);
            videoWriter.setFormat("mp4");
            // Required before start(), otherwise recording audio frames fails
            // with "No audio output stream" (see issue 4 below)
            videoWriter.setAudioChannels(2);
            videoWriter.setFrameRate(grabber.getFrameRate());
            videoWriter.start();
            System.out.println("Started recording");
            // Run the grab/record loop on its own thread
            this.videoThread = new Thread(() -> {
                try {
                    while (!Thread.currentThread().isInterrupted() && !this.stopFlag) {
                        Frame frame = grabber.grab();
                        if (frame == null) {
                            System.out.println("Recording finished");
                            break;
                        }
                        long currentTimestamp = frame.timestamp;
                        System.out.printf("frame timestamp %s%n", currentTimestamp);
                        // Write the frame to the output file
                        videoWriter.record(frame);
                    }
                    videoWriter.stop();
                    grabber.stop();
                    this.stopFlag = false;
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
            this.videoThread.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void stopRecordVideo(String taskId) {
        this.stopFlag = true;
        // Interrupt the recording thread
        if (this.videoThread != null && this.videoThread.isAlive()) {
            System.out.println("Recording finished");
            this.videoThread.interrupt();
        }
        // Associate the recording with the task
        sceneMapper.collectTaskVideo(taskId, this.videoPath);
    }
}
Issues
1. Plain HTML/JavaScript exposes the captureStream method on the video element, but the TypeScript DOM typings used with Vue 3 do not yet declare a method for extracting the media stream directly from a video element.
2. Redrawing the video onto a canvas works because HTMLCanvasElement does declare a method for capturing its media stream, but this is only suitable for short recordings.
3. Error: java.lang.UnsatisfiedLinkError: Could not find jniavutil in class, module, and library paths.
Cause: Introducing only the javacv dependency is not enough; ffmpeg-platform must also be on the classpath.
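With Maven, the simplest fix is usually to depend on `javacv-platform`, which pulls in `ffmpeg-platform` (and thus the native `jniavutil` binaries) transitively; the version number below is illustrative, so check for the latest release:

```xml
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacv-platform</artifactId>
    <version>1.5.9</version>
</dependency>
```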
4. Error: No audio output stream (Is audioChannels > 0 and has start() been called?)
Cause: setAudioChannels was not called on the FFmpegFrameRecorder instance before start().