[C#] Use ffmpeg image2pipe to save pictures as mp4 videos


Requirements

Before getting started, let me describe my requirements and how ffmpeg is used to meet them, for reference only.

The requirements are:

  1. Save pictures as a video.
  2. The number of pictures is not fixed: each picture is a bitmap converted from frame data delivered by the upstream WebRTC connection, so as long as WebRTC is running there is a continuous stream of pictures (a conversion sketch follows this list).
  3. The interval between frames depends on the network conditions, so it is not a fixed time interval.
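
For context, each upstream frame has to be wrapped in a System.Drawing.Bitmap before it can be piped to ffmpeg. Below is a minimal sketch, assuming the WebRTC stack hands over raw 24-bit BGR pixel buffers; the actual pixel format depends on your WebRTC library, and the FrameConverter/ToBitmap names are hypothetical.

using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

public static class FrameConverter
{
    // Hypothetical helper: wraps a raw BGR24 buffer (width * height * 3 bytes)
    // in a System.Drawing.Bitmap. Adjust if your WebRTC library delivers another format.
    public static Bitmap ToBitmap(byte[] bgr24, int width, int height)
    {
        var bitmap = new Bitmap(width, height, PixelFormat.Format24bppRgb);
        var rect = new Rectangle(0, 0, width, height);
        BitmapData data = bitmap.LockBits(rect, ImageLockMode.WriteOnly, PixelFormat.Format24bppRgb);
        try
        {
            // Copy row by row because Bitmap rows are padded to a 4-byte boundary (Stride).
            int srcStride = width * 3;
            for (int y = 0; y < height; y++)
            {
                Marshal.Copy(bgr24, y * srcStride, data.Scan0 + y * data.Stride, srcStride);
            }
        }
        finally
        {
            bitmap.UnlockBits(data);
        }
        return bitmap;
    }
}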

Implementation

Before switching to native ffmpeg, I tried several third-party NuGet libraries: FFmpeg.AutoGen, Xabe.FFmpeg, and Accord.Video.FFMPEG. The first two either only support turning pictures that already exist in a folder into an mp4, or do not support setting the PTS of each frame, so the generated mp4 plays too fast. Accord.Video.FFMPEG was chosen at first because it meets all three requirements above. Unfortunately it has not been maintained for a long time: when the upstream FPS is above 15, the WriteVideoFrame method throws exceptions much more frequently, which leads to memory leaks and the loss of the current frame.

On top of that, the project targets .NET Framework 4.5.2 and will not be upgraded for a while, which rules out most NuGet libraries. In the end only native ffmpeg remained.
ffmpeg is just an exe; there is no official API to call, only a large number of command-line parameters, which is rather confusing at first. After reading the documentation, searching, and debugging, I found that the following parameters meet the requirements:

-f image2pipe -use_wallclock_as_timestamps 1 -i - -c:v libx264 -pix_fmt yuv420p -vsync passthrough -maxrate 5000k  -an -y 123.mp4

Here is a brief description of each parameter; a minimal end-to-end test in C# follows the list.

  • image2pipe: Use the image pipe demuxer; we can push picture data into the pipe and ffmpeg keeps appending it to the mp4 file. This covers requirements 1 and 2.
  • use_wallclock_as_timestamps 1: With this option enabled, ffmpeg uses the time at which it receives each picture as that frame's timestamp, so the generated MP4 plays back at normal speed. This covers requirement 3.
  • pix_fmt yuv420p: Set the pixel format; this fixes the problem of the generated video not playing in Windows Media Player.
  • -vsync passthrough: Can be understood as a dynamic frame rate; the frame rate of the output mp4 follows the upstream frame rate. The available modes are:
    • passthrough: Use each frame's original timestamp as-is.
    • cfr (1): Duplicate or drop frames to match the configured output frame rate (frames are duplicated when the upstream rate is lower than the output rate, and dropped when it is higher).
    • vfr (2): Similar to passthrough, but when two frames have the same timestamp, one of them is dropped.
    • drop: Similar to passthrough, except that the frame's original timestamp is discarded and a new timestamp matching the frame rate is generated.
    • auto (-1): The default behavior; chooses between cfr and vfr automatically.
  • maxrate: Set the maximum bit rate.
  • 123.mp4: The output file name or path; note that it must not contain spaces.
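
Before wrapping this in a service class, the pipe mechanism can be tried with a few lines of throwaway C#: start ffmpeg with the arguments above, write PNG-encoded frames to its standard input at irregular intervals, then close the stream so ffmpeg finalizes the file. This is only a sketch; the output name demo.mp4 and the synthetic colored frames are placeholders.

using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Threading;

class Image2PipeDemo
{
    static void Main()
    {
        var proc = new Process();
        proc.StartInfo.FileName = "ffmpeg.exe";
        proc.StartInfo.Arguments =
            "-f image2pipe -use_wallclock_as_timestamps 1 -i - " +
            "-c:v libx264 -pix_fmt yuv420p -vsync passthrough -maxrate 5000k -an -y demo.mp4";
        proc.StartInfo.UseShellExecute = false;
        proc.StartInfo.RedirectStandardInput = true;
        proc.Start();

        // Push ten synthetic PNG frames into the pipe at irregular intervals.
        var rand = new Random();
        for (int i = 0; i < 10; i++)
        {
            using (var bmp = new Bitmap(320, 240))
            {
                using (var g = Graphics.FromImage(bmp))
                {
                    g.Clear(Color.FromArgb(25 * i, 0, 255 - 25 * i));
                }
                using (var ms = new MemoryStream())
                {
                    // Encode to memory first, then copy to stdin (GDI+ prefers a seekable stream).
                    bmp.Save(ms, ImageFormat.Png);
                    ms.WriteTo(proc.StandardInput.BaseStream);
                }
            }
            // The wall-clock gap between writes becomes each frame's timestamp.
            Thread.Sleep(100 + rand.Next(300));
        }

        // EOF on stdin tells ffmpeg to finish encoding and finalize demo.mp4.
        proc.StandardInput.Close();
        proc.WaitForExit();
    }
}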

The final C# code is as follows; we use the Process class to start ffmpeg.exe.

using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

// FrameInfo, Log and CommonFunctions are app-specific types from the original project.
public class FfmpegToVideoService
{
    private bool _isRunning = false;
    private int _fps; // left over from earlier experiments with a fixed frame rate (-r {_fps})
    private readonly Process _proc;

    /// <summary>
    /// Saves Bitmaps as an MP4 file.
    /// </summary>
    /// <param name="filePath">Path to save the mp4 to, e.g. D:\\a\b\123.mp4</param>
    /// <param name="maxBitRate">Maximum bit rate, in kbit/s</param>
    public FfmpegToVideoService(string filePath, int maxBitRate = 5000)
    {
        var formattedPath = Path.GetFullPath(filePath);
        _proc = new Process();
        // Alternatives tried earlier:
        //   -pix_fmt yuv420p -movflags +faststart -r {30} -i pipe:.bmp -r {_fps} -timecode 00:00:00.000
        //   -vsync cfr: duplicate/drop frames automatically; vfr: drop frames with duplicate timestamps;
        //   passthrough: keep frames even when timestamps repeat
        //   -r 30: both input and output frame rate are 30
        _proc.StartInfo.FileName = @"ffmpeg.exe";
        _proc.StartInfo.Arguments = $"-f image2pipe -use_wallclock_as_timestamps 1 -i - -c:v libx264 -pix_fmt yuv420p -vsync passthrough -maxrate {maxBitRate}k -an -y {formattedPath}";
        // Working directory where ffmpeg.exe lives (app-specific helper).
        _proc.StartInfo.WorkingDirectory = CommonFunctions.BasePath;
        _proc.StartInfo.UseShellExecute = false;
        _proc.StartInfo.RedirectStandardInput = true;
        _proc.StartInfo.RedirectStandardOutput = true;
        _proc.Start();
    }

    // Write the Bitmap to ffmpeg's stdin pipe as a PNG.
    private void SendToPipe(Bitmap bitmap)
    {
        if (_proc.StartInfo.RedirectStandardInput)
        {
            using (MemoryStream ms = new MemoryStream())
            {
                bitmap.Save(ms, ImageFormat.Png);
                ms.WriteTo(_proc.StandardInput.BaseStream);
            }
        }
    }

    /// <summary>
    /// Starts the service.
    /// </summary>
    public void StartAsync()
    {
        _isRunning = true;
    }

    /// <summary>
    /// Stops the service.
    /// </summary>
    public void Stop()
    {
        _isRunning = false;
        try
        {
            // Flipping these flags stops SendToPipe from writing any further frames;
            // they have no effect on the already-started process itself.
            _proc.StartInfo.RedirectStandardInput = false;
            _proc.StartInfo.RedirectStandardOutput = false;
            // Closing stdin sends EOF, which tells ffmpeg to finish encoding and finalize the mp4.
            _proc.StandardInput.Close();
            _proc.StandardOutput.Close();
            // Wait for ffmpeg to exit so the file is fully written before releasing the Process object.
            _proc.WaitForExit();
            _proc.Close();
        }
        catch (Exception ex)
        {
            Log.Error(ex, "Failed to stop ffmpeg");
        }
    }

    /// <summary>
    /// Adds a frame.
    /// </summary>
    /// <param name="item"></param>
    public void Add(FrameInfo item)
    {
        if (_isRunning)
        {
            SendToPipe(item.Bitmap);
        }
    }
}
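
Finally, a hypothetical usage sketch. FrameInfo, Log, and CommonFunctions come from the original project and are not shown here; the sketch assumes FrameInfo simply exposes the Bitmap to be written.

// Hypothetical caller; assumes FrameInfo has a Bitmap property.
var service = new FfmpegToVideoService(@"D:\videos\123.mp4", maxBitRate: 5000);
service.StartAsync();

// Wherever the WebRTC callback delivers a decoded frame:
// service.Add(new FrameInfo { Bitmap = bitmap });

// When the session ends, Stop() closes the pipe so ffmpeg can finalize the MP4.
service.Stop();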

References

  1. https://trac.ffmpeg.org/wiki/Slideshow
  2. https://ffmpeg.org/ffmpeg.html#filter_005foption
  3. https://stackoverflow.com/questions/60977555/adding-current-time-as-timestamp-in-h-264-raw-stream-with-few-frames

Original article: https://blog.csdn.net/catshitone/article/details/126930470