Integrating the Dahua and Hikvision SDKs: real-time preview and playback with JavaCV and a streaming media server

Recently I needed to integrate Dahua and Hikvision cameras and use their SDKs to implement login, OSD settings, real-time preview, playback, and similar functions. The other functions were straightforward, but real-time preview and playback involve a lot of moving parts, can be very frustrating for anyone working on camera integration, and there is very little material available on the subject. Having gotten it working, I am writing the results down here in the hope that they help others.

I won't go into the smaller functions (login, loading the SDK libraries, getting channel information, and so on); if you have questions about those, leave a comment and I will reply. This article focuses on preview and playback. The two actually work the same way, so I will use preview as the example. One point to note before we start: Hikvision's video callback stream can be used directly, but Dahua requires setting the format of the callback stream first, and the SDK provides a specific method for that.
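
Before looking at the callback itself, here is a rough sketch of how real play is started and the callback registered. CLIENT_RealPlayEx and CLIENT_SetRealDataCallBackEx come from the Dahua NetSDK; the dwFlag value, the HuaSdkMapCache.putCache call and the HuaSdkHandle constructor are assumptions based on my own wiring, so adapt them to your code and SDK version:

// Sketch only: start Dahua real play and register the real-data callback.
// loginHandle comes from the earlier CLIENT_Login call; netSdk is the NetSDKLib instance.
public NetSDKLib.LLong startRealPlay(NetSDKLib netSdk, NetSDKLib.LLong loginHandle,
                                     int channel, int streamType, String pushAddress) throws IOException {
    NetSDKLib.LLong realHandle = netSdk.CLIENT_RealPlayEx(loginHandle, channel, null, streamType);
    if (realHandle.longValue() != 0) {
        // The call that switches the Dahua callback stream to PS format is version-specific,
        // so it is omitted here; consult the SDK docs for your version.
        // dwFlag = 1 requests the raw stream data (check the constants in your SDK header).
        netSdk.CLIENT_SetRealDataCallBackEx(realHandle, HuaRealPlayCallBack.getInstance(), null, 1);
        // Cache the handle so the callback can find the handler that owns this stream
        // (putCache is assumed to mirror the getCache call used in the callback below).
        HuaSdkMapCache.putCache(realHandle, new HuaSdkHandle(pushAddress));
    }
    return realHandle;
}

Once real play is running, the SDK keeps invoking the registered callback with stream data. Here is the callback class: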

import com.netsdk.lib.NetSDKLib; // Dahua NetSDK JNA wrapper; adjust the package to match your SDK demo
import com.sun.jna.Pointer;

import java.io.IOException;

public class HuaRealPlayCallBack implements NetSDKLib.fRealDataCallBackEx {

    public HuaRealPlayCallBack() {
    }

    private static class CallBackHolder {
        private static HuaRealPlayCallBack instance = new HuaRealPlayCallBack();
    }

    public static HuaRealPlayCallBack getInstance() {
        return HuaRealPlayCallBack.CallBackHolder.instance;
    }

    @Override
    public void invoke(NetSDKLib.LLong lRealHandle, int dwDataType, Pointer pBuffer,
                       int dwBufSize, int param, Pointer dwUser) {
        // dwDataType 1001 indicates the callback stream that was configured as PS format
        if (dwDataType == 1001) {
            try {
                HuaSdkHandle huaSdkHandle = (HuaSdkHandle) HuaSdkMapCache.getCache(lRealHandle);
                if (huaSdkHandle!=null) {
                    huaSdkHandle.onMediaStream(pBuffer.getByteArray(0, dwBufSize), true);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
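
The HuaSdkMapCache.getCache call above looks up the handler that owns the current play handle; the handle is put into the map when playback starts. The class itself is trivial. A minimal sketch of what it can look like (the class name is from the post, the implementation here is my assumption):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Assumed implementation: play handle -> handler lookup used by the real-data callback.
public class HuaSdkMapCache {

    private static final Map<Long, Object> CACHE = new ConcurrentHashMap<>();

    public static void putCache(NetSDKLib.LLong handle, Object value) {
        CACHE.put(handle.longValue(), value);
    }

    public static Object getCache(NetSDKLib.LLong handle) {
        return CACHE.get(handle.longValue());
    }

    public static void removeCache(NetSDKLib.LLong handle) {
        CACHE.remove(handle.longValue());
    }
}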

Since I am using Dahua here, I check the format of the received callback stream; otherwise it would default to Dahua's private stream format. Inside the callback, the handler for the current playback is looked up from a map keyed by the play handle (the handle is put into the map when playback is started). The map's value type is a dedicated processing class, HuaSdkHandle: the callback hands the stream data to a method of this class, which writes the bytes into a pipe (a PipedOutputStream feeding a PipedInputStream). This makes it easy to pull the stream with FFmpegFrameGrabber, which takes the piped input stream as its input, and then push it back out with FFmpegFrameRecorder. The key code is as follows:

    FFmpegFrameGrabber grabber = null;
    FFmpegFrameRecorder recorder = null;
    PipedInputStream inputStream;
    PipedOutputStream outputStream;
    String pushAddress;


    /**
     * Asynchronously receives the raw real-time video stream data from the
     * Hikvision/Dahua/Uniview device SDK callback and writes it into the pipe.
     *
     * @param data raw stream bytes from the SDK callback
     * @param size number of valid bytes in data
     */
    public void push(byte[] data, int size) {

        try {
            outputStream.write(data, 0, size);
        } catch (IOException e) {
            e.printStackTrace();
        }

    }
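
One thing the snippet above does not show: the two piped streams have to be connected before push() writes anything, and PipedInputStream/PipedOutputStream are designed to be used by two different threads (the SDK callback thread writes, the grabber thread reads). A minimal sketch of the wiring, assuming it lives in the handler's constructor (the constructor and buffer size are my assumption, not from the original code):

    public HuaSdkHandle(String pushAddress) throws IOException {
        this.pushAddress = pushAddress;
        // Generous pipe buffer so the SDK callback thread is not blocked
        // while the grabber is still starting up (size is an assumption).
        this.inputStream = new PipedInputStream(1024 * 1024);
        this.outputStream = new PipedOutputStream(inputStream);
    }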



       grabber = new FFmpegFrameGrabber(inputStream, 0);
        grabber.setOption("rtsp_transport", "tcp");
        grabber.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
        grabber.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        grabber.setAudioStream(Integer.MIN_VALUE);
        grabber.setFormat("mpeg");
        long stime = System.currentTimeMillis();
        // Check whether the callback has produced any stream data yet,
        // to keep avformat_open_input from blocking on an empty pipe
        do {
            Thread.sleep(100);
            if (System.currentTimeMillis() - stime > 2000) {
                log.info("-----SDK回调无视频流产生------");
                return;
            }
        } while (inputStream.available() < 2048); // wait until enough data has been buffered

        // Only print error-level FFmpeg logs
        avutil.av_log_set_level(avutil.AV_LOG_ERROR);
        FFmpegLogCallback.set();

            grabber.start();
            log.info("--------开始推送视频流---------");
            recorder = new FFmpegFrameRecorder(pushAddress, grabber.getImageWidth(),
                    grabber.getImageHeight(), grabber.getAudioChannels());
            recorder.setInterleaved(true);
            // Quality parameter (constant rate factor)
            recorder.setVideoOption("crf", "28");
            // H.264 encoder
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setVideoBitrate(grabber.getVideoBitrate());
            // Mux into FLV format for RTMP
            recorder.setFormat("flv");
            // Video frame rate, keep at least 25
            recorder.setFrameRate(25);
            // Keyframe (GOP) interval, usually equal to the frame rate or twice the frame rate
            recorder.setGopSize(50);
            // yuv420p pixel format
            recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
            recorder.start();
            int count = 0;
            Frame frame;
            while (grabber.hasVideo() && (frame = grabber.grab()) != null) {
                count++;
                if (count % 100 == 0) {
                    log.info("推送视频帧次数:{}", count);
                }
                if (frame.samples != null) {
                    log.info("检测到音频");
                }
                recorder.record(frame);
            }
            if (grabber != null) {
                grabber.stop();
                grabber.release();
            }
            if (recorder != null) {
                recorder.stop();
                recorder.release();
            }
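
Note that grabber.grab() blocks until data is available, so the pull-and-push loop above must not run on the SDK callback thread; the piped streams also require the writer and the reader to be on different threads. A minimal sketch of kicking the loop off on its own thread (startPush() is a stand-in name for the method containing the code above; ExecutorService/Executors come from java.util.concurrent):

    // Run the blocking grab/record loop off the SDK callback thread.
    ExecutorService pushExecutor = Executors.newSingleThreadExecutor();
    pushExecutor.submit(() -> {
        try {
            huaSdkHandle.startPush();  // the grabber/recorder code shown above
        } catch (Exception e) {
            log.error("push thread exited unexpectedly", e);
        }
    });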

Here, pushAddress is the address of the streaming media server. I use ZLMediaKit as the streaming server; its documentation is worth a look, and deployment is very simple. I push over the RTMP protocol, and once the stream reaches the server, ZLMediaKit automatically makes it available in playback formats such as FLV, HLS (m3u8) and MP4; just use the playback address of the corresponding format. Playing one of those addresses shows the live picture in real time.
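
For reference, with a default ZLMediaKit deployment the push address and the generated playback addresses typically look like the following; the host, ports, app name ("live") and stream id ("camera1") are placeholders, and the exact URL patterns should be double-checked against the ZLMediaKit documentation:

    // RTMP push address (this is what pushAddress is set to for the recorder above)
    String pushAddress = "rtmp://192.168.1.100:1935/live/camera1";

    // Typical playback addresses that ZLMediaKit exposes for the same stream
    String httpFlvUrl = "http://192.168.1.100:80/live/camera1.live.flv";  // HTTP-FLV
    String hlsUrl     = "http://192.168.1.100:80/live/camera1/hls.m3u8";  // HLS (m3u8)
    String rtspUrl    = "rtsp://192.168.1.100:554/live/camera1";          // RTSP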

If you have any questions, feel free to contact me on QQ: 2921843100. That is the overall idea, and live preview works fine with it. Be aware that this approach does consume the server's network bandwidth, so it needs to be tuned for your specific scenario. There are still pitfalls here, so do dig into it further yourselves.

Source: blog.csdn.net/ygl_csdn/article/details/123181204