Playing RTSP and RTMP streams on the Linux platform under Unity3D

Background

Although the Windows platform has many advantages, development on the Linux platform continues unabated, especially in traditional industries. However, the Linux ecosystem still lags behind, particularly for commonly used components. This article describes how to implement RTSP and RTMP live playback under Unity, based on our existing RTSP and RTMP playback modules for the Linux platform.

Technical implementation

There is actually little to introduce at the Unity level. As on the Windows, Android, and iOS platforms, the native playback module is invoked, the decoded data is delivered via callback, and the frames are drawn in Unity. The main technical difficulty lies in the native layer: pulling the stream, decoding, and calling back the data.

As a first demonstration: a stopwatch timer window on a Windows machine is captured, encoded, packaged, and pushed to an RTMP server; the Unity3D RTMP player on the Linux platform then pulls and plays the stream, with overall latency at the millisecond level.

On the Linux platform, the data we call back is YUV, i.e. NT_SP_E_VIDEO_FRAME_FROMAT_I420:

/* Video frame image formats */
public enum NT_SP_E_VIDEO_FRAME_FORMAT : uint
{
    NT_SP_E_VIDEO_FRAME_FORMAT_RGB32 = 1, // 32-bit RGB: r, g, b take one byte each, the fourth byte is reserved; memory byte order is bb gg rr xx to match Windows bitmaps (read as a little-endian DWORD, the high byte is xx, then rr, gg, bb)
    NT_SP_E_VIDEO_FRAME_FORMAT_ARGB = 2, // 32-bit ARGB: memory byte order is bb gg rr aa, matching Windows bitmaps
    NT_SP_E_VIDEO_FRAME_FROMAT_I420 = 3, // YUV420: the three components are stored in three separate planes
}
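As a quick size check: for I420, a frame of width w and height h takes w*h bytes for the Y plane plus ((w+1)/2) * ((h+1)/2) bytes each for U and V, about 1.5 bytes per pixel; a 1920x1080 frame therefore occupies 1920*1080 + 2*960*540 = 3,110,400 bytes.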

Before starting playback, set the callback:

//video frame callback (YUV/RGB)
videoctrl[sel].video_frame_call_back_ = new SP_SDKVideoFrameCallBack(NT_SP_SetVideoFrameCallBack);
NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(videoctrl[sel].player_handle_, (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FROMAT_I420, window_handle_, videoctrl[sel].video_frame_call_back_);

Video frame structure:

/* Video frame structure. */
[StructLayoutAttribute(LayoutKind.Sequential)]
public struct NT_SP_VideoFrame
{
    public Int32 format_;  // image format, see NT_SP_E_VIDEO_FRAME_FORMAT
    public Int32 width_;   // image width
    public Int32 height_;  // image height

    public Int64 timestamp_; // timestamp in ms; usually 0 and unused

    // the actual image data: ARGB and RGB32 use only the first plane, I420 uses the first three
    public IntPtr plane0_;
    public IntPtr plane1_;
    public IntPtr plane2_;
    public IntPtr plane3_;

    // bytes per row of each plane; for ARGB and RGB32 this must be width_*4 to stay compatible with Windows bitmaps
    // for I420, stride0_ is the Y stride, stride1_ the U stride, stride2_ the V stride
    public Int32 stride0_;
    public Int32 stride1_;
    public Int32 stride2_;
    public Int32 stride3_;
}

Specific callback processing:

private void SDKVideoFrameCallBack(UInt32 status, IntPtr frame, int sel)
{
    //the callback frame arrives here; marshal the native struct and copy it out
    NT_SP_VideoFrame video_frame = (NT_SP_VideoFrame)Marshal.PtrToStructure(frame, typeof(NT_SP_VideoFrame));

    VideoFrame u3d_frame = new VideoFrame();

    u3d_frame.width_  = video_frame.width_;
    u3d_frame.height_ = video_frame.height_;

    u3d_frame.timestamp_ = (UInt64)video_frame.timestamp_;

    int d_y_stride = video_frame.width_;
    int d_u_stride = (video_frame.width_ + 1) / 2;
    int d_v_stride = d_u_stride;

    int d_y_size = d_y_stride * video_frame.height_;
    int d_u_size = d_u_stride * ((video_frame.height_ + 1) / 2);
    int d_v_size = d_u_size;

    int u_v_height = ((u3d_frame.height_ + 1) / 2);

    u3d_frame.y_stride_ = d_y_stride;
    u3d_frame.u_stride_ = d_u_stride;
    u3d_frame.v_stride_ = d_v_stride;

    u3d_frame.y_data_ = new byte[d_y_size];
    u3d_frame.u_data_ = new byte[d_u_size];
    u3d_frame.v_data_ = new byte[d_v_size];


    CopyFramePlane(u3d_frame.y_data_, d_y_stride,
        video_frame.plane0_, video_frame.stride0_, u3d_frame.height_);

    CopyFramePlane(u3d_frame.u_data_, d_u_stride,
       video_frame.plane1_, video_frame.stride1_, u_v_height);

    CopyFramePlane(u3d_frame.v_data_, d_v_stride,
       video_frame.plane2_, video_frame.stride2_, u_v_height);

    lock (videoctrl[sel].frame_lock_ )
    {
        videoctrl[sel].cur_video_frame_ = u3d_frame;
    }
}
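CopyFramePlane itself is not listed in the original post. A minimal sketch, assuming it simply copies one plane row by row from the native buffer into the tightly packed managed array (source rows may carry alignment padding, so the source stride can exceed the destination stride):

private void CopyFramePlane(byte[] dst, int dst_stride, IntPtr src, int src_stride, int height)
{
    // Copy only the payload bytes of each row; the destination is tightly
    // packed while the source rows may be padded (src_stride >= dst_stride)
    int copy_bytes = Math.Min(dst_stride, src_stride);

    for (int row = 0; row < height; ++row)
    {
        Marshal.Copy(src, dst, row * dst_stride, copy_bytes);
        src = new IntPtr(src.ToInt64() + src_stride);
    }
}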

After the Unity layer gets the video frame, it refreshes the display:

private void UpdateProc(int sel)
{
    VideoFrame video_frame = null;

    lock (videoctrl[sel].frame_lock_)
    {
        video_frame = videoctrl[sel].cur_video_frame_;

        videoctrl[sel].cur_video_frame_ = null;
    }

    if ( video_frame == null )
        return;

    if (!videoctrl[sel].is_need_get_frame_)
        return;

    if (videoctrl[sel].player_handle_ == IntPtr.Zero )
        return;

    if ( !videoctrl[sel].is_need_init_texture_)
    {
        if (  video_frame.width_ != videoctrl[sel].video_width_
            || video_frame.height_ != videoctrl[sel].video_height_
            || video_frame.y_stride_ != videoctrl[sel].y_row_bytes_
            || video_frame.u_stride_ != videoctrl[sel].u_row_bytes_
            || video_frame.v_stride_ != videoctrl[sel].v_row_bytes_ )
        {
            videoctrl[sel].is_need_init_texture_ = true;
        }
    }

    if (videoctrl[sel].is_need_init_texture_)
    {
        if (InitYUVTexture(video_frame, sel))
        {
            videoctrl[sel].is_need_init_texture_ = false;
        }
    }

    UpdateYUVTexture(video_frame, sel);
}
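InitYUVTexture is not listed either. A plausible sketch, assuming one single-channel Alpha8 texture per plane, sized by its stride, with the geometry cached so the resolution-change check in UpdateProc works:

private bool InitYUVTexture(VideoFrame video_frame, int sel)
{
    // Cache the geometry so UpdateProc can detect resolution/stride changes
    videoctrl[sel].video_width_  = video_frame.width_;
    videoctrl[sel].video_height_ = video_frame.height_;
    videoctrl[sel].y_row_bytes_  = video_frame.y_stride_;
    videoctrl[sel].u_row_bytes_  = video_frame.u_stride_;
    videoctrl[sel].v_row_bytes_  = video_frame.v_stride_;

    // One 8-bit texture per plane; U and V are half the Y size in each dimension
    videoctrl[sel].yTexture_ = new Texture2D(video_frame.y_stride_, video_frame.height_, TextureFormat.Alpha8, false);
    videoctrl[sel].uTexture_ = new Texture2D(video_frame.u_stride_, (video_frame.height_ + 1) / 2, TextureFormat.Alpha8, false);
    videoctrl[sel].vTexture_ = new Texture2D(video_frame.v_stride_, (video_frame.height_ + 1) / 2, TextureFormat.Alpha8, false);

    return videoctrl[sel].yTexture_ != null && videoctrl[sel].uTexture_ != null && videoctrl[sel].vTexture_ != null;
}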

The related UpdateYUVTexture implementation:

private void UpdateYUVTexture(VideoFrame video_frame, int sel)
{
    if (video_frame.y_data_ == null || video_frame.u_data_ == null || video_frame.v_data_ == null)
    {
        Debug.Log("video frame with null..");
        return;
    }

    if (videoctrl[sel].yTexture_ != null)
    {
        videoctrl[sel].yTexture_.LoadRawTextureData(video_frame.y_data_);
        videoctrl[sel].yTexture_.Apply();
    }

    if (videoctrl[sel].uTexture_ != null)
    {
        videoctrl[sel].uTexture_.LoadRawTextureData(video_frame.u_data_);
        videoctrl[sel].uTexture_.Apply();
    }

    if (videoctrl[sel].vTexture_ != null)
    {
        videoctrl[sel].vTexture_.LoadRawTextureData(video_frame.v_data_);
        videoctrl[sel].vTexture_.Apply();
    }
}
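The three plane textures still have to reach a material whose fragment shader samples them and converts YUV to RGB. A hedged sketch of the binding step; the _YTex/_UTex/_VTex property names are illustrative and must match whatever the conversion shader declares:

// Bind the plane textures to the YUV->RGB conversion material once after
// the textures have been (re)created; property names are illustrative
Material yuv_material = GetComponent<Renderer>().material;
yuv_material.SetTexture("_YTex", videoctrl[sel].yTexture_);
yuv_material.SetTexture("_UTex", videoctrl[sel].uTexture_);
yuv_material.SetTexture("_VTex", videoctrl[sel].vTexture_);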

The related Player wrapper:

/*
 * SmartPlayerLinuxMono.cs
 *
 * WebSite: https://daniusdk.com
 * Github: https://github.com/daniulive/SmarterStreaming
 */

public void Play(int sel)
{
    if (videoctrl[sel].is_running)
    {
        Debug.Log("已经在播放..");
        return;
    }

    lock (videoctrl[sel].frame_lock_)
    {
        videoctrl[sel].cur_video_frame_ = null;
    }

    OpenPlayer(sel);

    if (videoctrl[sel].player_handle_ == IntPtr.Zero)
        return;

    //set the playback URL
    NTSmartPlayerSDK.NT_SP_SetURL(videoctrl[sel].player_handle_, videoctrl[sel].videoUrl);

    /* ++ pre-playback parameter configuration can be added here ++ */

    int play_buffer_time_ = 0;
    NTSmartPlayerSDK.NT_SP_SetBuffer(videoctrl[sel].player_handle_, play_buffer_time_);                 //set the buffer time

    int is_using_tcp = 0;        //TCP mode

    NTSmartPlayerSDK.NT_SP_SetRTSPTcpMode(videoctrl[sel].player_handle_, is_using_tcp);

    int timeout = 10;
    NTSmartPlayerSDK.NT_SP_SetRtspTimeout(videoctrl[sel].player_handle_, timeout);

    int is_auto_switch_tcp_udp = 1;
    NTSmartPlayerSDK.NT_SP_SetRtspAutoSwitchTcpUdp(videoctrl[sel].player_handle_, is_auto_switch_tcp_udp);

    Boolean is_mute_ = false;
    NTSmartPlayerSDK.NT_SP_SetMute(videoctrl[sel].player_handle_, is_mute_ ? 1 : 0);                    //whether to start playback muted

    int is_fast_startup = 1;
    NTSmartPlayerSDK.NT_SP_SetFastStartup(videoctrl[sel].player_handle_, is_fast_startup);              //enable fast startup mode

    Boolean is_low_latency_ = false;
    NTSmartPlayerSDK.NT_SP_SetLowLatencyMode(videoctrl[sel].player_handle_, is_low_latency_ ? 1 : 0);    //enable low-latency mode or not

    //set the rotation angle (only 0, 90, 180 and 270 degrees are valid; other values are ignored)
    int rotate_degrees = 0;
    NTSmartPlayerSDK.NT_SP_SetRotation(videoctrl[sel].player_handle_, rotate_degrees);

    int volume = 100;
    NTSmartPlayerSDK.NT_SP_SetAudioVolume(videoctrl[sel].player_handle_, volume);   //set the playback volume, range [0, 100]: 0 is mute, 100 is the maximum, default is 100


    // configure download speed reporting
    int is_report = 0;
    int report_interval = 1;
    NTSmartPlayerSDK.NT_SP_SetReportDownloadSpeed(videoctrl[sel].player_handle_, is_report, report_interval);
    /* -- pre-playback parameter configuration ends here -- */

    //video frame callback (YUV/RGB)
    videoctrl[sel].video_frame_call_back_ = new SP_SDKVideoFrameCallBack(NT_SP_SetVideoFrameCallBack);
    NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(videoctrl[sel].player_handle_, (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FROMAT_I420, window_handle_, videoctrl[sel].video_frame_call_back_);

    UInt32 flag = NTSmartPlayerSDK.NT_SP_StartPlay(videoctrl[sel].player_handle_);

    if (flag == DANIULIVE_RETURN_OK)
    {
        videoctrl[sel].is_need_get_frame_ = true;
        Debug.Log("播放成功");
    }
    else
    {
        videoctrl[sel].is_need_get_frame_ = false;
        Debug.LogError("播放失败");
    }

    videoctrl[sel].is_running = true;
}

The OpenPlayer implementation called above:

OpenPlayer mainly calls the underlying NT_SP_Open() interface to obtain the handle of the playback instance, and then sets the event callback.

private void OpenPlayer(int sel)
{
    window_handle_ = IntPtr.Zero;

    if (videoctrl[sel].player_handle_ == IntPtr.Zero)
    {
        videoctrl[sel].player_handle_ = new IntPtr();
        UInt32 ret_open = NTSmartPlayerSDK.NT_SP_Open(out videoctrl[sel].player_handle_, window_handle_, 0, IntPtr.Zero);
        if (ret_open != 0)
        {
            videoctrl[sel].player_handle_ = IntPtr.Zero;
            Debug.LogError("调用NT_SP_Open失败..");
            return;
        }
    }

    videoctrl[sel].event_call_back_ = new SP_SDKEventCallBack(NT_SP_SDKEventCallBack);
    NTSmartPlayerSDK.NT_SP_SetEventCallBack(videoctrl[sel].player_handle_, window_handle_, videoctrl[sel].event_call_back_);

    videoctrl[sel].sdk_video_frame_call_back_ = new VideoControl.SetVideoFrameCallBack(SDKVideoFrameCallBack);
    videoctrl[sel].sdk_event_call_back_ = new VideoControl.SetEventCallBack(SDKEventCallBack);
}
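Note that the delegates are kept in fields of videoctrl[sel] rather than passed as temporaries: while native code holds the function pointer, the managed delegate must stay reachable, otherwise the garbage collector may reclaim it and the next callback from the native layer will crash.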

Stopping playback:

private void ClosePlayer(int sel)
{
    videoctrl[sel].is_need_get_frame_ = false;
    videoctrl[sel].is_need_init_texture_ = false;

    if (videoctrl[sel].player_handle_ == IntPtr.Zero)
    {
        return;
    }

    UInt32 flag = NTSmartPlayerSDK.NT_SP_StopPlay(videoctrl[sel].player_handle_);
    if (flag == DANIULIVE_RETURN_OK)
    {
        Debug.Log("停止成功");
    }
    else
    {
        Debug.LogError("停止失败");
    }

    videoctrl[sel].player_handle_ = IntPtr.Zero;

    videoctrl[sel].is_running = false;
}
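ClosePlayer above only stops playback before dropping the handle. Assuming the SDK pairs NT_SP_Open with an NT_SP_Close export (an assumption to verify against the SDK header), fully releasing the instance would add:

// Assumed counterpart of NT_SP_Open: release the native instance after
// stopping playback, then invalidate the cached handle
NTSmartPlayerSDK.NT_SP_Close(videoctrl[sel].player_handle_);
videoctrl[sel].player_handle_ = IntPtr.Zero;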

Summary

Live playback in a Unity environment is mostly done on the Windows or Android platforms; very little of it runs on Linux. On the one hand, demand on the Linux platform itself is small; on the other hand, there are few reference examples for it. In fact, once the core functionality has been implemented on Windows or Android, porting it to Linux is quite convenient.

Under Unity, in simple terms, the flow is pull the stream, decode, call back the data, and draw it in the upper layer; it is not that complicated. Care is needed with how the DllImport declarations are written, how the underlying C++ structures and enumerations are converted to C#, and the compatibility of the Unity3D version with Linux. For developers familiar with C#, there is not much technical difficulty.
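As a sketch of the DllImport point: the native library name is resolved per platform, so the same logical name maps to a .so on Linux. The library name and calling convention below are assumptions for illustration; the signatures mirror the calls used above:

using System;
using System.Runtime.InteropServices;

public class NTSmartPlayerSDK
{
    // "SmartPlayerSDK" is an assumed logical name: Mono resolves it to
    // libSmartPlayerSDK.so on Linux and SmartPlayerSDK.dll on Windows
    [DllImport("SmartPlayerSDK", CallingConvention = CallingConvention.Cdecl)]
    public static extern UInt32 NT_SP_Open(out IntPtr handle, IntPtr hwnd, UInt32 flag, IntPtr pReserve);

    [DllImport("SmartPlayerSDK", CallingConvention = CallingConvention.Cdecl)]
    public static extern UInt32 NT_SP_StartPlay(IntPtr handle);

    [DllImport("SmartPlayerSDK", CallingConvention = CallingConvention.Cdecl)]
    public static extern UInt32 NT_SP_StopPlay(IntPtr handle);
}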
