Analysis of Overly Fast Playback of Recordings Caused by Quality Control

Author: 李国帅

QQ: 9611153  WeChat: lgs9611153

Date: 2014/2/26 13:19:55

Background:

These notes date from several years ago. When playing back recorded files with DirectShow, playback sometimes ran too fast. One of the causes was the quality-control mechanism.

 

Problem analysis:

 

When a file is played at high speed, the video cannot be displayed normally.

During remote playback the fast-forward symptom appears: the actual playback time is much shorter than the intended playback time.

Sleep cannot be used to pause playback; the video still would not display correctly.

An EC_QUALITY_CHANGE event may also occur. Once it appears, playback runs fast until the end of the file and the image stops updating.

 

[1520] CMyPlayer::OnNotifyGraph eventCode=EC_QUALITY_CHANGE 11

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=39 True=6,rtBeginPos=1199888 m_rtlastBeginPos=800000 rtEndPos=1999888,TimeStamp last=39600,now=46800

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=80 True=6,rtBeginPos=1999888 m_rtlastBeginPos=1199888 rtEndPos=2799888,TimeStamp last=46800,now=54000

[1520] CVideoOutPin::FillBuffer:Last WasteInMS Need=80 True=6,rtBeginPos=2799888 m_rtlastBeginPos=1999888 rtEndPos=3200000,TimeStamp last=54000,now=57601

The elapsed time in every line above is 6 ms. For data received over the network, dropping one or two frames is normal, but stopping playback entirely because of that is unreasonable.

 

Therefore, during playback it is best to play at the stream's nominal frame rate, rather than inferring a frame rate and playing by that: the latter is too complex and hard to control.
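Pacing strictly by a nominal frame rate can be sketched as follows (a minimal, platform-neutral sketch; the function names and the millisecond units are illustrative, not the original player's code):

```cpp
#include <cassert>
#include <cstdint>

// Presentation time (in milliseconds) of frame n when pacing strictly by a
// fixed nominal frame rate, instead of inferring a rate from arrival times.
int64_t FramePresentationMs(int64_t startMs, int frameIndex, int fps)
{
    // Computed as n * 1000 / fps (rather than n * (1000 / fps))
    // to avoid accumulating integer-rounding drift.
    return startMs + (int64_t)frameIndex * 1000 / fps;
}

// How long the renderer should wait before showing frame n.
// A negative result means the frame is already late (it may be dropped).
int64_t WaitBeforeFrameMs(int64_t nowMs, int64_t startMs, int frameIndex, int fps)
{
    return FramePresentationMs(startMs, frameIndex, fps) - nowMs;
}
```

With this scheme a dropped or late frame only shifts one frame's wait; it never accelerates the whole remaining stream.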

 


 

EC_QUALITY_CHANGE

(The filter graph drops frames for quality control.)

The graph is dropping samples, for quality control.

A filter sends this event if it drops samples in response to a quality control message.

It sends the event only when it adjusts the quality level, not for each sample that it drops.

 

This message means normally that the renderer is dropping samples (i.e. they are arriving too late, meaning the CPU is overloaded).

 

Could raising the display quality of the remote-desktop renderer help?

Do we need to lower the resolution or frame rate? Using CIF had no effect; still only a single frame is displayed.

When this message occurs, does the graph need to be rebuilt?

Is the frame dropping happening in the decoder filter or in the renderer filter? Insert an intermediate filter to find out.

 

Quality control: excerpt from the Microsoft DirectX 9.0 CHM documentation

http://www.directshows.ru/htm/defaultqualitycontrol.htm

Quality-Control Management

This is preliminary documentation and subject to change.

 

Quality control is a mechanism for adjusting the rate of data flow through the filter graph in response to run-time performance.

If a renderer filter is receiving too much data or too little data, it can send a quality message.

The quality message requests an adjustment in the data rate.

By default, quality messages travel upstream from the renderer until they reach a filter that can respond (if any).

An application can also implement a custom quality manager. In that case, the renderer passes quality messages directly to the application's quality manager.

--

Default Quality Control

This is preliminary documentation and subject to change.

 

The DirectShow Base Classes implement some default behaviors for video quality control.

 

Quality messages start at the renderer. The base class for video renderers is CBaseVideoRenderer, which has the following behavior:

 

When the video renderer receives a sample, it compares the time stamp on the sample with the current reference time.

The video renderer generates a quality message. In the base class, the Proportion member of the quality message is limited to a range of 500 (50%) to 2000 (200%). Values outside this range could result in abrupt quality changes.

By default, the video renderer sends the quality message to the upstream output pin (the pin connected to its input pin). Applications can override this behavior by calling the SetSink method.

What happens next depends on the upstream filter. Typically, this is a transform filter. The base class for transform filters is CTransformFilter, which uses the CTransformInputPin and CTransformOutputPin classes to implement input and output pins. Together, these classes have the following behavior:

 

The CTransformOutputPin::Notify method calls CTransformFilter::AlterQuality, a private method on the filter base class.

Derived filters can override AlterQuality to handle the quality message. By default, AlterQuality ignores the quality message.

If AlterQuality does not handle the quality message, the output pin calls CBaseInputPin::PassNotify, a private method on the filter's input pin.

PassNotify passes the quality message to the appropriate place—the next upstream output pin, or a custom quality manager.

Assuming that no transform filter handles the quality message, the message eventually reaches the output pin on the source filter. In the base classes, CBasePin::Notify returns E_NOTIMPL. How a particular source filter handles quality messages depends on the nature of the source. Some sources, such as live video capture, cannot perform meaningful quality control. Other sources can adjust the rate at which they deliver samples.
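The upstream propagation described above can be modeled as a small, platform-neutral C++ sketch (the struct and function names are illustrative stand-ins, not DirectShow types):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Simplified model of how a quality message travels upstream:
// each filter either handles the message (AlterQuality) or passes it on.
struct FilterNode
{
    std::string name;
    bool handlesQuality;  // true if this filter's AlterQuality consumes the message
};

// Returns the name of the filter that handled the message, or "source"
// if it reached the source filter unhandled (CBasePin::Notify -> E_NOTIMPL).
std::string PropagateQualityMessage(const std::vector<FilterNode> &upstreamChain)
{
    for (const FilterNode &f : upstreamChain)
    {
        if (f.handlesQuality)
            return f.name;  // AlterQuality consumed the message
        // otherwise PassNotify forwards it to the next upstream output pin
    }
    return "source";        // nobody handled it; the source ignores it
}
```

This mirrors the base-class behavior: a typical transform filter ignores the message, so it normally ends up at the source, which may or may not be able to adjust its delivery rate.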

 

The following diagram illustrates the default behavior.

 

The base video renderer implements IQualityControl::Notify, which means you can pass quality messages to the renderer itself. If you set the Proportion member to a value less than 1000, the video renderer inserts a wait period between each frame that it renders, in effect slowing down the renderer. (You might do this to reduce system usage, for example.) For more information, see CBaseVideoRenderer::ThrottleWait. Setting the Proportion member to a value greater than 1000 has no effect.
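The clamping of Proportion and the slow-down effect can be sketched in portable C++ (a simplified model of the shape of the behavior; the exact wait computation inside CBaseVideoRenderer::ThrottleWait differs):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// The base video renderer limits Quality.Proportion to 500..2000
// (50%..200%); values outside that range are clamped.
int ClampProportion(int proportion)
{
    return std::min(2000, std::max(500, proportion));
}

// Sketch of the throttling effect: a Proportion below 1000 makes the
// renderer insert a wait between frames; 1000 or above adds no wait.
// The formula here is illustrative only.
int64_t ThrottleWaitMs(int proportion, int64_t frameIntervalMs)
{
    proportion = ClampProportion(proportion);
    if (proportion >= 1000)
        return 0;  // values of 1000 or more have no throttling effect
    // Slow down in proportion: e.g. 500 (50%) doubles the effective interval.
    return frameIntervalMs * (1000 - proportion) / proportion;
}
```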

 

 

----

EC_QUALITY_CHANGE: material collected from the web

 

Windows 7 - Intercept EC_QUALITY_CHANGE event

Asked by fungi821 on 12-Jul-07 01:06 AM

Hi,

The documentation says the EC_QUALITY_CHANGE event originates from the Video Renderer. I have a situation where, whenever the FGM receives such an event, the decoder drops frames constantly.

If I want to intercept this event so that it won't be sent to the decoder, how should I achieve this? Write a trans-in-place filter between the renderer and the decoder? Which function do I need to override?

thanks!

 

In fact, although the event message and the dropped frames occur at the same time, they are not directly cause and effect. The quality message that causes the decoder to drop frames is a signal sent directly from the VR to the decoder's Notify function. You can trap this, but if you just want to avoid dropping frames you are better off turning off the graph clock (query the graph itself for IMediaFilter and call SetSyncSource(NULL)).

--------------- MSDN: reference clocks

http://msdn.microsoft.com/en-us/library/windows/desktop/dd407202(v=vs.85).aspx

 

The IBaseFilter interface inherits from IMediaFilter.

 

IMediaFilter::SetSyncSource method

HRESULT SetSyncSource(  [in]  IReferenceClock *pClock);

pClock [in]

Pointer to the clock's IReferenceClock interface, or NULL. If this parameter is NULL, the filter graph does not use a reference clock, and all filters run as quickly as possible.

 


IGraphBuilder *pGraph = NULL;
IReferenceClock *pClock = NULL;

HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                              IID_IGraphBuilder, (void **)&pGraph);

// Build the graph.
hr = pGraph->RenderFile(L"C:\\Example.avi", 0);

// Create your clock. (CreateMyPrivateClock is an application-defined factory.)
hr = CreateMyPrivateClock(&pClock);
if (SUCCEEDED(hr))
{
    // Set the graph clock.
    IMediaFilter *pMediaFilter = NULL;
    pGraph->QueryInterface(IID_IMediaFilter, (void **)&pMediaFilter);
    pMediaFilter->SetSyncSource(pClock);
    pClock->Release();
    pMediaFilter->Release();
}

 

You can also set the filter graph to run with no clock, by calling SetSyncSource with the value NULL. If there is no clock, the graph runs as quickly as possible. With no clock, renderer filters do not wait for a sample's presentation time. Instead, they render each sample as soon as it arrives. Setting the graph to run without a clock is useful if you want to process data quickly, rather than previewing it in real time.
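The effect of removing the clock reduces to one line of logic (an illustrative platform-neutral sketch, not DirectShow code):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// With a reference clock, the renderer waits until the sample's presentation
// time; with no clock (SetSyncSource(NULL)), it renders every sample
// immediately upon arrival.
int64_t RenderWaitMs(bool hasClock, int64_t nowMs, int64_t presentationMs)
{
    if (!hasClock)
        return 0;  // no clock: render as soon as the sample arrives
    return std::max<int64_t>(0, presentationMs - nowMs);
}
```

This is also why removing the clock sidesteps the quality-control machinery: with no clock, no sample ever counts as "late", so the renderer has no reason to send a quality message.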

 

----------------------

sathaadj's question

 

Is there a possibility that the MJPEG Decompressor can generate the EC_QUALITY_CHANGE event on its own? I considered that it could be derived from the CVideoTransformFilter base class code, which by default contains quality-management code. According to this code (vtrans.cpp), a filter that derives from it should call ShouldSkipFrame() every time before sending a frame to the output pin.

The ShouldSkipFrame() code in vtrans.cpp is written in such a way that it will not do quality control when there are no time stamps set on the incoming samples, so I decided to put a transform filter between the Camera and the MJPEG Decompressor which removes the timestamps from the IMediaSample by calling SetTime(NULL, NULL).

 

BOOL CVideoTransformFilter::ShouldSkipFrame(IMediaSample *pIn)
{
    ...
    NotifyEvent(EC_QUALITY_CHANGE, 0, 0);
    ...
    return m_bSkipping;
}

 

If I set the timestamps to NULL, then GetTime should return failure, and hence there should not be any frame drops by the MJPEG Decompressor filter. But I still have the same problem. I also coded an AlterQuality() function for my new filter between the camera and the MJPG Decompressor, but it never received any notification; does that mean the message is handled inside the MJPEG Decompressor filter itself?

I also tried SetSyncPoint(TRUE) on every sample sent to the MJPEG Decompressor, which also doesn't work; the MJPEG Decompressor still drops all the key frames.

I have previously tried setting the quality sink to my encoder filter, which also doesn't get any notification about quality change. Can anyone please help me narrow down this problem?

-- Stefan's reply

If your CPU hits 100%, the VMR7 will most likely send quality messages. So if you override AlterQuality() it will be called. I just wrote a message-killer filter which returns S_OK to stop quality messages from propagating further, and it works.

Here are some questions to consider:

How do you know that the MJPG Decompressor freezes?
Does this also happen with just the VMR7 connected to the Encoder?
What transformation does the Encoder perform?
How does the Encoder react to quality messages?

---- sathaadj's follow-up

Hi Stefan,

 

Thanks for your reply. I have removed the VMR7 from the graph, so the graph now looks like:

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Encoder --> Network

There are no VMRs in this graph, yet I still get this event and the MJPG Decompressor freezes.

 

Here are the answers to your questions:

>>>> How do you know that the MJPG Decompressor freezes?

I have checked whether I get any data at the encoder's input pin after the EC_QUALITY_CHANGE message; I don't get any buffers after the event occurs.

>>>> Does this also happen with just the VMR7 connected to the Encoder?
>>>> What transformation does the Encoder perform?

My encoder converts the RGB24 input to a network-streamable video (H.264), which is streamed through my network sink filter. I have given NULL media types for my encoder's output pin, which makes it impossible to connect it to the VMR7.

>>>> How does the Encoder react to quality messages?

I just wanted to check whether they originate from the VMR7 filter, so I returned S_OK from the encoder's AlterQuality() function; in any case, I did not get the EC_QUALITY_CHANGE event there.

 

Any sort of help is most welcome.... ;)

--------

That sounds as if quality messages are not part of your problem.

Did you try

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Encoder --> Null Renderer

and

Camera ---> MJPG Decompressor --> Inf Tee Filter --> Null Renderer ?

That should point you to which filter causes the problem.

The Infinite Tee filter uses a queue on every output pin, which can switch from direct sample delivery to asynchronous (threaded) delivery. This behaviour depends on the implementation of the ReceiveCanBlock() method of the downstream filters. What do your encoder and network sink return from this?

Regards,
Stefan.

March 25, 2009, 13:20

----------sathaadj

Stefan,

The graph is meant to show the local video as well as send it to the far end. When the camera is connected at VGA to the MJPG Decompressor filter, I am able to see the local video through the VMR7 properly. But as soon as I start the encoder to send data out to the network, I see the problem.

There are no problems when the camera runs at QVGA or CIF (as long as CPU usage stays low). Only at high resolutions (640*480 and above) do I start to see the problems. I have tested on another system where VGA encoding does not take much CPU, and it works fine there.

Is there a way to find out which filter sent this EC_QUALITY_CHANGE event, other than setting the quality sink, which won't be passed through if the filter handles the event on its own?

I still believe the MJPG decoder does some performance measurement on its own and decides to stop streaming. Any input on this view? Are there any other hidden interfaces, not documented in MSDN, to support these situations?

>>>>>>> What do your encoder and network sink return from this?

The encoder filter is derived from CTransformFilter, and I haven't coded a ReceiveCanBlock() function. The network sink filter is derived from CBaseFilter, and I have overridden CBaseInputPin::ReceiveCanBlock() to return S_FALSE. So when I start the graph I hit the network sink's ReceiveCanBlock call; I hope that the Infinite Pin Tee filter will see this result and take action.

Will the Infinite Tee start a new thread to deliver data to the encoder filter if it sees S_OK from a ReceiveCanBlock() call? I have coded ReceiveCanBlock in both the encoder and the network sink to return S_OK, and found the same results.

Thanks for your timely reply,
Satheesh.

--------------------------------------------------------------------------------

Final solution to the problem:

After all the filters are connected, disable time-stamp synchronization. This stops the graph from adjusting quality based on the actual frame rate of the incoming images, which was what prevented the remote desktop from playing video.

HRESULT CGraphBase::ConnectFilters()
{
	// ... connect all the filters in the graph, then:
	if (mMediaFilter && m_pVideoRenderer->GetRendererType() == Try_VMR7)
	{
		mMediaFilter->SetSyncSource(NULL);
	}
	return NOERROR;
}

// Member variable, initialized to NULL before the graph is created:
IMediaFilter*		mMediaFilter;

HRESULT CGraphBase::CreateGraph(HWND inWindow)
{
	// ... create the graph
	hr |= m_pGraph->QueryInterface(IID_IMediaFilter, (void **)&mMediaFilter);
	ATLASSERT(mMediaFilter != NULL);
	// ...
}

HRESULT CGraphBase::ReleaseGraph()
{
	// ...
	SAFE_RELEASE(mMediaFilter);
	// ... destroy the graph
}

 

Reposted from blog.csdn.net/lgs790709/article/details/84838792