A Detailed Look at RenderTexture in Unity3D

What is a RenderTexture?

In Unity3D there is a special type of Texture called RenderTexture. In essence, it is a server-side texture object attached to a FrameBufferObject (FBO).

What is a server-side texture?

During rendering, a texture starts out in CPU-side memory; this copy is usually called the client-side texture. It is eventually uploaded to GPU memory so the GPU can use it for rendering; the copy that lives on the GPU is called the server-side texture. Copying texture data between the CPU and the GPU has to contend with bandwidth bottlenecks, so it should be kept to a minimum.

What is a FrameBufferObject?

The framebuffer is the destination of the GPU's rendering results: everything we draw (color, depth, stencil, and so on) ends up there. There is a default FBO that is attached directly to the display window, so drawing into it puts the objects we draw on screen. But a modern GPU can also create many additional FBOs that are not attached to the window. The purpose of these user-created FBOs is to keep rendering results in a region of GPU memory for later use, which is a very useful thing.
Once a result has been rendered into an FBO, there are several ways to get at it; the natural one is to obtain the result in the form of a texture. The usual options are:
● Read the FBO's contents back to the CPU side. In GLES this is generally done with glReadPixels(), which copies the currently readable FBO into a CPU-side buffer. If the readable FBO is the default one, this amounts to taking a screenshot; if it is an FBO you created yourself, it copies the rendered result back from GPU memory.
● Copy the FBO's contents into another texture on the GPU. In GLES this is generally glCopyTexImage2D(), which copies part of the readable FBO into a texture object that stays on the GPU. Because the copy remains server-side, the GPU can use it for rendering immediately.
● Attach a GPU texture object directly to the FBO, so that drawing into the FBO is drawing directly into the texture. This eliminates the copy entirely; in GLES the interface is generally glFramebufferTexture2D().
Unity's RenderTexture is an implementation of the third approach: it defines a server-side texture object and then draws directly into that texture.

What is it useful for?

We can render part of a scene into a texture, and that texture can then keep being used. For example, a car's rear-view mirror can have a RenderTexture attached to it, rendered from a camera placed at the mirror's viewpoint.
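A minimal sketch of that rear-view mirror setup (the component, the `mirrorCamera` field, and the texture size are placeholders of mine, not from the original post):

```csharp
using UnityEngine;

// Attach to the mirror quad; assumes a separate camera positioned
// at the mirror's viewpoint, looking backwards.
public class RearViewMirror : MonoBehaviour
{
    public Camera mirrorCamera;   // hypothetical camera at the mirror
    RenderTexture mirrorRT;

    void Start()
    {
        // Create a render texture and make the camera draw into it every frame.
        mirrorRT = new RenderTexture(512, 256, 16, RenderTextureFormat.ARGB32);
        mirrorCamera.targetTexture = mirrorRT;

        // Show the camera's output on this object's material.
        GetComponent<Renderer>().material.mainTexture = mirrorRT;
    }

    void OnDestroy()
    {
        mirrorCamera.targetTexture = null;
        mirrorRT.Release();
    }
}
```

Everything here stays on the GPU: the camera renders into the RT, and the mirror's material samples it directly.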

We can also use this for image processing. Conventional image processing runs a for loop on the CPU, handling one pixel at a time. Instead, we can render a full-screen quad with the source picture bound as a texture, run a fragment shader over it, and render the result into a RenderTexture; the RenderTexture then holds the processed image. Some of Unity's built-in image effects (blur, HDR, and so on) work exactly this way.

Several ways to render into a RenderTexture

① Create a RenderTexture in Assets and assign it to a camera, so that the camera renders into that texture in real time.
② Sometimes we want to control exactly when each render happens. In that case, disable the camera and call its render manually whenever a new frame is needed.
③ Sometimes we want to render into the RenderTexture with a special shader. For this, call the camera's RenderWithShader() function: it renders the scene using the specified shader, temporarily replacing the original shaders on the scene's objects, with parameters matched by name. What is this good for? Say you want a black-and-white view of the current scene: write a black-and-white shader and call this function. (There is also the related concept of replacement shaders; see the Unity documentation for details.)
④ We don't have to create the RenderTexture in Assets at all: the Graphics.Blit(src, target, mat) function renders directly into a render texture. Here target is the render texture to draw into, and src is what the material mat uses as its _MainTex; src can be an ordinary Texture2D or another RenderTexture. In essence, this function draws a full-screen quad with material mat, using src as the main texture: it first clears to black, then renders into target. This is a quick way to do image processing, and many of Unity's post-processing effects are a series of Graphics.Blit operations; doing the same heavy processing on the CPU would bring the frame rate to a crawl.
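Approaches ② and ④ can be sketched together as follows (the camera, material, and sizes are placeholder assumptions of mine):

```csharp
using UnityEngine;

// Sketch of manual rendering (②) and Graphics.Blit (④).
// "effectMaterial" is a placeholder material whose shader reads _MainTex.
public class RenderTextureWays : MonoBehaviour
{
    public Camera rtCamera;         // disabled in the Inspector for manual control
    public Material effectMaterial;
    public Texture sourceImage;

    void Start()
    {
        // ② Manual rendering: the camera is disabled, so nothing happens
        // until we explicitly call Render().
        var rt = new RenderTexture(256, 256, 16);
        rtCamera.enabled = false;
        rtCamera.targetTexture = rt;
        rtCamera.Render();          // one explicit render into rt

        // ④ Graphics.Blit: draw sourceImage through effectMaterial into target.
        var target = RenderTexture.GetTemporary(256, 256, 0);
        Graphics.Blit(sourceImage, target, effectMaterial);

        // ... use target here ...
        RenderTexture.ReleaseTemporary(target);
    }
}
```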

Getting results out of a RenderTexture

In most cases we render into an RT so that another material can keep using it as a texture. Then all we need to do is call SetTexture() on that material, passing it the RT; the whole operation stays on the GPU.
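For example, assuming the material's shader samples a texture property named "_MainTex":

```csharp
using UnityEngine;

public class UseRTOnMaterial : MonoBehaviour
{
    public RenderTexture rt;   // filled in elsewhere (camera target, Blit, ...)

    void Start()
    {
        // Bind the RT to this object's material; no CPU-side copy is involved.
        GetComponent<Renderer>().material.SetTexture("_MainTex", rt);
    }
}
```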

But sometimes we want to copy the result back to CPU-side memory, for example to save it as an image file or to inspect individual pixels. You cannot read per-pixel information from the RT directly, because it has no CPU-side storage. A Texture2D, by contrast, can keep a CPU-side mirror when its read/write attribute is enabled. So the way back is to copy the RT's contents from GPU memory into a readable Texture2D; note that this is not an efficient operation. The usual copy-back code looks like this:

Texture2D uvtexRead = new Texture2D(uvTexReadWidth, uvTexReadWidth, TextureFormat.ARGB32, false);

     RenderTexture currentActiveRT = RenderTexture.active;
     // Set the supplied RenderTexture as the active one
     RenderTexture.active = uvTex;
     uvtexRead.ReadPixels(new Rect(0, 0, uvTexReadWidth, uvTexReadWidth), 0, 0);
     uvtexRead.Apply();   // commit the read-back pixels (needed if sampled again)
     RenderTexture.active = currentActiveRT;

The code above works by first making uvTex the currently readable framebuffer (setting RenderTexture.active is Unity's equivalent of binding the readable FBO) and then calling ReadPixels() to read its contents back into the Texture2D's CPU-side memory.

Some other issues

① RenderTexture formats. RT formats are not the same thing as ordinary Texture2D formats. Consult the documentation and you will see that RTs support many formats; ARGB32 is certainly the most basic and universally supported, and many machines also support ARGBHalf or ARGBFloat. These floating-point formats are useful: imagine you want to store the scene's UV information in a texture, where each value should be a float rather than one of 256 levels. Before using such a format, though, you must query whether the current GPU supports it.
② If you want to copy a RenderTexture back to memory, the RT format and the Texture2D format must match, and must be a basic type such as ARGB32 or RGB24; do not try to copy a float format back this way.
③ Allocation and destruction. If you need to create RTs frequently, do not new them directly; use the GetTemporary and ReleaseTemporary functions that RenderTexture provides. They maintain an internal pool and repeatedly reuse RTs of the same format and size, because asking the GPU to allocate a brand-new texture is genuinely time-consuming. More importantly, GetTemporary also calls DiscardContents for you.
④ DiscardContents(). This RenderTexture interface is very important: it is good practice to call DiscardContents() every time before drawing into an RT. The usual explanation is that on tile-based GPUs, various synchronizations have to happen between tile memory and the RT. If you are about to draw into an RT that already holds contents, that triggers a synchronization; DiscardContents() tells the GPU that you do not need the old contents and will repaint everything anyway, avoiding the potentially huge cost of that synchronization. In short, either call it yourself or use GetTemporary, which handles this for you automatically.
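Putting ①, ③, and ④ together, here is a hedged sketch of a safe allocation pattern (the helper names are mine, not Unity API):

```csharp
using UnityEngine;

public static class RTUtil
{
    // Pick a floating-point format if the GPU supports it, else fall back.
    public static RenderTexture GetSceneDataRT(int width, int height)
    {
        RenderTextureFormat fmt = RenderTextureFormat.ARGB32;
        if (SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBFloat))
            fmt = RenderTextureFormat.ARGBFloat;
        else if (SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBHalf))
            fmt = RenderTextureFormat.ARGBHalf;

        // Pooled allocation: reuses an RT of the same size/format if one is
        // available, and discards stale contents so tile-based GPUs skip
        // the expensive synchronization.
        return RenderTexture.GetTemporary(width, height, 0, fmt);
    }

    public static void Release(RenderTexture rt)
    {
        RenderTexture.ReleaseTemporary(rt);  // return it to the pool
    }
}
```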


Origin: blog.csdn.net/qq_34562355/article/details/91881523