3D Engine and Rendering

Common 3D Engines

3D engines can be divided by platform into client-side 3D engines and Web-side 3D engines, and by purpose into game 3D engines and CAD/CAM/CAE 3D engines.

Common game engines include UE4 and Unity3D. The 3D engines used in CAD engineering fall into commercial and open-source camps: the best-known commercial ones are ACIS and Parasolid, and the best-known open-source one is OCCT. These 3D engines are built on top of underlying layers such as DirectX or OpenGL and can be called true 3D engines, whereas OpenGL and DirectX themselves can only be called graphics drawing interfaces.

The Web-side 3D engines mainly include Three.js, Babylon.js, Cesium.js, and others. These Web-side 3D engines are almost all developed on top of WebGL, and WebGL is in turn a browser-side API binding of OpenGL ES. Each engine has its own application scenarios: Three.js for visual presentation, Babylon.js for game development, and Cesium.js for GIS development.

Underlying Graphics Libraries

Direct3D and OpenGL are graphics rendering APIs that encapsulate the hardware layer. Through this API layer, an application can use the GPU to control graphics rendering, so the caller does not need to care about how the CPU calls the GPU, how the GPU allocates memory, how code is executed asynchronously, and so on.

To put it simply, DirectX covers more ground than OpenGL, which is mainly a graphics rendering library. From the perspective of graphics rendering technology, both are relatively low-level. They can be used directly as a 3D engine, but more often application software is not developed directly on top of them; instead it is developed on a dedicated 3D engine. For example, a game is developed on the UE4 game engine, and CAD 3D drawing software is developed on the ACIS CAD engine.

3D Graphics Rendering Pipeline

Rendering Technology

Rendering is one of the final stages of the 3D production pipeline. Think of it as combining all the information in the scene (objects, materials, lights, cameras) to produce a single final image or a series of images. This part of production is computationally intensive and can sometimes take hours, depending on the complexity of the scene, the desired quality, and the intended platform.

Graphics rendering styles are mainly divided into photorealistic rendering and non-photorealistic rendering (NPR). The goal of photorealistic rendering is to produce an image that looks like a photograph, while the goals of non-photorealistic rendering are more varied, mainly simulating artistic drawing styles and presenting hand-painted effects.

There are currently two main methods of creating 2D images from 3D scenes: rasterization and ray tracing.

Rasterization has been around since the earliest days of computer-generated graphics. The computer divides all objects in the scene into triangles, calculates their positions on the screen, overlays a grid of pixels, and then determines the color of each pixel based on material and transparency. Rasterization is useful when you need to see everything in the scene clearly and understand what is in front and what is behind. Viewports in 3D software use rasterization to display the geometry being created. When it comes to lighting and shadows, however, rasterization essentially "guesses", which makes the raw results look unrealistic, so many additional algorithms are layered on top to add realism. Rasterization is the rendering technique commonly used in real-time graphics.
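The triangle step can be made concrete with a small sketch. Below is a minimal, illustrative triangle rasterizer using edge functions; the framebuffer size, triangle coordinates, and flat "#" color are assumptions made up for the example, not taken from any particular engine.

```python
# Minimal sketch of triangle rasterization with edge functions.

def edge(ax, ay, bx, by, px, py):
    """Signed area test: > 0 if point (px, py) is left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height, color, framebuffer):
    # A bounding box limits which pixels we test against the triangle.
    min_x = max(0, int(min(v0[0], v1[0], v2[0])))
    max_x = min(width - 1, int(max(v0[0], v1[0], v2[0])))
    min_y = max(0, int(min(v0[1], v1[1], v2[1])))
    max_y = min(height - 1, int(max(v0[1], v1[1], v2[1])))
    for y in range(min_y, max_y + 1):
        for x in range(min_x, max_x + 1):
            # Sample at the pixel center; the pixel is covered when it
            # lies on the same side of all three edges.
            px, py = x + 0.5, y + 0.5
            w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                framebuffer[y][x] = color

width, height = 16, 8
fb = [["." for _ in range(width)] for _ in range(height)]
rasterize_triangle((1, 1), (14, 2), (6, 7), width, height, "#", fb)
print("\n".join("".join(row) for row in fb))
```

A real rasterizer would also interpolate depth and texture coordinates from the three edge values, but the coverage test above is the core of the technique.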

Ray tracing aims to create physically correct images, and it mimics how images reach our eyes: rays of light hit surfaces and reflect off them, and depending on the material the properties of the rays change, so that when a ray reaches our eyes it is perceived as having a specific color. Typically, ray tracers work in the opposite direction: a ray is shot from the camera (our point of view) toward the scene; when the ray hits a surface it bounces off (or passes through, if the surface is transparent) and spawns several other rays; these new rays hit other surfaces and bounce in turn, and so on, until a light source is reached. The last ray then takes the properties of that light source and carries this information back along the path, computing the color at each bounce point. Ray tracing is a recursive algorithm that performs a large number of calculations at once, so each bounce is limited to a finite number of new rays. As hardware has developed, ray-tracing renderers have become better and better at creating visuals that are indistinguishable from reality, so the technique is also widely used in film and architectural visualization. Ray tracing is the most commonly used method of photorealistic rendering today.
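A minimal sketch of this backward, recursive process is shown below, assuming a toy scene with a single sphere, one light direction, and one reflected ray per bounce (real tracers spawn several); the scene and material constants are illustrative assumptions.

```python
import math

MAX_DEPTH = 3  # limit recursion: each bounce spawns only one ray here

def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def scale(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def normalize(a): return scale(a, 1.0 / math.sqrt(dot(a, a)))

SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = normalize((1.0, 1.0, 1.0))  # direction toward the light

def intersect_sphere(origin, direction):
    """Return the nearest positive hit distance, or None on a miss."""
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, depth):
    t = intersect_sphere(origin, direction)
    if t is None:
        return 0.1  # background brightness
    hit = add(origin, scale(direction, t))
    normal = normalize(sub(hit, SPHERE_CENTER))
    # Direct lighting: a simple Lambert term from the light direction.
    direct = max(0.0, dot(normal, LIGHT_DIR))
    if depth >= MAX_DEPTH:
        return direct
    # Reflect the ray about the normal and recurse for the next bounce.
    reflected = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
    return 0.8 * direct + 0.2 * trace(hit, normalize(reflected), depth + 1)

print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), 0))
```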

Real-time Rendering and Non-real-time Rendering

3D Real-time Rendering

Real-time rendering is mainly used in games. The computer calculates and displays the rendered result in real time, at frame rates of roughly 20 to 120 frames per second, so realism must be maximized within a given frame rate. During image processing the computer uses some "tricks" so that the result appears "real" to the naked eye; these tricks include lens flare, depth of field, and motion blur. The computing power of the machine determines the realism of the rendering, which usually requires the assistance of a GPU.
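The fixed frame budget implied by those frame rates can be sketched as a simple render loop: at 60 fps each frame has roughly 16.7 ms. The render_frame stub and the 60 fps target below are illustrative assumptions, not any engine's actual API.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def render_frame(frame_index):
    pass  # stand-in for culling, rasterization, post-processing, ...

for frame in range(60):  # run for about one second
    start = time.perf_counter()
    render_frame(frame)
    elapsed = time.perf_counter() - start
    # Sleep away whatever is left of the budget; if rendering took
    # longer than the budget, the frame rate simply drops.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```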

3D Non-real-time Rendering

3D non-real-time rendering is usually used for film and video. Given the limited computing power of a computer, a more realistic result is achieved by extending the rendering time. Ray tracing and radiosity are techniques commonly used in non-real-time rendering to achieve a more realistic feel. As technology has developed, different forms of matter have gained more accurate simulation techniques, such as particle systems (to simulate rain, smoke, and fire), volume sampling (to simulate fog and dust), as well as caustics and subsurface scattering. Substances in different layers are rendered separately and composited into one final scene.
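As a sketch of the particle-system idea mentioned above, the following toy emitter updates particles under gravity and removes the dead ones; the emitter position, velocities, and lifetimes are illustrative assumptions.

```python
import random

GRAVITY = (0.0, -9.8)

class Particle:
    def __init__(self):
        self.pos = [0.0, 10.0]                   # emitter position
        self.vel = [random.uniform(-1, 1), 0.0]  # initial spread
        self.life = random.uniform(1.0, 3.0)     # seconds to live

def update(particles, dt):
    for p in particles:
        p.vel[0] += GRAVITY[0] * dt
        p.vel[1] += GRAVITY[1] * dt
        p.pos[0] += p.vel[0] * dt
        p.pos[1] += p.vel[1] * dt
        p.life -= dt
    # Dead particles are removed; a real emitter would also spawn new ones.
    particles[:] = [p for p in particles if p.life > 0]

particles = [Particle() for _ in range(100)]
for _ in range(60):  # simulate one second at 60 steps per second
    update(particles, 1.0 / 60.0)
print(len(particles), "particles still alive")
```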

Rendering-related Optics

Global Illumination

In addition to the basic light sources set in the 3D scene, global illumination also takes into account the light reflected between different objects; this indirect light is added into the rendering process to make the result more realistic.

Ray Tracing

In the rendering process, the camera defines the spatial relationship to the objects in the 3D scene, so the "camera" represents the position of the eye. In the real world we see objects because light hits them and reflects into the eye; ray tracing is the reverse of this process. In the classic illustration of this process, a view ray (View Ray) emitted from the camera passes through a pixel of the screen (every pixel is computed) and hits an object in the scene (Scene Object); by evaluating the reflective properties of the material, the ray is traced back to the light source (Light Source), so that every pixel obtains a computed result displaying the visual information at that location.
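Generating one view ray per screen pixel is the first step of this process. Below is a minimal sketch for a pinhole camera at the origin looking down -z; the image size and field of view are illustrative assumptions.

```python
import math

WIDTH, HEIGHT = 8, 4
FOV = math.radians(60)        # vertical field of view
ASPECT = WIDTH / HEIGHT
HALF_H = math.tan(FOV / 2)    # half-height of the image plane at z = -1
HALF_W = HALF_H * ASPECT

def pixel_ray(x, y):
    """Unit direction of the ray through the center of pixel (x, y)."""
    u = ((x + 0.5) / WIDTH * 2 - 1) * HALF_W
    v = (1 - (y + 0.5) / HEIGHT * 2) * HALF_H
    length = math.sqrt(u * u + v * v + 1)
    return (u / length, v / length, -1 / length)

# Each direction would be handed to a trace() routine like the one
# sketched earlier, and the returned color written to that pixel.
rays = [pixel_ray(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
print(len(rays), "rays, first:", rays[0])
```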

Phong Shading

The Phong lighting model mainly consists of three parts: ambient light (ambient), diffuse reflection (diffuse), and specular reflection (specular). Different materials present different visual information under the same light.
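A minimal sketch of evaluating the three Phong terms for one surface point follows; the coefficients ka, kd, ks and the shininess exponent are illustrative material assumptions.

```python
# Phong lighting: ambient + diffuse + specular for one surface point.
# All vectors are unit-length 3-tuples.

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.4, shininess=32):
    ambient = ka
    diffuse = kd * max(0.0, dot(normal, light_dir))
    # Reflect the light direction about the normal for the specular term.
    d = dot(normal, light_dir)
    reflect = tuple(2 * d * normal[i] - light_dir[i] for i in range(3))
    specular = ks * max(0.0, dot(reflect, view_dir)) ** shininess
    return ambient + diffuse + specular

n = (0.0, 0.0, 1.0)    # surface normal
l = (0.0, 0.6, 0.8)    # direction toward the light
v = (0.0, 0.0, 1.0)    # direction toward the viewer
print(phong(n, l, v))  # total brightness from the three terms
```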

Object Materials and Textures

The physical properties relevant to rendering include the viscosity of liquids, the refraction and reflection of light, the thickness of the material itself, the roughness of the surface texture, and so on. These parameters affect the direction and intensity of light, making the calculation multidimensional. The BRDF (Bidirectional Reflectance Distribution Function) is used to describe the relationship between incident light and the light reflected from a surface.
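As a sketch, the simplest BRDF is the Lambertian (perfectly diffuse) one, a constant albedo/π; reflected radiance is the BRDF value times the incoming radiance times the cosine of the incident angle. The albedo and angle below are illustrative assumptions.

```python
import math

def lambertian_brdf(albedo):
    """Lambertian BRDF: constant, independent of in/out directions."""
    return albedo / math.pi

def reflected_radiance(brdf_value, incoming_radiance, cos_theta_i):
    # Outgoing radiance = BRDF * incoming radiance * incident cosine.
    return brdf_value * incoming_radiance * max(0.0, cos_theta_i)

f_r = lambertian_brdf(albedo=0.8)
L_o = reflected_radiance(f_r, incoming_radiance=5.0,
                         cos_theta_i=math.cos(math.radians(30)))
print(L_o)
```

Glossy or rough materials replace this constant with direction-dependent functions, which is what makes the calculation multidimensional.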

Mathematical principles related to rendering

  • Rendering Equation: outgoing light energy = object self-emission + reflected light (see the equation below)
  • Spherical Harmonics (SH): spherical harmonic lighting functions
  • LightMap: precomputing the brightness of surfaces in a scene, allowing global illumination, shadows, and ambient lighting to be added at relatively low computational cost
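As a sketch of the first item above, the rendering equation is commonly written as follows: L_o is the outgoing radiance at point x in direction ω_o, L_e is the object's self-emission, f_r is the BRDF from the previous section, and the integral gathers incoming light L_i over the hemisphere Ω around the surface normal n.

```latex
% Rendering equation: outgoing radiance = emission + reflected light,
% integrated over the hemisphere of incoming directions.
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\,
    L_i(x, \omega_i)\,(\omega_i \cdot n)\,\mathrm{d}\omega_i
```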
