Unity engine basics

Scene view navigation

Hold the right mouse button to rotate the scene view

After selecting the view (hand) tool, pan the scene with the left mouse button

Hold down the right button and press W/A/S/D (move) and Q/E (down/up) to fly through the scene

Select an object and hold Alt (Option on macOS): drag with the left mouse button to orbit the view around the object; drag with the right mouse button to zoom.

Select an object and press F, or double-click it in the Hierarchy panel, to frame it as the center of the Scene view.

Quick transform tool shortcuts

Pan scene: Q

Move object: W

Rotate object: E

Scale object: R

Practical tips

Vertex snapping: after selecting an object, hold V, pick one of its vertices and drag it onto the target vertex.

Camera

Only what lies inside the camera's viewing frustum is rendered

Audio Listener: receives the audio emitted by Audio Source components in the scene and plays it through the computer's speakers

Clear Flags: controls how to fill the blank parts of the view (where no objects are drawn)

Skybox: wraps the entire scene and simulates the sky; its material can be replaced

Ctrl+Shift+F: quickly align the selected camera with the current Scene view

Drag the camera onto a parent object so that it follows the object's movement

A scene can have only one active Audio Listener

Projection: Perspective is 3D, Orthographic is 2D

A minimap camera is best set to Orthographic (with Clear Flags set to Depth Only so the blank parts are not rendered)

*Example: place a marker shape above the character's head and parent it to the character, so that the marker shows in the minimap camera but not in the main camera:

1. Create a mark Layer for the marker and a model Layer for the character, and assign them.

2. In the main camera's Culling Mask, keep model selected and deselect mark.

3. In the minimap camera's Culling Mask, select mark and deselect model.
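The Culling Mask steps above boil down to a 32-bit layer bitmask: a camera renders an object only if the bit for the object's layer is set. A minimal Python sketch of that bit logic (the layer indices and the MARK/MODEL names are illustrative, not Unity's API):

```python
# Sketch of a camera culling mask as a 32-bit layer bitmask.
MARK = 8    # hypothetical user layer "mark"
MODEL = 9   # hypothetical user layer "model"

def mask_with(*layers):
    """Build a culling mask that renders only the given layers."""
    m = 0
    for layer in layers:
        m |= 1 << layer
    return m

def renders(mask, layer):
    """A camera renders an object iff the object's layer bit is set."""
    return bool(mask & (1 << layer))

main_cam_mask = mask_with(MODEL)  # main camera: model only, mark deselected
map_cam_mask = mask_with(MARK)    # minimap camera: mark only, model deselected

assert renders(main_cam_mask, MODEL) and not renders(main_cam_mask, MARK)
assert renders(map_cam_mask, MARK) and not renders(map_cam_mask, MODEL)
```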

Material

Create a new Material in the Project panel

The Mesh Renderer component in the Inspector panel is responsible for rendering

Rendering Mode

Opaque: the default, fully opaque mode

Cutout: hard-edged transparency; areas below the alpha cutoff are removed

Transparent: adjust the A (alpha) value of Albedo to control transparency

Fade: lets the object fade out gradually

Main Maps

Albedo: the base map; determines the surface texture and color of the object

Metallic: uses a metalness value/map to simulate a metallic appearance

Specular: controls the specular reflection

Smoothness: how smooth (glossy) the surface is

Normal Map: describes the bumps and dents of the object's surface

Emission: self-illumination. Its Global Illumination option can be None (does not affect the environment), Realtime (dynamically affects lighting at runtime), or Baked (baked into lightmaps)

Tiling: how many times the texture repeats along each axis

Offset: slides the texture along each axis
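Tiling and Offset amount to a per-axis transform of the UV coordinates before the texture is sampled: uv' = uv × tiling + offset. A tiny sketch of that arithmetic (plain Python, not the Unity API):

```python
def transform_uv(uv, tiling, offset):
    """Apply material Tiling and Offset to a UV coordinate per axis."""
    return (uv[0] * tiling[0] + offset[0],
            uv[1] * tiling[1] + offset[1])

# Tiling (2, 2) makes the texture repeat twice along each axis:
assert transform_uv((0.5, 0.5), (2, 2), (0, 0)) == (1.0, 1.0)
# Offset slides the texture across the surface:
assert transform_uv((0.0, 0.0), (1, 1), (0.25, 0.0)) == (0.25, 0.0)
```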

Shader

A program embedded in the rendering pipeline that tells the GPU how to compute image effects.
Shader–material relationship:
The shader determines which features (such as texture and color) are available.

A material is just a panel of specific values for its shader

InstantOC (a third-party Unity occlusion-culling/LOD plugin)


Rendering pipeline

CPU is responsible for calculation and GPU is responsible for rendering

Draw call: a command from the CPU telling the GPU to render something; the draw-call count is the number of such commands issued in one frame

In Play mode, Batches in the Statistics window roughly equals the number of draw calls (generally one per object; lighting makes it more complicated). Tris is the triangle count and Verts is the vertex count.

Vertex processing: 1. Receive the model's vertex data sent by the CPU (any shape is composed of points and edges, i.e. of several triangles). 2. Convert between coordinate systems.

Primitive assembly: connects adjacent vertices to form triangle faces

Rasterization: computes which pixels each triangle face covers and produces interpolated parameters for the subsequent shading stage

Pixel processing: shades each pixel and writes the result into the buffers
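The rasterization stage above can be sketched with edge functions: a pixel center lies inside a counter-clockwise triangle when all three signed edge tests are non-negative, and those same three values give the barycentric weights used for interpolation. A toy Python version (illustrative only, not how a GPU is implemented):

```python
def edge(a, b, p):
    # Signed area of (a, b, p); non-negative when p is left of edge a->b
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri, width, height):
    """Return the pixel coordinates whose centers a CCW triangle covers."""
    covered = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            w0 = edge(tri[1], tri[2], p)
            w1 = edge(tri[2], tri[0], p)
            w2 = edge(tri[0], tri[1], p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                # w0, w1, w2 are also the (unnormalized) barycentric
                # weights used to interpolate shading parameters
                covered.append((x, y))
    return covered

pixels = rasterize([(0, 0), (4, 0), (0, 4)], 4, 4)
assert len(pixels) == 10  # 4+3+2+1 pixel centers inside the triangle
```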

Buffers

A block of memory that stores pixel data (the most important ones are the frame buffer and the depth buffer)

Frame buffer: stores each pixel's color, i.e. the rendered image. It always resides in video memory; the graphics card continuously reads it and outputs it to the screen.

Depth buffer (z-buffer): stores each pixel's depth (the distance from the object to the camera). During rasterization, each pixel's depth is computed; if the new depth is closer to the camera than the stored value, the pixel's color is written to the frame buffer and the new depth replaces the stored one.
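The depth-test rule just described can be sketched in a few lines. The fragment list, color names, and single-pixel buffers here are purely illustrative, assuming smaller depth means closer to the camera:

```python
# Toy z-buffer: keep, per pixel, the color of the closest fragment seen.
INF = float("inf")

def render(fragments, width=1, height=1):
    frame = [[None] * width for _ in range(height)]  # frame buffer (colors)
    depth = [[INF] * width for _ in range(height)]   # depth buffer (z-buffer)
    for x, y, z, color in fragments:
        if z < depth[y][x]:       # new fragment is closer to the camera
            depth[y][x] = z       # the new depth replaces the stored one
            frame[y][x] = color   # its color is written to the frame buffer
    return frame

# Three fragments land on the same pixel; the closest one wins
# regardless of submission order:
frame = render([(0, 0, 5.0, "red"), (0, 0, 2.0, "blue"), (0, 0, 9.0, "green")])
assert frame[0][0] == "blue"
```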

Occlusion Culling

Before objects are sent into the rendering pipeline, objects that cannot be seen from the camera's viewpoint are culled, reducing the amount of data to render and improving rendering performance. (Drawback: determining whether an object is occluded costs extra CPU time.)

Steps (using the InstantOC plugin):

1. Create layers

2. Assign the layer to every game object that should participate in occlusion culling (the IOClod script is attached automatically)

3. Add a collider component to the object

4. Attach the IOCcam script to the camera

LOD (Levels of Detail)

Based on an object's position and importance in the scene, decide how much rendering detail it receives: reduce the face count and detail of less important objects to render more efficiently. (Using a high-poly model up close and a low-poly model far away does not change the draw-call count.)
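The LOD idea can be sketched as picking a model variant by camera distance. The thresholds and variant names below are made up for illustration; they are not Unity LODGroup defaults:

```python
# Toy LOD table: (max distance, model variant), checked in order.
LODS = [
    (10.0, "high-poly"),       # within 10 units: full detail
    (30.0, "low-poly"),        # within 30 units: reduced face count
    (float("inf"), "culled"),  # beyond that: not rendered at all
]

def pick_lod(distance):
    """Return the model variant to render at the given camera distance."""
    for max_dist, variant in LODS:
        if distance <= max_dist:
            return variant

assert pick_lod(5.0) == "high-poly"
assert pick_lod(20.0) == "low-poly"
assert pick_lod(100.0) == "culled"
```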

Lighting system


Global Illumination (GI)

A lighting system capable of calculating direct light, indirect light, ambient light and reflected light

GI algorithms make the rendered lighting effects more realistic

Direct light

Provided by Light components added to the scene (the light's Culling Mask selects which objects it illuminates)

Shadows: You can switch cast shadows and receive shadows in Mesh Renderer. Turn off unnecessary shadows to optimize rendering.

*Quality settings: Edit > Project Settings > Quality

Shadow culling (set the shadow distance): Edit > Project Settings > Quality > Shadows > Shadow Distance

Ambient light

Lighting that affects all objects in the scene (set via Window > Rendering > Lighting > Environment Lighting)

Ambient Source: the source of the ambient light:

Skybox: ambient lighting is derived from the skybox's colors

Gradient: Sky (sky color), Equator (horizon color), Ground (ground color)

Ambient Color: a single solid color

Ambient Intensity: the intensity of the ambient light

Ambient GI mode: 1. Realtime, if the environment light source changes at runtime; 2. Baked, if the environment light source does not change

reflected light

A reflection effect computed on all objects from the skybox or a cubemap, controlled by the Reflection settings in Environment Lighting

Reflection Source: the source of the reflection

Indirect light

Light that bounces off the surface of an object after it receives light

(Controlled by the Bounce Intensity in the Light component)

(Indirect lighting can be viewed through the Irradiance mode of the Scene panel)

Mark immovable objects as static (Inspector > Static). Only objects marked Lightmapping Static can produce indirect bounce lighting.

Real-time GI

Steps:

1. Set the game objects to Lightmapping Static

2. Enable Precomputed Realtime GI in the Lighting panel

3. Click the Build button (if Auto is checked, the editor automatically detects scene changes and updates the lighting)

Edit > Preferences > GI Cache lets you modify the cache

Precomputed Realtime GI:

Realtime Resolution: the resolution of the real-time computation

CPU Usage: the higher the value, the faster real-time lighting is updated (at the cost of more CPU)

Baked GI

When a scene contains many objects, real-time lighting and shadows hurt game performance. With baking, lighting effects can be pre-rendered into textures that are applied to objects to simulate light and shadow, improving performance; this suits programs running on low-performance devices.

Sound

Unity supports formats: mp3, ogg, wav, aif, mod, it, s3m, xm

Sound is divided into 2D and 3D

2D: suitable for background music

3D: has a sense of space; louder up close and quieter far away

Producing sound in a scene relies mainly on two components:

Audio Listener: receives the sound emitted by Audio Source components in the scene and plays it through the computer's speakers

Audio Clip: Audio resources that need to be played

Mute: silences the audio source

Play on Awake: plays automatically when the scene starts

Loop: loops playback

Pitch: adjusts the pitch (playback speed)

Stereo Pan: left/right channel balance for 2D sound

Spatial Blend: blends between 2D and 3D

In 3D Sound Settings, the Volume Rolloff (attenuation) mode is usually set to Linear Rolloff (x-axis: distance from the sound source; y-axis: volume)

Min Distance: the distance at which attenuation begins

Max Distance: the distance at which attenuation ends
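Linear Rolloff as described (x-axis: distance, y-axis: volume) can be written as a simple function. Treat this as an illustrative sketch of the curve, not Unity's exact implementation:

```python
def linear_rolloff(distance, min_dist, max_dist):
    """Volume on a 0..1 scale: full volume inside min_dist, silent beyond
    max_dist, falling linearly with distance in between."""
    if distance <= min_dist:
        return 1.0
    if distance >= max_dist:
        return 0.0
    return (max_dist - distance) / (max_dist - min_dist)

assert linear_rolloff(0.0, 1.0, 10.0) == 1.0    # inside Min Distance
assert linear_rolloff(10.0, 1.0, 10.0) == 0.0   # at Max Distance
assert abs(linear_rolloff(5.5, 1.0, 10.0) - 0.5) < 1e-9  # halfway down
```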


Origin blog.csdn.net/m0_73241844/article/details/131504196