[Three.js] Chapter 11 Textures

11. Textures

Introduction

Bored of your red cube? Time to add some texture.
But first, what are textures and what can we do with them?

What are textures?

As you probably know, textures are images that cover the surface of a geometry. Many types of textures exist, and they can have different effects on the appearance of the geometry. It's not just about the color.
Below are the most common texture types, illustrated with João Paulo's famous door textures. If you like his work, buy him a Ko-fi or become a patron on Patreon.

Color (or albedo)

The color (or albedo) texture is the simplest one. It just takes the pixels of the texture and applies them to the geometry.

Alpha

Alpha textures are grayscale images where the white parts are visible and the black parts are not.

Height

A height texture is a grayscale image that shifts the vertices to create some relief. You'll need to add subdivisions (segments) to the geometry if you want to see it.
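As a rough illustration only (it relies on MeshStandardMaterial and its displacementMap property, which are covered in later lessons, and assumes the colorTexture and heightTexture loaded later in this lesson):

// Extra segments give the plane enough vertices for the height map to displace
const planeGeometry = new THREE.PlaneGeometry(1, 1, 100, 100)
const planeMaterial = new THREE.MeshStandardMaterial({
    map: colorTexture,              // the color texture
    displacementMap: heightTexture, // the grayscale height texture
    displacementScale: 0.1          // how far white pixels push the vertices
})
const plane = new THREE.Mesh(planeGeometry, planeMaterial)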

Normal

Normal textures add small details. They don't move the vertices, but they trick the light into thinking the face is oriented differently. Normal textures are great for adding detail with good performance, because you don't need to subdivide the geometry.

Ambient occlusion

An ambient occlusion texture is a grayscale image that fakes shadows in the surface's crevices. While it's not physically accurate, it certainly helps with contrast.

Metalness

A metalness texture is a grayscale image that specifies which parts are metallic (white) and which are non-metallic (black). This information helps create reflections.

Roughness

A roughness texture is a grayscale image, often paired with the metalness one, that specifies which parts are rough (white) and which are smooth (black). This information helps dissipate the light. A carpet is so rough that you won't see light reflections on it, while a water surface is very smooth and you can see reflections in it. Here, the wood is uniform because there is a clear coat on top of it.

PBR

These textures (especially the metalness and roughness ones) follow what we call PBR principles. PBR stands for Physically Based Rendering. It regroups many techniques that tend to follow real-life directions to get realistic results.
While there are many other techniques, PBR is becoming the standard for realistic rendering, and many software packages, engines, and libraries use it.
For now, we'll just focus on how to load textures, how to use them, what transformations we can apply to them, and how to optimize them. We'll see more about PBR in later lessons, but if you're curious, you can already find more resources about it online.

How to load textures

Get the URL of the image

To load a texture, we need the URL of the image file.
Since we're using build tools, there are two ways of getting it.
You can put the image texture in the /src/ folder and import it like you would import a JavaScript dependency:

import imageSource from './image.png'

console.log(imageSource)

Or you can put the image in the /static/ folder and access it just by adding the image's path (without /static) to the URL:

const imageSource = '/image.png'

console.log(imageSource)

Note that this /static/ folder only works because of the Vite template's configuration. If you use another kind of scaffolding, you might need to adapt your project. We will use the /static/ folder
for the rest of the lesson.

Load the image

You can find the door textures we just saw in the /static/ folder, and there are multiple ways of loading them.

Use native JavaScript

With native JavaScript, first you have to create an Image instance, listen to the load event, and then change its src property to start loading the image:

const image = new Image()
image.onload = () =>
{
    console.log('image loaded')
}
image.src = '/textures/door/color.jpg'

You should see 'image loaded' appear in the console. As you can see, we set the src of the image to '/textures/door/color.jpg' without the /static part, because the content of the /static/ folder is served from the root of the site.
We cannot use that image directly. We need to create a texture from it first.
This is because WebGL needs a very specific format that the GPU can access, and also because some changes will be applied to the texture, like mipmapping, which we'll see more about later. Create the texture
using the Texture class:

const image = new Image()
image.addEventListener('load', () =>
{
    const texture = new THREE.Texture(image)
})
image.src = '/textures/door/color.jpg'

What we need to do now is use that texture constant in the material. Unfortunately, the texture variable is declared inside the callback function, and we cannot access it outside of that function. This is a JavaScript scope limitation. We could create the mesh inside the callback function,
but a better solution is to create the texture outside of the function and then update it once the image is loaded, by setting the texture's needsUpdate property to true:

const image = new Image()
const texture = new THREE.Texture(image)
image.addEventListener('load', () =>
{
    texture.needsUpdate = true
})
image.src = '/textures/door/color.jpg'

By doing this, you can use the texture variable right away, and the image will be transparent until it is loaded.
To see the texture on the cube, replace the color property with map and use the texture as the value:

const material = new THREE.MeshBasicMaterial({ map: texture })


You should see door textures on each side of the cube.

Use TextureLoader

The native JavaScript technique is not that complicated, but there is an even more straightforward way using TextureLoader. Instantiate a
TextureLoader and use its load(...) method to create the texture:

const textureLoader = new THREE.TextureLoader()
const texture = textureLoader.load('/textures/door/color.jpg')

Internally, Three.js will do the work of loading the image and updating the texture once it's ready. You can load as many textures as you want
with only one TextureLoader instance.
You can send 3 functions after the path. They will be called in the following cases:

  • load when the image has loaded successfully
  • progress while the loading is progressing
  • error if something went wrong

const textureLoader = new THREE.TextureLoader()
const texture = textureLoader.load(
    '/textures/door/color.jpg',
    () =>
    {
        console.log('loading finished')
    },
    () =>
    {
        console.log('loading progressing')
    },
    () =>
    {
        console.log('loading error')
    }
)

If the texture doesn't work, it might be useful to add these callbacks to see what's going on and catch the error.

Use the LoadingManager

Finally, if you have multiple images to load and want to share the events, such as being notified when all the images are loaded, you can use a LoadingManager.
Create an instance of the LoadingManager class and pass it to the TextureLoader:

const loadingManager = new THREE.LoadingManager()
const textureLoader = new THREE.TextureLoader(loadingManager)

You can listen to the various events by replacing the following properties with your own functions: onStart, onLoad, onProgress, and onError:

const loadingManager = new THREE.LoadingManager()
loadingManager.onStart = () =>
{
    console.log('loading started')
}
loadingManager.onLoad = () =>
{
    console.log('loading finished')
}
loadingManager.onProgress = () =>
{
    console.log('loading progressing')
}
loadingManager.onError = () =>
{
    console.log('loading error')
}

const textureLoader = new THREE.TextureLoader(loadingManager)

You can now start loading all the images you need:

// ...

const colorTexture = textureLoader.load('/textures/door/color.jpg')
const alphaTexture = textureLoader.load('/textures/door/alpha.jpg')
const heightTexture = textureLoader.load('/textures/door/height.jpg')
const normalTexture = textureLoader.load('/textures/door/normal.jpg')
const ambientOcclusionTexture = textureLoader.load('/textures/door/ambientOcclusion.jpg')
const metalnessTexture = textureLoader.load('/textures/door/metalness.jpg')
const roughnessTexture = textureLoader.load('/textures/door/roughness.jpg')

As you can see, we renamed the texture variable to colorTexture, so don't forget to change it in the material as well:

const material = new THREE.MeshBasicMaterial({ map: colorTexture })

The LoadingManager is useful if you want to show a loader and hide it only when all the assets are loaded, as sketched below. As we'll see in future lessons, you can also use it with other types of loaders.
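As a minimal sketch of that idea (the loadingBarElement and its CSS class are hypothetical and not part of the lesson; the url, itemsLoaded, and itemsTotal arguments are provided by the LoadingManager itself):

const loadingManager = new THREE.LoadingManager()
const loadingBarElement = document.querySelector('.loading-bar')

loadingManager.onProgress = (url, itemsLoaded, itemsTotal) =>
{
    // Ratio between 0 and 1 of the assets loaded so far
    loadingBarElement.style.transform = `scaleX(${itemsLoaded / itemsTotal})`
}
loadingManager.onLoad = () =>
{
    // Hide the loader once everything is ready
    loadingBarElement.classList.add('ended')
}

const textureLoader = new THREE.TextureLoader(loadingManager)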

UV unwrapping

While it's quite obvious how to place a texture on a cube, things can be a little trickier for other geometries.
Try replacing your BoxGeometry with some other geometry:

const geometry = new THREE.BoxGeometry(1, 1, 1)

// Or
const geometry = new THREE.SphereGeometry(1, 32, 32)

// Or
const geometry = new THREE.ConeGeometry(1, 1, 32)

// Or
const geometry = new THREE.TorusGeometry(1, 0.35, 32, 100)

As you can see, the texture is stretched or squeezed in different ways to cover the geometry.
This is called UV unwrapping. You can imagine it like unwrapping an origami or a candy wrapper to make it flat. Each vertex gets a 2D coordinate on a flat plane (usually a square).

You can actually see those 2D UV coordinates in the geometry.attributes.uv property:

console.log(geometry.attributes.uv)

These UV coordinates are generated by Three.js when you use its primitives. If you create your own geometry and want to apply a texture to it, you must specify the UV coordinates yourself, as sketched below.
If you are making the geometry in 3D software, you will also have to do the UV unwrapping there.

Don't worry; most 3D software also has an auto-unfold feature that should do the trick.
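Here is a minimal sketch (not from the lesson) of what specifying UVs by hand can look like: a single triangle whose three vertices are mapped to three corners of the texture.

const customGeometry = new THREE.BufferGeometry()

// One triangle: three vertices, each with an x, y, z position
const positions = new Float32Array([
    0, 0, 0,
    1, 0, 0,
    0, 1, 0
])

// One 2D UV coordinate per vertex, in the 0 to 1 range
const uvs = new Float32Array([
    0, 0, // bottom-left of the texture
    1, 0, // bottom-right
    0, 1  // top-left
])

customGeometry.setAttribute('position', new THREE.BufferAttribute(positions, 3))
customGeometry.setAttribute('uv', new THREE.BufferAttribute(uvs, 2))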

Transforming the texture

Let's go back to the cube with a texture and see what transformations we can apply to that texture.

Repeat

You can repeat the texture using the repeat property, which is a Vector2, meaning it has x and y properties.
Try changing these properties:

const colorTexture = textureLoader.load('/textures/door/color.jpg')
colorTexture.repeat.x = 2
colorTexture.repeat.y = 3


As you can see, the texture is not repeated, but it is smaller, and the last pixel appears stretched. That's because the texture is not set to repeat itself
by default. To change that, you have to update the wrapS and wrapT properties using the THREE.RepeatWrapping constant.

  • wrapS is for the x axis
  • wrapT is for the y axis

colorTexture.wrapS = THREE.RepeatWrapping
colorTexture.wrapT = THREE.RepeatWrapping


You can also alternate the direction with THREE.MirroredRepeatWrapping:

colorTexture.wrapS = THREE.MirroredRepeatWrapping
colorTexture.wrapT = THREE.MirroredRepeatWrapping

Offset

You can offset the texture using the offset property, which is also a Vector2 with x and y properties. Changing these will simply offset the UV coordinates:

colorTexture.offset.x = 0.5
colorTexture.offset.y = 0.5

Rotation

You can rotate the texture using the rotation property, which is a simple number corresponding to an angle expressed in radians:

colorTexture.rotation = Math.PI * 0.25


If you remove the offset and repeat properties, you'll see that the rotation occurs around the bottom-left corner of the cube's faces.

That corner is, in fact, the 0, 0 UV coordinate. If you want to change the pivot of that rotation, you can do so using the center property, which is also a Vector2:

colorTexture.rotation = Math.PI * 0.25
colorTexture.center.x = 0.5
colorTexture.center.y = 0.5

The texture will now rotate on its center.

Filtering and mipmapping

If you look at the top face of the cube, which is almost hidden, you'll see a very blurry texture.

This is due to filtering and mipmapping.
Mipmapping (or "mip mapping" with a space) is a technique that consists of creating half-smaller versions of a texture again and again until you get a 1x1 texture. For a 1024x1024 texture, for example, that means extra 512x512, 256x256, and so on down to 1x1 versions. All of these texture variations are sent to the GPU, and the GPU chooses the most appropriate version of the texture.
Three.js and the GPU already handle all of this; you just need to set which filter algorithm to use. There are two types of filter algorithms: the minification filter and the magnification filter.

Minification filter

The minification filter applies when the pixels of the texture are smaller than the pixels of the render. In other words, the texture is too big for the surface it covers.
You can change the minification filter of the texture using the minFilter property.
There are 6 possible values:

  • THREE.NearestFilter
  • THREE.LinearFilter
  • THREE.NearestMipmapNearestFilter
  • THREE.NearestMipmapLinearFilter
  • THREE.LinearMipmapNearestFilter
  • THREE.LinearMipmapLinearFilter

The default value is THREE.LinearMipmapLinearFilter. If you're not happy with how your texture looks, you should try the other filters.
We won't go through each of them, but we will test THREE.NearestFilter, which gives a very different result:

colorTexture.minFilter = THREE.NearestFilter

If you're using a device with a pixel ratio higher than 1, you won't see much of a difference. If not, move the camera so that the top face is almost hidden, and you should see more detail along with strange artefacts.
If you test with the checkerboard-1024x1024.png texture located in the /static/textures/ folder, you'll see these artefacts more clearly:

const colorTexture = textureLoader.load('/textures/checkerboard-1024x1024.png')

The artefacts you see are called moiré patterns, and you usually want to avoid them.

Magnification filter

The magnification filter works just like the minification filter, but for when the pixels of the texture are bigger than the pixels of the render. In other words, the texture is too small for the surface it covers.
You can see the result using the checkerboard-8x8.png texture, also located in the /static/textures/ folder:

const colorTexture = textureLoader.load('/textures/checkerboard-8x8.png')


The texture becomes blurry because it's a very small texture on a very large surface.
While you might think this looks bad, it's probably for the best. If the effect isn't too dramatic, users might not even notice it.
You can change the magnification filter of the texture using the magFilter property.
There are only two possible values:

  • THREE.NearestFilter
  • THREE.LinearFilter

The default value is THREE.LinearFilter.
If you test THREE.NearestFilter, you'll see that the base image is preserved and you get a pixelated texture:

colorTexture.magFilter = THREE.NearestFilter


This can be an advantage if you're going for a Minecraft style with pixelated textures.
You can see the result using the minecraft.png texture located in the /static/textures/ folder:

const colorTexture = textureLoader.load('/textures/minecraft.png')


One last word about these filters: THREE.NearestFilter is cheaper than the others, and you should get better performance when using it.
Mipmaps are only used with the minFilter property. If you are using THREE.NearestFilter as the minFilter, you don't need mipmaps, and you can deactivate them with colorTexture.generateMipmaps = false:

colorTexture.generateMipmaps = false
colorTexture.minFilter = THREE.NearestFilter

This will offload the GPU a bit.

Texture Formats and Optimization

When preparing your textures, keep 3 crucial elements in mind:

  • the weight
  • the size (or resolution)
  • the data

Weight

Don't forget that users visiting your website will have to download these textures. You can use most of the image types we use on the web, such as .jpg (lossy compression but usually lighter) or .png (lossless compression but usually heavier).
Try to apply the usual optimization methods to get an acceptable-looking image while keeping its file size as small as possible. You can use a compression website like TinyPNG (it also works with jpg) or any other software.

Size

Regardless of the weight of the image file, every pixel of the texture you use has to be stored on the GPU. And like your hard drive, the GPU has storage limitations. Even worse, the automatically generated mipmapping increases the number of pixels that have to be stored (see the rough estimate after this paragraph).
Try to keep the size of your images as small as possible.
If you remember what we said about mipmapping, Three.js will repeatedly produce a half-smaller version of the texture until it gets a 1x1 texture. Because of that, your texture width and height should be powers of 2. This is mandatory so that Three.js can keep dividing the size of the texture by 2.
Some examples: 512x512, 1024x1024, or 512x2048.
512, 1024, and 2048 can all be divided by 2 until they reach 1.
If you use a texture whose width or height is not a power of 2, Three.js will try to stretch it to the closest power of 2, which can lead to poor visual results, and you'll also get a warning in the console.
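As a rough, back-of-the-envelope estimate (the 4 bytes per pixel and the one-third mipmap overhead are common approximations, not exact figures from Three.js):

// A 1024x1024 texture stored as RGBA (4 bytes per pixel)
const baseBytes = 1024 * 1024 * 4       // 4 194 304 bytes, about 4 MB
// Mipmaps add roughly one third on top: 1 + 1/4 + 1/16 + ... ≈ 4/3
const withMipmaps = baseBytes * (4 / 3) // about 5.3 MB

console.log((withMipmaps / 1024 / 1024).toFixed(2) + ' MB')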

Data

We haven't tested it yet because we have other things to look at first, but textures support transparency. As you probably know, jpg files don't have an alpha channel, so you might prefer to use a png.
Or you can use an alpha map, as we'll see in a later lesson.
If you're using a normal texture (the purple one), you'll probably want exact values for the red, green, and blue channels of each pixel, or you might end up with visual glitches. For that you'll want to use a png, because its lossless compression preserves the pixel values.

Where to find textures

Unfortunately, it's always hard to find the perfect texture. There are many websites with textures, but not all of them are great, and you might have to pay for them.
It's probably a good idea to start with a web search. Below are some of the sites I visit frequently.

Always make sure you have the right to use a texture if it's not just for personal use. You can also create your own textures
with photos and 2D software such as Photoshop, or even procedural textures with software like Substance Designer.
