What are 3D textures?

Google has failed me; I could not find anything informative, so perhaps GameDev can :).

  • What are 3D textures?
  • When are they used?
  • Performance costs?
  • How are they stored?

I have many vague ideas, but no 'definitive' definition.

Any references to examples or tutorials are appreciated, especially on rendering particle systems via 3D textures.

3d textures

asked Mar 12 '11 at 6:27 by qreba47jhqb4e3lstrujvvdx (edited Nov 18 '11 at 16:15 by Josh)

4 Answers


32 votes (accepted)

A 3D texture works like a regular texture, but it is truly 3D. 2D textures have UV coordinates; 3D textures have UVW (you have probably used them already). The texture coordinates span the unit cube (0–1, 0–1, 0–1).
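To make the UVW idea concrete, here is a minimal sketch of a GLSL fragment shader that samples a 3D texture at a point in that unit cube, shown as a C string constant; the names `uVolume` and `vUVW` are invented for the example and not from any particular engine:

```c
/* Minimal sketch: a GLSL fragment shader, embedded as a C string, that
 * samples a 3D texture. uVolume and vUVW are illustrative names only. */
static const char *volume_frag_src =
    "#version 330 core\n"
    "uniform sampler3D uVolume;              // the 3D texture\n"
    "in vec3 vUVW;                           // texture coordinate in [0,1]^3\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    fragColor = texture(uVolume, vUVW); // one fetch using U, V and W\n"
    "}\n";
```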

Possible usage:

  • volumetric effects in games (fire, smoke, light rays, realistic fog)
  • caching light for real-time global illumination (CryEngine, for example)
  • scientific (MRI, CT scans are saved into volumes)

Performance:
Same as a regular texture: very fast (texture fetches are among the fastest memory accesses on the GPU). Reads are cached and optimized for the situation where nearby threads (pixel shaders) look up nearby values.

It can be sampled with point (nearest) filtering or with linear filtering, which gives native tri-linear interpolation (the 3D equivalent of bi-linear interpolation on a 2D texture).

In memory they are stored like an array of 2D texture slices.
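As a rough sketch of how that is set up through a graphics API, this is what creating a 64³ volume with tri-linear, mipmapped filtering could look like in OpenGL; the size, format and the function name `create_volume_texture` are assumptions for the example, and a context with these entry points loaded is presumed:

```c
#include <GL/gl.h>

/* Sketch: create a 64x64x64 RGBA8 volume texture with tri-linear, mipmapped
 * filtering. `voxels` is assumed to hold 64*64*64*4 bytes, slice after slice. */
GLuint create_volume_texture(const unsigned char *voxels)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);

    /* Upload the whole volume at once; the data is laid out as consecutive 2D slices. */
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, 64, 64, 64, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, voxels);

    /* Tri-linear filtering: linear within a mip level and linear between levels. */
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* The third coordinate (W, called R in OpenGL) gets its own wrap mode. */
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);

    glGenerateMipmap(GL_TEXTURE_3D);  /* each level halves width, height and depth */
    return tex;
}
```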

Mipmapping on a 3D texture:
[image: mip levels of a 3D texture]

answered Mar 12 '11 at 8:10 by Notabene (edited Nov 18 '11 at 13:07)

14 votes

3D textures, or 'volume textures', are a series of normal textures arranged as slices, like a deck of cards. These are used in volumetric rendering, which often takes real-world data such as CT scans and then manipulates it. In games and graphics they are sometimes used for volumetric effects like smoke, where you trade the flexibility of a particle system for the fixed cost and easy (and easily parallelisable) calculation of a volume texture. In a volume texture it's really easy to find the nearest neighbours, but this can be a problem in a more traditional particle system made up of loose points.
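To illustrate the "deck of cards" layout and why neighbour lookups are cheap, here is a small sketch in C; the `Volume` struct and the single-byte density format are invented for the example, not taken from any particular engine:

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch: a volume stored as depth slices of width x height texels,
 * one slice after another in a flat array. */
typedef struct {
    int width, height, depth;
    uint8_t *voxels;   /* width * height * depth density values */
} Volume;

/* Index of voxel (x, y, z): step along a row, then down a slice, then across slices. */
static size_t voxel_index(const Volume *v, int x, int y, int z)
{
    return ((size_t)z * v->height + (size_t)y) * (size_t)v->width + (size_t)x;
}

/* A neighbour is just a +/-1 offset on each axis; no spatial search structure
 * is needed, unlike with a loose set of particles. Outside the box counts as empty. */
static uint8_t neighbour_density(const Volume *v, int x, int y, int z,
                                 int dx, int dy, int dz)
{
    int nx = x + dx, ny = y + dy, nz = z + dz;
    if (nx < 0 || ny < 0 || nz < 0 || nx >= v->width || ny >= v->height || nz >= v->depth)
        return 0;
    return v->voxels[voxel_index(v, nx, ny, nz)];
}
```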

Also, some complex objects benefit from using volume textures, such as grass.

Nvidia's smoke in a box is a great example:

http://www.youtube.com/watch?v=9AS4xV-CK14

Also, rendering light scattering is easier when you have a volume to render through.

answered Mar 12 '11 at 7:51 by Luther (edited Jan 24 '14 at 15:13)

7 votes

Apart from the existing answers, I like how 3D textures can be created procedurally as well (it actually makes more sense than a 2D procedural texture), using noise and fractals that provide good, undistorted texture detail on surfaces at any angle without having to select a projection model (UV, planar, cylindrical, spherical and so on).

Instead of sampling a bitmap layer in 3D space, you get a distinct value for each point in 3D space from a simple algorithm, which basically gives you infinite texture resolution and no problems with stretched textures on angled polygons. That is 3D "textures", or rather surface shading, to me ^^
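A tiny sketch of that idea in C: a hash-based 3D value noise that returns a value for any point (x, y, z) directly, with nothing being looked up in a bitmap. The hash constants are arbitrary and purely illustrative:

```c
#include <math.h>

/* Hash an integer lattice point to a pseudo-random value in [0, 1).
 * The mixing constants are arbitrary, chosen only for this illustration. */
static float lattice_hash(int x, int y, int z)
{
    unsigned int h = (unsigned int)x * 73856093u
                   ^ (unsigned int)y * 19349663u
                   ^ (unsigned int)z * 83492791u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (float)(h & 0x00FFFFFFu) / 16777216.0f;
}

static float mix1(float a, float b, float t) { return a + (b - a) * t; }

/* Value noise: blend the eight lattice values surrounding (x, y, z).
 * Every point in space gets a value, so the "texture" has no fixed resolution;
 * summing a few octaves of this is the classic way to build dirt or marble patterns. */
float value_noise3(float x, float y, float z)
{
    int xi = (int)floorf(x), yi = (int)floorf(y), zi = (int)floorf(z);
    float tx = x - (float)xi, ty = y - (float)yi, tz = z - (float)zi;

    float x00 = mix1(lattice_hash(xi, yi,     zi    ), lattice_hash(xi + 1, yi,     zi    ), tx);
    float x10 = mix1(lattice_hash(xi, yi + 1, zi    ), lattice_hash(xi + 1, yi + 1, zi    ), tx);
    float x01 = mix1(lattice_hash(xi, yi,     zi + 1), lattice_hash(xi + 1, yi,     zi + 1), tx);
    float x11 = mix1(lattice_hash(xi, yi + 1, zi + 1), lattice_hash(xi + 1, yi + 1, zi + 1), tx);

    return mix1(mix1(x00, x10, ty), mix1(x01, x11, ty), tz);
}
```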

I'm actually pretty tired of all the bitmap texture stuff in game development; when I want to dirty down a space ship, I want to apply a few layers of noise formulas and not have to paint a limited-resolution bitmap. An old 3D buff can dream, can't he ;)

Right now, pre-baking these sweet old-school 3D movie surface solutions into color, normal and specular UV bitmaps is about the only fun I get, and it feels annoyingly limited. The demo scene (as always) tends to (pre)generate textures procedurally for real-time 3D, but it is hard to find good tools to support this workflow from design to run-time. Most game-dev procedural approaches seem to be about mesh generation. I wonder how much of the old-school 3D noise algorithms the shader pipeline, with the help of a depth map/deferred rendering, could handle?

...rather than trying to figure out how to map a texture image parametrically onto the surface of your shape, the volumetric nature of noise-based textures allows you simply to evaluate them at the (x, y, z) locations in your pixel shader. In this way, you are effectively carving your texture out of a solid material, which is often much more straightforward than trying to work out a reasonable undistorted parametric mapping.

These sorts of volumetric noise-based procedural textures have long been mainstays of feature films, where shaders do not require real-time performance... all special-effects films today make heavy use of noise-based procedural shaders... is highly dependent on the extensive use of the noise function within pixel shaders written in languages such as Pixar's RenderMan. With a good implementation of the noise function available for real-time use in GPUs, I look forward to seeing some of the exciting visual ideas from feature films incorporated into the next generation of interactive entertainment on the desktop and the console.

http://http.developer.nvidia.com/GPUGems/gpugems_ch05.html

answered Nov 18 '11 at 11:15 by Oskar Duveborn (edited Nov 19 '11 at 10:30)

  • Very interesting read! I'm having trouble visualising what you mean in your second paragraph though. – qreba47jhqb4e3lstrujvvdx Nov 18 '11 at 21:21

  • Perhaps I could make a diagram. What I mean is that instead of sampling a bitmap, the color (or whatever you're after) is simply decided by a noise algorithm. That way there's no interpolation between bitmap pixels needed (as you're bound to be sampling off a perfect 1x1 pixel position), and resolution becomes, in theory, infinite. It's a bit like actually calculating the interest for a specific interest rate instead of using a pre-calculated lookup table with certain steps/intervals (= bitmap resolution). – Oskar Duveborn Nov 19 '11 at 10:21

  • After reading the entirety of the referenced article, I have a much better understanding, but I think reference images would benefit both parties: the unsure and the tl;dr. :) – qreba47jhqb4e3lstrujvvdx Nov 19 '11 at 13:14


0 votes

3D texture patterns are applied to the voxels of a volumetric body in much the same way that 2D texture patterns are applied to the pixels of a 2D polygon. Here is an example of a 3D volumetric texture in JavaScript (requires WebGL):

http://kirox.de/test/Surface.html

Press [v], rotate the body, and change the gradient of the generated texture with the mouse wheel.

Reposted from blog.csdn.net/linuxheik/article/details/84871589