Interactive Particles with Three.js

This tutorial will demonstrate how to draw large numbers of particles using Three.js, and an efficient way to make particles react to mouse and touch input using shaders and offscreen textures.

1. Instanced geometry

Particles are created from the pixels of an image. Our image size is 320×180, or 57,600 pixels.

However, we don't need to create a separate geometry for each particle. We can create just one and render it 57,600 times with different parameters. This is called geometry instancing. In Three.js we use InstancedBufferGeometry to define the geometry, BufferAttribute for attributes that remain the same for every instance, and InstancedBufferAttribute for attributes that vary between instances (i.e. color, size). Note that newer versions of Three.js rename addAttribute (used below) to setAttribute.

The geometry of our particle is a simple quadrilateral consisting of 4 vertices and 2 triangles.

const geometry = new THREE.InstancedBufferGeometry();

// positions
const positions = new THREE.BufferAttribute(new Float32Array(4 * 3), 3);
positions.setXYZ(0, -0.5, 0.5, 0.0);
positions.setXYZ(1, 0.5, 0.5, 0.0);
positions.setXYZ(2, -0.5, -0.5, 0.0);
positions.setXYZ(3, 0.5, -0.5, 0.0);
geometry.addAttribute('position', positions);

// uvs
const uvs = new THREE.BufferAttribute(new Float32Array(4 * 2), 2);
uvs.setXY(0, 0.0, 0.0);
uvs.setXY(1, 1.0, 0.0);
uvs.setXY(2, 0.0, 1.0);
uvs.setXY(3, 1.0, 1.0);
geometry.addAttribute('uv', uvs);

// index
geometry.setIndex(new THREE.BufferAttribute(new Uint16Array([ 0, 2, 1, 2, 3, 1 ]), 1));

Next, we iterate over the pixels of the image and assign our instanced attributes. Since the name position is already taken, we use offset to store the position of each instance: the x, y coordinate of its pixel in the image. We also store the particle index and a random angle, which will be used later for the animation.

const indices = new Uint16Array(this.numPoints);
const offsets = new Float32Array(this.numPoints * 3);
const angles = new Float32Array(this.numPoints);

for (let i = 0; i < this.numPoints; i++) {
	offsets[i * 3 + 0] = i % this.width;
	offsets[i * 3 + 1] = Math.floor(i / this.width);

	indices[i] = i;

	angles[i] = Math.random() * Math.PI;
}

geometry.addAttribute('pindex', new THREE.InstancedBufferAttribute(indices, 1, false));
geometry.addAttribute('offset', new THREE.InstancedBufferAttribute(offsets, 3, false));
geometry.addAttribute('angle', new THREE.InstancedBufferAttribute(angles, 1, false));

2. Particle material

The material is a RawShaderMaterial with custom shaders particle.vert and particle.frag.

Uniforms are described as follows:

  • uTime: elapsed time, updated every frame
  • uRandom: random factor for displacing the particles in x, y
  • uDepth: maximum oscillation of the particles in z
  • uSize: base size of the particles
  • uTexture: image texture
  • uTextureSize: dimensions of the texture
  • uTouch: touch texture

const uniforms = {
	uTime: { value: 0 },
	uRandom: { value: 1.0 },
	uDepth: { value: 2.0 },
	uSize: { value: 0.0 },
	uTextureSize: { value: new THREE.Vector2(this.width, this.height) },
	uTexture: { value: this.texture },
	uTouch: { value: null }
};

const material = new THREE.RawShaderMaterial({
	uniforms,
	vertexShader: glslify(require('../../../shaders/particle.vert')),
	fragmentShader: glslify(require('../../../shaders/particle.frag')),
	depthTest: false,
	transparent: true
});
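To put the pieces on screen, the geometry and material are combined into a mesh, and uTime is advanced every frame. The sketch below keeps the THREE-specific lines as comments; the eased fade-in of uSize is an assumption for illustration, not taken from the demo.

```javascript
// Sketch: wire the instanced geometry and material into a mesh, then
// drive the uniforms once per frame from the render loop.

// const mesh = new THREE.Mesh(geometry, material);
// this.container.add(mesh);

const uniforms = {
  uTime: { value: 0 },
  uSize: { value: 0.0 }
};

function update(delta) {
  uniforms.uTime.value += delta;                             // drives the noise animation
  uniforms.uSize.value += (1 - uniforms.uSize.value) * 0.1;  // ease the particle size in (assumed)
}

// simulate three frames at ~60 fps
for (let i = 0; i < 3; i++) update(1 / 60);
```

Because uSize starts at 0, the particles are invisible on the first frame and grow in smoothly as the eased value approaches 1.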

A simple vertex shader would output the particle's position directly based on the particle's offset property. To make things more interesting, we use randomness and noise to displace the particles. The same goes for the size of the particles.

// particle.vert

void main() {
	// displacement
	vec3 displaced = offset;
	// randomise
	displaced.xy += vec2(random(pindex) - 0.5, random(offset.x + pindex) - 0.5) * uRandom;
	float rndz = (random(pindex) + snoise_1_2(vec2(pindex * 0.1, uTime * 0.1)));
	displaced.z += rndz * (random(pindex) * 2.0 * uDepth);

	// particle size
	float psize = (snoise_1_2(vec2(uTime, pindex) * 0.5) + 2.0);
	psize *= max(grey, 0.2); // grey = the particle's luma, sampled from uTexture earlier in the shader
	psize *= uSize;

	// (...)
}

The fragment shader samples the RGB color from the original image and converts it to greyscale using the luma method (0.21 R + 0.72 G + 0.07 B).

The alpha channel is determined by the distance from the center of the UV space, which essentially creates a circle. The border of the circle can be blurred with smoothstep.

// particle.frag

void main() {
	// pixel color (puv = the particle's coordinates in the image, passed from the vertex shader)
	vec4 colA = texture2D(uTexture, puv);

	// greyscale
	float grey = colA.r * 0.21 + colA.g * 0.72 + colA.b * 0.07;
	vec4 colB = vec4(grey, grey, grey, 1.0);

	// circle
	float border = 0.3;
	float radius = 0.5;
	float dist = radius - distance(uv, vec2(0.5));
	float t = smoothstep(0.0, border, dist);

	// final color
	color = colB;
	color.a = t;

	// (...)
}

3. Optimization

In our demo, we set the size of the particles based on their brightness, which means dark particles are barely visible. This leaves room for some optimizations. When iterating over the pixels of an image, we can discard those that are too dark. This reduces the number of particles and improves performance.

Optimization starts before we create the InstancedBufferGeometry. We create a temporary canvas, draw the image onto it and call getImageData() to retrieve the array of colors [R, G, B, A, R, G, B, … ]. We then define a threshold - hex #22, or decimal 34 - and test it against the red channel. The choice of red is arbitrary; we could use green or blue, or the average of all three, but a single channel is the simplest.

// discard pixels darker than threshold #22
if (discard) {
	numVisible = 0;
	threshold = 34;

	const img = this.texture.image;
	const canvas = document.createElement('canvas');
	const ctx = canvas.getContext('2d');

	canvas.width = this.width;
	canvas.height = this.height;
	ctx.scale(1, -1); // flip y
	ctx.drawImage(img, 0, 0, this.width, this.height * -1);

	const imgData = ctx.getImageData(0, 0, canvas.width, canvas.height);
	originalColors = Float32Array.from(imgData.data);

	for (let i = 0; i < this.numPoints; i++) {
		if (originalColors[i * 4 + 0] > threshold) numVisible++;
	}
}

We also need to update the loops that define the offset, angle and pindex to take the threshold into account.

for (let i = 0, j = 0; i < this.numPoints; i++) {
	if (originalColors[i * 4 + 0] <= threshold) continue;

	offsets[j * 3 + 0] = i % this.width;
	offsets[j * 3 + 1] = Math.floor(i / this.width);

	indices[j] = i;

	angles[j] = Math.random() * Math.PI;

	j++;
}
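The two loops can be checked end to end on a tiny synthetic "image". The sketch below uses a made-up 4×1 pixel row; only pixels whose red channel is strictly above the threshold survive, and the kept attributes are packed contiguously.

```javascript
// 4 pixels (RGBA), red channel 0, 34, 35, 255; threshold #22 = 34.
const width = 4;
const threshold = 34;
const data = [
  0, 0, 0, 255,      // too dark: discarded
  34, 34, 34, 255,   // exactly at threshold: discarded (test is strictly >)
  35, 35, 35, 255,   // kept
  255, 255, 255, 255 // kept
];
const numPoints = data.length / 4;

// first pass: count survivors so the typed arrays can be sized exactly
let numVisible = 0;
for (let i = 0; i < numPoints; i++) {
  if (data[i * 4 + 0] > threshold) numVisible++;
}

const offsets = new Float32Array(numVisible * 3);
const indices = new Uint16Array(numVisible);

// second pass: i walks all pixels, j walks only the kept instances
for (let i = 0, j = 0; i < numPoints; i++) {
  if (data[i * 4 + 0] <= threshold) continue;
  offsets[j * 3 + 0] = i % width;
  offsets[j * 3 + 1] = Math.floor(i / width);
  indices[j] = i;
  j++;
}

console.log(numVisible);          // → 2
console.log(Array.from(indices)); // → [2, 3]
```

Note that pindex keeps the original pixel index, so the shader can still look up the correct texel even though the instance buffers are compacted.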

4. Notes on interactivity

There are many different ways to introduce interactions with particles. For example, we could give each particle a velocity property and update it every frame based on how close it is to the cursor. It's a classic technique that works well, but it might be a bit too heavy if we have to cycle through tens of thousands of particles.
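For contrast, the classic CPU-side approach might look like the sketch below. The names repelRadius and damping are illustrative, not from the demo; the cost is that this function must run for every particle, every frame.

```javascript
// Classic per-particle velocity update: push particles away from the
// cursor, with damping so they ease back to rest.
const repelRadius = 50;
const damping = 0.9;

function updateParticle(p, cursor) {
  const dx = p.x - cursor.x;
  const dy = p.y - cursor.y;
  const dist = Math.hypot(dx, dy) || 1e-6; // avoid division by zero
  if (dist < repelRadius) {
    // push away, stronger the closer the cursor is
    const force = (repelRadius - dist) / repelRadius;
    p.vx += (dx / dist) * force;
    p.vy += (dy / dist) * force;
  }
  p.vx *= damping;
  p.vy *= damping;
  p.x += p.vx;
  p.y += p.vy;
}

const particle = { x: 10, y: 0, vx: 0, vy: 0 };
updateParticle(particle, { x: 0, y: 0 });
console.log(particle.x > 10); // → true, pushed away from the cursor
```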

A more efficient way is to do it in the shader. We can pass the cursor's position as a uniform and move the particles based on their distance from it. While this will perform faster, the result can be very dry. Particles go to the given position, but do not ease in or out.

5. Selected Interaction Method

The technique we chose for this demo is to paint the cursor position onto a texture. The advantage is that we can keep a history of cursor positions and create a trail. We can also apply an easing function to the radius of the trail so that it grows and shrinks smoothly. Everything happens in the shader, processing all particles in parallel.
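One way to shape such an easing is to give each trail point an age and derive its radius from it: grow quickly at birth, then shrink smoothly until the point expires. In the sketch below, maxAge and the 30% grow/shrink split are assumed values, not taken from the demo; each frame, every point would then be drawn as a soft circle of this radius onto the offscreen canvas.

```javascript
// Eased radius of a trail point over its lifetime.
const easeOutSine = (t) => Math.sin(t * Math.PI / 2);

function trailRadius(age, maxAge, maxRadius) {
  if (age < maxAge * 0.3) {
    // growing phase: first 30% of the lifetime
    return maxRadius * easeOutSine(age / (maxAge * 0.3));
  }
  // shrinking phase: remaining 70%
  return maxRadius * easeOutSine(1 - (age - maxAge * 0.3) / (maxAge * 0.7));
}

console.log(trailRadius(0, 100, 10));   // → 0 (just born)
console.log(trailRadius(30, 100, 10));  // → 10 (fully grown)
console.log(trailRadius(100, 100, 10)); // back to 0 at the end of its life
```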

To get the cursor position we use a Raycaster and a simple PlaneBufferGeometry the same size as our main geometry. The plane is invisible, but interactive.

Interactivity in Three.js is a topic in itself. See this example for reference.

When the cursor intersects the plane, we can use the UV coordinates in the intersection data to retrieve its position. The positions are then stored in an array (the trail) and drawn to an offscreen canvas. The canvas is passed to the shader as a texture via the uTouch uniform.
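One detail when drawing: the UV returned by the Raycaster has its origin at the bottom-left, while a canvas has its origin at the top-left, so the y coordinate needs a flip. A minimal sketch (canvasSize is an assumed name for the side length of the square touch canvas):

```javascript
// Map an intersection UV (0..1, origin bottom-left) to pixel coordinates
// on the offscreen touch canvas (origin top-left).
const canvasSize = 64;

function uvToCanvas(uv) {
  return {
    x: uv.x * canvasSize,
    y: (1 - uv.y) * canvasSize // flip y between UV space and canvas space
  };
}

const p = uvToCanvas({ x: 0.5, y: 0.25 });
console.log(p.x, p.y); // → 32 48
```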

In the vertex shader, the particles are displaced according to the brightness of the pixel in the touch texture.

// particle.vert

void main() {
	// (...)

	// touch
	float t = texture2D(uTouch, puv).r;
	displaced.z += t * 20.0 * rndz;
	displaced.x += cos(angle) * t * 20.0 * rndz;
	displaced.y += sin(angle) * t * 20.0 * rndz;

	// (...)
}

Original Link: Three.js Interactive Particles—BimAnt

Origin blog.csdn.net/shebao3333/article/details/130029631