Learn Creative Coding (#65) - 3D Particle Systems

We built 2D particle systems back in episode 11. Tiny circles with position, velocity, and a lifespan, drifting across a flat canvas. Then in episode 46 we moved particles to the GPU with compute shaders -- millions of points in a 2D flow field, all updated in parallel. Both episodes worked in two dimensions. Particles moved left-right, up-down, and that was it.
Now we're in Three.js territory. Three episodes of scene graphs, procedural geometry, and custom shaders. And the natural next step is: particles in 3D. Not on a canvas plane -- in actual three-dimensional space. Points that drift through volume. Clouds you can orbit around. Galaxies you can fly through. The same principles from ep011 (position, velocity, age, death) plus the same GPU techniques from ep046 (typed arrays, buffer attributes, shader-driven animation) -- but with a Z axis and a perspective camera.
Three.js has a built-in system for this. Points renders vertices as screen-facing squares (or textured sprites). Each vertex in a BufferGeometry becomes one particle. You give it position data as a typed array, same as the procedural geometry from ep063. The renderer draws each vertex as a point -- no triangles, no faces, just dots in space. Simple, fast, and it scales to hundreds of thousands of particles without breaking a sweat.
Points: the basics
The simplest 3D particle system in Three.js. Create a BufferGeometry, fill it with random positions, wrap it in a PointsMaterial:
import * as THREE from 'three';
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';
const scene = new THREE.Scene();
scene.background = new THREE.Color(0x050508);
const camera = new THREE.PerspectiveCamera(
60, window.innerWidth / window.innerHeight, 0.1, 100
);
camera.position.set(0, 0, 5);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.setPixelRatio(window.devicePixelRatio);
document.body.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, renderer.domElement);
controls.enableDamping = true;
// 10,000 particles at random positions inside a sphere
const count = 10000;
const positions = new Float32Array(count * 3);
for (let i = 0; i < count; i++) {
// random direction
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = Math.pow(Math.random(), 1 / 3) * 3; // cube root for uniform volume
positions[i * 3 + 0] = r * Math.sin(phi) * Math.cos(theta);
positions[i * 3 + 1] = r * Math.sin(phi) * Math.sin(theta);
positions[i * 3 + 2] = r * Math.cos(phi);
}
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
const material = new THREE.PointsMaterial({
color: 0x88ccff,
size: 0.03,
sizeAttenuation: true
});
const points = new THREE.Points(geometry, material);
scene.add(points);
That's it. A cloud of 10,000 particles distributed uniformly inside a sphere. The Math.pow(Math.random(), 1/3) trick for the radius is important -- if you just use Math.random() * 3, most particles cluster near the center because volume grows with the cube of the radius. The cube root corrects for this and gives you even distribution throughout the sphere volume. Same math principle as episode 24 (seed-based art) where we talked about distribution functions.
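If you want to convince yourself the correction works, here's a standalone sketch (plain JavaScript, no Three.js needed) that samples radii both ways and counts how many points land in the inner half of the sphere. Uniform volume density should put (1/2)^3 = 12.5% of points inside r < R/2; naive linear sampling puts half of them there:

```javascript
// Count how many sampled radii fall within the inner half-radius.
// With uniform-in-volume density, (1/2)^3 = 12.5% of points should land there.
function innerFraction(sampleRadius, radius, samples) {
  let inner = 0;
  for (let i = 0; i < samples; i++) {
    if (sampleRadius(radius) < radius / 2) inner++;
  }
  return inner / samples;
}

const R = 3;
const n = 100000;

// naive: linear in radius -- clusters toward the center
const naive = innerFraction(r => Math.random() * r, R, n);

// corrected: cube root -- uniform throughout the volume
const uniform = innerFraction(r => Math.pow(Math.random(), 1 / 3) * r, R, n);

console.log(naive.toFixed(3));   // ~0.500: half the samples crowd the inner half-radius
console.log(uniform.toFixed(3)); // ~0.125: matches the 1/8 volume fraction
```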
sizeAttenuation: true means particles farther from the camera appear smaller, just like real objects do with perspective. Particles near you are fat dots, particles far away are tiny specks. Turn this off and every particle is the same screen-space size regardless of distance -- useful for HUD elements or non-perspective effects, but for 3D clouds you definitely want attenuation.
Orbit around this. From outside, it's a fuzzy cloud. Fly into the center and you're surrounded by points in every direction. The simple act of putting points in 3D space and giving the viewer a camera they can move around creates something surprisingly immersive.
Adding color per particle
A uniform blue cloud is nice but let's add variety. Just like vertex colors in ep063, you can assign a color to each particle by adding a color attribute:
const colors = new Float32Array(count * 3);
for (let i = 0; i < count; i++) {
const x = positions[i * 3];
const y = positions[i * 3 + 1];
const z = positions[i * 3 + 2];
// distance from center
const dist = Math.sqrt(x * x + y * y + z * z);
const t = dist / 3; // normalized 0..1
// cool center, warm edges (like a nebula)
colors[i * 3 + 0] = 0.2 + t * 0.7; // r: increases outward
colors[i * 3 + 1] = 0.4 + (1 - t) * 0.4; // g: decreases outward
colors[i * 3 + 2] = 0.8 - t * 0.3; // b: slightly decreases
}
geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
// swap in a material that reads the per-vertex colors
const material = new THREE.PointsMaterial({
size: 0.03,
sizeAttenuation: true,
vertexColors: true // use per-vertex colors instead of uniform color
});
Now the cloud core is cool blue-green and the outer shell shifts toward warm pink-orange. Orbit around it and you see the color gradient from every angle. It reads like a nebula or a stellar nursery -- dense cool gas at the center, hot diffuse gas at the edges.
You could drive the color from anything. Height (y position), angle, noise, data values. Same concept as the height-based terrain coloring from ep063 -- map some property of each particle to a color space and the pattern emerges automatically.
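As a sketch of that idea in plain JavaScript (heightToColor and colorizeByHeight are hypothetical helpers, not Three.js API): map each particle's y position to a cool-to-warm gradient, the same way the loop above maps radius:

```javascript
// Map a particle's height (y) to an RGB color by linear interpolation
// between a cool "low" color and a warm "high" color.
// yMin/yMax define the range that gets normalized to 0..1.
function heightToColor(y, yMin, yMax) {
  const t = Math.min(1, Math.max(0, (y - yMin) / (yMax - yMin)));
  const low = [0.2, 0.4, 0.9];  // cool blue at the bottom
  const high = [1.0, 0.6, 0.2]; // warm orange at the top
  return low.map((c, i) => c + (high[i] - c) * t);
}

// fill a color attribute array from a position array, like the loop above
function colorizeByHeight(positions, colors, count, yMin, yMax) {
  for (let i = 0; i < count; i++) {
    const [r, g, b] = heightToColor(positions[i * 3 + 1], yMin, yMax);
    colors[i * 3] = r;
    colors[i * 3 + 1] = g;
    colors[i * 3 + 2] = b;
  }
}
```

Swap the input (angle, noise value, a data column) and the same two-stop gradient machinery keeps working.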
Particle textures: circles, stars, glows
By default, PointsMaterial renders each particle as a square. That's because WebGL points are just square sprites -- the GPU draws a quad centered on each vertex. To get circles or custom shapes, you use a texture:
// create a circle texture programmatically
function createCircleTexture(size) {
const canvas = document.createElement('canvas');
canvas.width = canvas.height = size;
const ctx = canvas.getContext('2d');
const center = size / 2;
const gradient = ctx.createRadialGradient(
center, center, 0,
center, center, center
);
gradient.addColorStop(0, 'rgba(255, 255, 255, 1.0)');
gradient.addColorStop(0.4, 'rgba(255, 255, 255, 0.8)');
gradient.addColorStop(1, 'rgba(255, 255, 255, 0.0)');
ctx.fillStyle = gradient;
ctx.fillRect(0, 0, size, size);
return new THREE.CanvasTexture(canvas);
}
const material = new THREE.PointsMaterial({
size: 0.08,
sizeAttenuation: true,
vertexColors: true,
map: createCircleTexture(64),
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending
});
A few things happening here. The texture is a radial gradient from white center to transparent edge -- this gives each particle a soft circular shape with a glowing falloff instead of a hard square.
transparent: true lets the alpha channel work. depthWrite: false prevents particles from blocking each other in the depth buffer -- without this, a nearby particle's transparent edge would occlude distant particles behind it, creating ugly rectangular cutouts. With depth write disabled, particles layer correctly.
AdditiveBlending is the secret sauce for the "glowing particles" aesthetic. Instead of the normal alpha blending (where semi-transparent objects dim what's behind them), additive blending ADDS the particle's color to whatever's already there. Where particles overlap, the colors accumulate and get brighter. Dense clusters glow intensely. Sparse areas are dim. It's the same effect as the additive blending we used in 2D particle systems (ep011) and shader feedback loops (ep036), just applied to Three.js materials.
The combination of soft circular texture + additive blending + depth write off = the classic particle cloud look you see in every music video, sci-fi movie, and creative coding portfolio. It genuinely never gets old :-)
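The blending arithmetic itself is easy to check in plain JavaScript. Normal alpha blending interpolates toward the incoming color; additive blending sums into the framebuffer, which clamps at 1.0:

```javascript
// Per-channel blend equations, as the GPU applies them per pixel.
// src = incoming particle color, dst = what's already in the framebuffer,
// a = source alpha.
function alphaBlend(src, dst, a) {
  // normal blending: semi-transparent src dims and covers dst
  return src.map((s, i) => s * a + dst[i] * (1 - a));
}
function additiveBlend(src, dst, a) {
  // THREE.AdditiveBlending: dst + src * alpha, clamped at 1.0
  return src.map((s, i) => Math.min(1, s * a + dst[i]));
}

const particle = [0.5, 0.7, 1.0]; // one soft blue particle
let fb = [0, 0, 0];               // dark background

// stack three particles on the same pixel: brightness accumulates
for (let i = 0; i < 3; i++) fb = additiveBlend(particle, fb, 0.5);
console.log(fb); // [0.75, 1, 1] -- green and blue have already clamped
```

With alphaBlend the same three overlaps would converge toward the particle color instead of blowing out to white, which is why additive is the one that "glows".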
Animating particles
Static clouds are pretty but animation brings them alive. Update the position array each frame, just like the dynamic geometry from ep063:
// store velocities
const velocities = new Float32Array(count * 3);
for (let i = 0; i < count; i++) {
velocities[i * 3 + 0] = (Math.random() - 0.5) * 0.005;
velocities[i * 3 + 1] = (Math.random() - 0.5) * 0.005;
velocities[i * 3 + 2] = (Math.random() - 0.5) * 0.005;
}
const clock = new THREE.Clock();
function animate() {
requestAnimationFrame(animate);
const dt = clock.getDelta();
const t = clock.getElapsedTime();
const pos = geometry.attributes.position.array;
for (let i = 0; i < count; i++) {
const i3 = i * 3;
// drift
pos[i3] += velocities[i3];
pos[i3 + 1] += velocities[i3 + 1];
pos[i3 + 2] += velocities[i3 + 2];
// gentle pull toward center (keeps the cloud from dispersing)
const cx = pos[i3], cy = pos[i3 + 1], cz = pos[i3 + 2];
const dist = Math.sqrt(cx * cx + cy * cy + cz * cz);
if (dist > 0.01) {
const pull = 0.0001;
pos[i3] -= cx / dist * pull;
pos[i3 + 1] -= cy / dist * pull;
pos[i3 + 2] -= cz / dist * pull;
}
}
geometry.attributes.position.needsUpdate = true;
// slow rotation of the whole cloud
points.rotation.y = t * 0.05;
controls.update();
renderer.render(scene, camera);
}
animate();
Each particle drifts along its velocity vector and is gently pulled back toward the center. Without the centering force, the cloud expands forever as particles drift outward. With it, the cloud breathes -- particles move outward, get pulled back, overshoot slightly, creating a living, pulsing volume. It's the same attractor concept from ep018 (physics lite) where we used spring forces to keep particles near a target.
needsUpdate = true tells Three.js to re-upload the position buffer to the GPU. This is the same flag we used in ep063 for animated terrain. Without it, the GPU keeps rendering the old positions and nothing moves.
3D noise drift
Random drift is ok but noise-driven drift is better. Remember 3D Perlin noise from ep035 (noise on the GPU) and ep064 (vertex displacement)? We can evaluate 3D noise at each particle's position and use it as a velocity field. Particles follow the noise, creating organic flowing motion:
// simplex or perlin noise function (import or inline)
// using the same hash-based approach from ep064's vertex shader
// but in JavaScript this time
function hash3D(x, y, z) {
let px = x * 443.897 % 1;
let py = y * 441.423 % 1;
let pz = z * 437.195 % 1;
if (px < 0) px += 1;
if (py < 0) py += 1;
if (pz < 0) pz += 1;
const dot = px * (py + 19.19) + py * (pz + 19.19) + pz * (px + 19.19);
return ((px + py) * pz + dot) % 1;
}
function noise3D(x, y, z) {
const ix = Math.floor(x), iy = Math.floor(y), iz = Math.floor(z);
const fx = x - ix, fy = y - iy, fz = z - iz;
const sx = fx * fx * (3 - 2 * fx);
const sy = fy * fy * (3 - 2 * fy);
const sz = fz * fz * (3 - 2 * fz);
const a = hash3D(ix, iy, iz);
const b = hash3D(ix + 1, iy, iz);
const c = hash3D(ix, iy + 1, iz);
const d = hash3D(ix + 1, iy + 1, iz);
const e = hash3D(ix, iy, iz + 1);
const f = hash3D(ix + 1, iy, iz + 1);
const g = hash3D(ix, iy + 1, iz + 1);
const h = hash3D(ix + 1, iy + 1, iz + 1);
const ab = a + (b - a) * sx;
const cd = c + (d - c) * sx;
const ef = e + (f - e) * sx;
const gh = g + (h - g) * sx;
const abcd = ab + (cd - ab) * sy;
const efgh = ef + (gh - ef) * sy;
return abcd + (efgh - abcd) * sz;
}
function animate() {
requestAnimationFrame(animate);
const t = clock.getElapsedTime();
const pos = geometry.attributes.position.array;
const col = geometry.attributes.color.array;
const noiseScale = 0.4;
const speed = 0.008;
for (let i = 0; i < count; i++) {
const i3 = i * 3;
const x = pos[i3], y = pos[i3 + 1], z = pos[i3 + 2];
// sample noise at slightly offset coordinates for each axis
const nx = noise3D(x * noiseScale + t * 0.1, y * noiseScale, z * noiseScale) - 0.5;
const ny = noise3D(x * noiseScale, y * noiseScale + t * 0.1 + 100, z * noiseScale) - 0.5;
const nz = noise3D(x * noiseScale, y * noiseScale, z * noiseScale + t * 0.1 + 200) - 0.5;
pos[i3] += nx * speed;
pos[i3 + 1] += ny * speed;
pos[i3 + 2] += nz * speed;
// containment: soft boundary at radius 4
const dist = Math.sqrt(x * x + y * y + z * z);
if (dist > 4) {
const pushback = (dist - 4) * 0.02;
pos[i3] -= (x / dist) * pushback;
pos[i3 + 1] -= (y / dist) * pushback;
pos[i3 + 2] -= (z / dist) * pushback;
}
// update color based on local noise (optional but gorgeous)
const n = noise3D(x * 0.3 + t * 0.05, y * 0.3, z * 0.3);
col[i3] = 0.2 + n * 0.6;
col[i3 + 1] = 0.3 + (1 - n) * 0.5;
col[i3 + 2] = 0.7 + n * 0.3;
}
geometry.attributes.position.needsUpdate = true;
geometry.attributes.color.needsUpdate = true;
controls.update();
renderer.render(scene, camera);
}
The key idea: sample 3D noise at the particle's current position, use the result as a velocity offset. Since noise is continuous and smooth, nearby particles get similar velocity offsets -- they flow together in coherent streams instead of jittering randomly. The noise creates invisible currents in 3D space and the particles reveal those currents as they flow through them.
This is the 3D equivalent of the 2D flow field from ep046 (compute shader particles). Same concept -- noise field drives particle motion -- but now particles can spiral upward, dive downward, and swirl around in 3D vortices. With the camera orbiting around, you can see the flow structure from every angle.
Adding time to the noise input (+ t * 0.1) makes the flow field itself evolve. Currents shift, vortices form and dissolve, streams merge and split. The particles are chasing a moving target. Without time evolution, particles eventually settle into stable flow lines. With it, they never stop exploring new paths.
Particle size and opacity per vertex
Sometimes you want particles to have different sizes. Big ones for nearby/important/energetic particles, small ones for distant/faint ones. Three.js doesn't support per-vertex point size with PointsMaterial alone -- you need a ShaderMaterial:
const vertexShader = `
attribute float aSize;
attribute float aOpacity;
varying float vOpacity;
void main() {
vOpacity = aOpacity;
vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
// size attenuation: smaller when farther away
gl_PointSize = aSize * (300.0 / -mvPosition.z);
gl_Position = projectionMatrix * mvPosition;
}
`;
const fragmentShader = `
uniform vec3 uColor;
varying float vOpacity;
void main() {
// circle shape: discard outside radius
vec2 center = gl_PointCoord - 0.5;
float dist = length(center);
if (dist > 0.5) discard;
// soft edge
float alpha = smoothstep(0.5, 0.3, dist) * vOpacity;
gl_FragColor = vec4(uColor, alpha);
}
`;
// per-particle size and opacity
const sizes = new Float32Array(count);
const opacities = new Float32Array(count);
for (let i = 0; i < count; i++) {
sizes[i] = 2.0 + Math.random() * 8.0;
opacities[i] = 0.3 + Math.random() * 0.7;
}
geometry.setAttribute('aSize', new THREE.BufferAttribute(sizes, 1));
geometry.setAttribute('aOpacity', new THREE.BufferAttribute(opacities, 1));
const shaderMaterial = new THREE.ShaderMaterial({
vertexShader,
fragmentShader,
uniforms: {
uColor: { value: new THREE.Color(0.5, 0.7, 1.0) }
},
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending
});
const shaderPoints = new THREE.Points(geometry, shaderMaterial);
scene.add(shaderPoints);
This is where ShaderMaterial from ep064 connects directly to particles. The vertex shader reads a custom aSize attribute and uses it to set gl_PointSize -- the size of the point sprite in pixels. The 300.0 / -mvPosition.z part handles perspective attenuation manually (farther particles = smaller).
The fragment shader uses gl_PointCoord -- a built-in variable that gives the UV coordinate within the point sprite, from (0,0) at the top-left to (1,1) at the bottom-right. We compute the distance from the center (0.5, 0.5) and discard pixels outside radius 0.5 to get a circle. smoothstep softens the edge. Multiply by the per-vertex opacity varying and you have particles with individual size AND individual transparency.
gl_PointCoord is the secret weapon for custom particle rendering. Inside that little coordinate space you can draw anything -- circles, rings, stars, crosses, custom shapes. It's basically a tiny per-particle fragment shader canvas. Same as writing a fullscreen shader (ep032) but in miniature, on each point.
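As an illustration, here's a ring shape written with the same distance logic -- a plain-JavaScript emulation of the fragment shader (with smoothstep reimplemented), not GPU code, but the math transfers to GLSL line for line:

```javascript
// GLSL smoothstep, reimplemented in JavaScript
function smoothstep(edge0, edge1, x) {
  const t = Math.min(1, Math.max(0, (x - edge0) / (edge1 - edge0)));
  return t * t * (3 - 2 * t);
}

// Alpha for a ring at a given gl_PointCoord (u, v in 0..1):
// transparent hole in the middle, opaque band around radius 0.3-0.4,
// fading to nothing at the sprite edge (the GLSL `discard` zone).
function ringAlpha(u, v) {
  const d = Math.hypot(u - 0.5, v - 0.5); // distance from sprite center
  if (d > 0.5) return 0;                  // outside the sprite: discard
  const outer = smoothstep(0.5, 0.4, d);  // fade in from the outside
  const inner = smoothstep(0.2, 0.3, d);  // fade in from the hole
  return outer * inner;
}

console.log(ringAlpha(0.5, 0.5));  // 0 -- transparent hole in the middle
console.log(ringAlpha(0.85, 0.5)); // 1 -- fully opaque on the ring body
console.log(ringAlpha(1.0, 0.5));  // 0 -- faded out at the sprite edge
```

Swap the two smoothstep bands for a step on angle and you get crosses or star spokes instead -- anything you can express as a function of (u, v).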
Emitter pattern: spawning and dying
So far all our particles are created at the start and live forever. A classic particle emitter continuously spawns NEW particles at a source point, gives them initial velocity, lets them age and die. Dead particles get recycled -- their position resets to the emitter and they start again. The pool stays the same size but particles are constantly born, fly, fade, and die.
const emitterCount = 20000;
const emitPos = new Float32Array(emitterCount * 3);
const emitVel = new Float32Array(emitterCount * 3);
const ages = new Float32Array(emitterCount);
const lifespans = new Float32Array(emitterCount);
const emitSizes = new Float32Array(emitterCount);
// initialize all particles as "dead" (age >= lifespan)
for (let i = 0; i < emitterCount; i++) {
ages[i] = 999;
lifespans[i] = 1;
}
function spawnParticle(index) {
const i3 = index * 3;
// emit from a point (or a small sphere)
emitPos[i3] = (Math.random() - 0.5) * 0.1;
emitPos[i3 + 1] = (Math.random() - 0.5) * 0.1;
emitPos[i3 + 2] = (Math.random() - 0.5) * 0.1;
// random upward-ish velocity
const angle = Math.random() * Math.PI * 2;
const upSpeed = 0.02 + Math.random() * 0.03;
const spread = 0.005 + Math.random() * 0.01;
emitVel[i3] = Math.cos(angle) * spread;
emitVel[i3 + 1] = upSpeed;
emitVel[i3 + 2] = Math.sin(angle) * spread;
ages[index] = 0;
lifespans[index] = 1.5 + Math.random() * 2.0;
emitSizes[index] = 3.0 + Math.random() * 5.0;
}
const emitGeo = new THREE.BufferGeometry();
emitGeo.setAttribute('position', new THREE.BufferAttribute(emitPos, 3));
emitGeo.setAttribute('aSize', new THREE.BufferAttribute(emitSizes, 1));
// shader that fades particles based on age
const emitVertShader = `
attribute float aSize;
uniform float uAges[${emitterCount}];
uniform float uLifespans[${emitterCount}];
varying float vLife;
void main() {
// this won't scale to 20k -- see note below
vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = aSize * (200.0 / -mvPos.z);
gl_Position = projectionMatrix * mvPos;
}
`;
Wait -- passing 20,000 uniform floats won't work. GPUs cap the number of uniform vectors a shader can use (often around 1024 vec4s). For per-particle data at this scale, you use buffer attributes, not uniforms. Let's fix that:
const ageAttr = new Float32Array(emitterCount);
emitGeo.setAttribute('aAge', new THREE.BufferAttribute(ageAttr, 1));
const lifeAttr = new Float32Array(emitterCount);
emitGeo.setAttribute('aLifespan', new THREE.BufferAttribute(lifeAttr, 1));
const emitVertShader2 = `
attribute float aSize;
attribute float aAge;
attribute float aLifespan;
varying float vLife;
void main() {
vLife = clamp(aAge / aLifespan, 0.0, 1.0);
vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
// shrink as they age
float sizeMult = 1.0 - vLife * 0.5;
gl_PointSize = aSize * sizeMult * (200.0 / -mvPos.z);
gl_Position = projectionMatrix * mvPos;
}
`;
const emitFragShader = `
varying float vLife;
void main() {
vec2 c = gl_PointCoord - 0.5;
float d = length(c);
if (d > 0.5) discard;
// fade out as particle ages
float alpha = smoothstep(0.5, 0.2, d) * (1.0 - vLife);
// color shift: hot white/yellow -> cool blue as it ages
vec3 young = vec3(1.0, 0.8, 0.4);
vec3 old = vec3(0.2, 0.3, 0.8);
vec3 col = mix(young, old, vLife);
gl_FragColor = vec4(col, alpha);
}
`;
And the animation loop for the emitter:
const spawnRate = 150; // max new particles per frame
let spawnIndex = 0;
function animateEmitter() {
requestAnimationFrame(animateEmitter);
const dt = clock.getDelta();
// spawn new particles
for (let s = 0; s < spawnRate; s++) {
if (ages[spawnIndex] >= lifespans[spawnIndex]) {
spawnParticle(spawnIndex);
}
spawnIndex = (spawnIndex + 1) % emitterCount;
}
// update all particles
for (let i = 0; i < emitterCount; i++) {
if (ages[i] < lifespans[i]) {
const i3 = i * 3;
// apply velocity
emitPos[i3] += emitVel[i3];
emitPos[i3 + 1] += emitVel[i3 + 1];
emitPos[i3 + 2] += emitVel[i3 + 2];
// gravity
emitVel[i3 + 1] -= 0.0003;
// age
ages[i] += dt;
// update attribute buffers
ageAttr[i] = ages[i];
lifeAttr[i] = lifespans[i];
}
}
emitGeo.attributes.position.needsUpdate = true;
emitGeo.attributes.aAge.needsUpdate = true;
emitGeo.attributes.aLifespan.needsUpdate = true;
controls.update();
renderer.render(scene, camera);
}
Particles spawn at the origin, shoot upward with some spread, slow down from gravity, and fade from hot yellow-white to cool blue as they age. When a particle's age exceeds its lifespan, it's "dead" and the spawn loop recycles it. The pool stays at 20,000 particles but they're constantly cycling through birth and death. The result looks like a fountain of sparks or a magical fire.
The key performance trick: we never create or destroy objects. The Float32Array is allocated once and individual entries get overwritten. No garbage collection, no allocation pressure. This is the same pool pattern from ep011 but scaled up for 3D.
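Stripped of all rendering, that pool pattern fits in a few lines. A minimal plain-JavaScript sketch that mirrors the arrays above:

```javascript
// Minimal particle pool: fixed-size arrays, recycle-on-death.
// Nothing is allocated after construction -- no GC pressure.
function createPool(size) {
  return {
    size,
    ages: new Float32Array(size).fill(Infinity), // start everyone dead
    lifespans: new Float32Array(size).fill(1),
    spawnIndex: 0,
  };
}

// try to revive up to `n` dead particles this frame
function spawn(pool, n) {
  let spawned = 0;
  for (let s = 0; s < n; s++) {
    const i = pool.spawnIndex;
    if (pool.ages[i] >= pool.lifespans[i]) { // dead -> recycle
      pool.ages[i] = 0;
      pool.lifespans[i] = 1.5 + Math.random() * 2.0;
      spawned++;
    }
    pool.spawnIndex = (pool.spawnIndex + 1) % pool.size;
  }
  return spawned;
}

// age the living; the dead just wait to be recycled
function update(pool, dt) {
  let alive = 0;
  for (let i = 0; i < pool.size; i++) {
    if (pool.ages[i] < pool.lifespans[i]) {
      pool.ages[i] += dt;
      alive++;
    }
  }
  return alive;
}
```

Run spawn + update every frame and the alive count settles at an equilibrium of roughly the spawn rate times the average lifespan in frames.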
InstancedMesh: shaped particles
Points always renders flat camera-facing sprites -- squares by default, circles or whatever your point texture gives you. But what if you want particles with actual geometry -- tiny triangles, diamonds, cubes, custom shapes? That's InstancedMesh. Instead of rendering one mesh, it renders thousands of copies with per-instance transforms:
// the shape each particle will have
const particleGeo = new THREE.TetrahedronGeometry(0.03);
const particleMat = new THREE.MeshBasicMaterial({
color: 0x88aaff,
wireframe: true
});
const instanceCount = 5000;
const instancedMesh = new THREE.InstancedMesh(
particleGeo,
particleMat,
instanceCount
);
const dummy = new THREE.Object3D();
const instPositions = [];
for (let i = 0; i < instanceCount; i++) {
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = Math.pow(Math.random(), 1 / 3) * 3;
const x = r * Math.sin(phi) * Math.cos(theta);
const y = r * Math.sin(phi) * Math.sin(theta);
const z = r * Math.cos(phi);
dummy.position.set(x, y, z);
dummy.rotation.set(
Math.random() * Math.PI,
Math.random() * Math.PI,
Math.random() * Math.PI
);
dummy.updateMatrix();
instancedMesh.setMatrixAt(i, dummy.matrix);
instPositions.push({ x, y, z, vx: 0, vy: 0, vz: 0 });
}
instancedMesh.instanceMatrix.needsUpdate = true;
scene.add(instancedMesh);
5,000 tiny wireframe tetrahedra floating in a sphere. Each one is a full 3D shape with actual edges and faces, not just a flat sprite. The GPU renders all 5,000 in a single draw call -- that's the magic of instancing. Without it, 5,000 separate meshes would mean 5,000 draw calls, which would tank performance. With instancing, the GPU processes the same geometry 5,000 times but with different transform matrices.
The dummy Object3D is a trick. Instead of building 4x4 matrices by hand (which is tedious and error-prone), we set position and rotation on a dummy object, call updateMatrix(), and copy its matrix to the instanced mesh. Three.js builds the matrix for us.
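For the curious, here's roughly what updateMatrix() composes under the hood for a position plus a Y rotation -- a by-hand sketch in plain JavaScript (the real method also folds in X/Z rotation and scale), using the column-major layout Three.js and WebGL use:

```javascript
// Compose a 4x4 transform (rotation about Y, then translation),
// stored column-major -- the layout Three.js uploads to the GPU.
// A hand-rolled stand-in for dummy.updateMatrix() with only rotation.y set.
function composeMatrix(x, y, z, angleY) {
  const c = Math.cos(angleY), s = Math.sin(angleY);
  return new Float32Array([
     c, 0, -s, 0,   // column 0: rotated X axis
     0, 1,  0, 0,   // column 1: Y axis unchanged
     s, 0,  c, 0,   // column 2: rotated Z axis
     x, y,  z, 1,   // column 3: translation
  ]);
}

// apply it to a point [px, py, pz]
function transformPoint(m, px, py, pz) {
  return [
    m[0] * px + m[4] * py + m[8] * pz + m[12],
    m[1] * px + m[5] * py + m[9] * pz + m[13],
    m[2] * px + m[6] * py + m[10] * pz + m[14],
  ];
}
```

Seeing it spelled out makes it obvious why the dummy object is the nicer API: one typo in those sixteen slots and every instance is silently wrong.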
To animate instanced particles, update the matrices each frame:
function animate() {
requestAnimationFrame(animate);
const t = clock.getElapsedTime();
for (let i = 0; i < instanceCount; i++) {
const p = instPositions[i];
// noise drift
const nx = noise3D(p.x * 0.5 + t * 0.1, p.y * 0.5, p.z * 0.5) - 0.5;
const ny = noise3D(p.x * 0.5, p.y * 0.5 + t * 0.1 + 50, p.z * 0.5) - 0.5;
const nz = noise3D(p.x * 0.5, p.y * 0.5, p.z * 0.5 + t * 0.1 + 100) - 0.5;
p.x += nx * 0.005;
p.y += ny * 0.005;
p.z += nz * 0.005;
dummy.position.set(p.x, p.y, p.z);
dummy.rotation.x = t * 0.3 + i * 0.01;
dummy.rotation.y = t * 0.2 + i * 0.007;
dummy.updateMatrix();
instancedMesh.setMatrixAt(i, dummy.matrix);
}
instancedMesh.instanceMatrix.needsUpdate = true;
controls.update();
renderer.render(scene, camera);
}
Each tetrahedron drifts through a 3D noise field and slowly rotates. Because they're wireframe, you can see through them -- the overlapping transparent edges create a delicate crystal-lattice aesthetic. Change the geometry to ConeGeometry(0.02, 0.06, 4) and you get tiny arrows showing flow direction. Use OctahedronGeometry(0.02) for diamond-like sparks. The shape of the particle completely changes the visual character of the system.
The difference between Points and InstancedMesh: Points are always camera-facing sprites (billboards). They're cheaper to render and scale to millions of particles. InstancedMesh gives you real 3D geometry with actual depth, rotation, and shape -- but costs more per instance. For most creative coding, 5,000-50,000 instanced meshes run fine on modern GPUs. For 100,000+ particles, stick with Points.
Creative exercise: galaxy
Alright, let's build something you can stare at for a while. A spiral galaxy with half a million particles, spiral arm structure from parametric equations, additive blending for the glow, and slow rotation you can orbit around:
import * as THREE from 'three';
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';
const scene = new THREE.Scene();
scene.background = new THREE.Color(0x010102);
const camera = new THREE.PerspectiveCamera(
60, window.innerWidth / window.innerHeight, 0.1, 200
);
camera.position.set(0, 4, 8);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.setPixelRatio(window.devicePixelRatio);
document.body.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, renderer.domElement);
controls.enableDamping = true;
const starCount = 500000;
const positions = new Float32Array(starCount * 3);
const colors = new Float32Array(starCount * 3);
const sizes = new Float32Array(starCount);
const arms = 4;
const armSpread = 0.4;
const maxRadius = 6;
for (let i = 0; i < starCount; i++) {
const i3 = i * 3;
// which arm
const arm = Math.floor(Math.random() * arms);
const armAngle = (arm / arms) * Math.PI * 2;
// distance from center
const r = Math.pow(Math.random(), 0.7) * maxRadius;
// spiral angle: increases with distance
const spiralAngle = armAngle + r * 1.2;
// scatter around the arm
const scatter = (Math.random() - 0.5) * armSpread * (1 + r * 0.3);
const scatterY = (Math.random() - 0.5) * 0.15 * (1 + r * 0.1);
const angle = spiralAngle + scatter;
positions[i3] = Math.cos(angle) * r;
positions[i3 + 1] = scatterY;
positions[i3 + 2] = Math.sin(angle) * r;
// color: hot white center, blue arms, reddish outer
const t = r / maxRadius;
if (t < 0.15) {
// core: bright warm white
colors[i3] = 1.0;
colors[i3 + 1] = 0.95;
colors[i3 + 2] = 0.8;
} else if (t < 0.6) {
// mid: blue-white
const mt = (t - 0.15) / 0.45;
colors[i3] = 0.6 - mt * 0.2;
colors[i3 + 1] = 0.7 - mt * 0.1;
colors[i3 + 2] = 1.0;
} else {
// outer: dimmer, slightly reddish
const ot = (t - 0.6) / 0.4;
colors[i3] = 0.5 + ot * 0.3;
colors[i3 + 1] = 0.3 - ot * 0.1;
colors[i3 + 2] = 0.4 - ot * 0.2;
}
// size: larger in the core, smaller at edges
sizes[i] = (1 - t * 0.7) * (1.0 + Math.random() * 2.0);
}
const galaxyGeo = new THREE.BufferGeometry();
galaxyGeo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
galaxyGeo.setAttribute('color', new THREE.BufferAttribute(colors, 3));
galaxyGeo.setAttribute('aSize', new THREE.BufferAttribute(sizes, 1));
The spiral arm structure comes from a simple trick: for each star, pick one of N arms (evenly spaced around 360 degrees), then add an angle that increases with distance. This creates the spiral. Adding random scatter spreads stars around the arm centerline so it's not a thin line. More scatter at larger radii makes the arms wider as they spiral outward -- just like real galaxies.
The Math.pow(Math.random(), 0.7) for radius biases the distribution toward the center. More stars in the core, fewer at the edges. Combined with the per-distance size (larger in the core), the center glows densely while the outer arms are sparse and delicate.
Now the shader for per-vertex size:
const galaxyVertShader = `
attribute float aSize;
varying vec3 vColor;
void main() {
vColor = color;
vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = aSize * (250.0 / -mvPos.z);
gl_PointSize = max(gl_PointSize, 0.5); // minimum visible size
gl_Position = projectionMatrix * mvPos;
}
`;
const galaxyFragShader = `
varying vec3 vColor;
void main() {
vec2 c = gl_PointCoord - 0.5;
float d = length(c);
if (d > 0.5) discard;
float alpha = smoothstep(0.5, 0.0, d);
alpha *= alpha; // sharper falloff
gl_FragColor = vec4(vColor * alpha, alpha);
}
`;
const galaxyMat = new THREE.ShaderMaterial({
vertexShader: galaxyVertShader,
fragmentShader: galaxyFragShader,
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending,
vertexColors: true
});
const galaxy = new THREE.Points(galaxyGeo, galaxyMat);
scene.add(galaxy);
const clock = new THREE.Clock();
function animate() {
requestAnimationFrame(animate);
const t = clock.getElapsedTime();
galaxy.rotation.y = t * 0.03;
controls.update();
renderer.render(scene, camera);
}
animate();
window.addEventListener('resize', () => {
camera.aspect = window.innerWidth / window.innerHeight;
camera.updateProjectionMatrix();
renderer.setSize(window.innerWidth, window.innerHeight);
});
Half a million stars, four spiral arms, additive blending, slow rotation. Orbit around it. See it from above -- the spiral structure is clear, arms wrapping around a bright core. Tilt to edge-on and it's a thin disc with a bright central bulge. Zoom into the core and you're inside a dense cloud of stars. The additive blending means the core glows intensely where thousands of particles overlap.
This runs at 60fps on any modern GPU because Points is extremely efficient -- each particle is just one vertex and a tiny fragment shader per pixel it covers. Half a million vertices is nothing for a modern graphics card. You could push to 2 million before hitting performance limits on most hardware.
The galaxy doesn't need per-frame position updates. The only animation is the slow rotation.y. The spiral structure is baked into the positions at creation time. If you wanted differential rotation (inner stars rotating faster than outer ones, like real galaxies), you'd store each star's radius and update positions each frame using angle = baseAngle + time * speed / radius. But the static version with whole-body rotation already looks spectacular.
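A sketch of that differential-rotation update (baseAngles and radii are hypothetical arrays you'd save per star at creation time):

```javascript
// Differential rotation: inner stars complete orbits faster,
// so the arms wind up over time like in a real galaxy.
// baseAngles/radii hold each star's creation-time angle and radius;
// rotSpeed is a global angular-speed constant.
function updateGalaxyPositions(positions, baseAngles, radii, count, time, rotSpeed) {
  for (let i = 0; i < count; i++) {
    const r = radii[i];
    // angular velocity falls off with radius (clamped near the core)
    const angle = baseAngles[i] + (time * rotSpeed) / Math.max(r, 0.1);
    positions[i * 3] = Math.cos(angle) * r;
    positions[i * 3 + 2] = Math.sin(angle) * r;
    // y (disc thickness) stays whatever it was at creation
  }
}
```

You'd call this each frame and set geometry.attributes.position.needsUpdate = true, as usual -- with the CPU-loop caveat from the performance section below in mind at 500,000 stars.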
Performance considerations
A few numbers to keep in mind for 3D particle systems:
Points with PointsMaterial: ~1-2 million particles at 60fps. Each particle is one vertex, one draw call for the whole system. The bottleneck is usually the fragment shader (how many pixels each particle covers) not the vertex count.
Points with ShaderMaterial and per-vertex attributes: same ballpark, maybe slightly less depending on shader complexity. Custom vertex attributes add some overhead but it's minimal.
InstancedMesh: ~10,000-100,000 instances at 60fps depending on geometry complexity. Each instance has a 4x4 matrix (16 floats) that needs to be uploaded every frame if animated. The geometry is processed once and reused for all instances.
CPU-side updates (modifying Float32Array in JavaScript): this is usually the bottleneck for animated particles. Looping over 500,000 particles in JavaScript every frame is slow. For noise-driven animation of that many particles, you'd want to move the computation to the GPU using either vertex shader displacement (compute positions in the shader) or a compute shader approach (like ep046). The galaxy exercise avoids this by not updating positions per frame.
When to use what:
- Static or slowly rotating clouds: Points with baked positions. Millions of particles, no CPU cost per frame.
- Animated particles with simple physics (gravity, drift): Points with CPU-side updates. Cap at ~50,000-100,000 for 60fps in JavaScript.
- Particles that need real 3D shape and rotation: InstancedMesh. Cap at ~50,000 for 60fps.
- Massive animated particle fields: move physics to the vertex shader or use transform feedback / compute shaders. That's a topic for a future episode.
What comes next
We took particles from 2D to 3D. Points for massive counts, InstancedMesh for shaped particles, ShaderMaterial for per-particle control, emitter patterns for birth-and-death lifecycles, noise fields for organic motion. The galaxy exercise showed that half a million particles is totally viable for real-time creative work.
We've been building geometry by hand (ep063) and shading it ourselves (ep064). Next episode we'll take that further with procedural mesh generation -- creating complex 3D forms entirely from code. Not grids and terrain this time, but organic shapes, mathematical surfaces, and generative structures where the geometry itself is the artwork.
What it comes down to...
- Three.js Points renders each vertex in a BufferGeometry as a screen-facing sprite. Fill a Float32Array with positions, wrap it in PointsMaterial or ShaderMaterial, and you've got a particle system. Scales to hundreds of thousands (even millions) of particles in a single draw call
- Per-vertex colors via the color attribute + vertexColors: true let you color particles by any property -- distance from center, height, noise value, data mapping. Same approach as terrain vertex colors from ep063
- Particle textures (PointsMaterial.map) replace the default square with circles, stars, or custom shapes. Combine with transparent: true, depthWrite: false, and AdditiveBlending for the classic glowing particle cloud aesthetic where overlapping particles accumulate brightness
- Animating particles: update the position array each frame and set needsUpdate = true. Add a centering force (spring toward origin) or noise-driven drift (sample 3D noise at particle position, use as velocity). Same attractor and noise-field concepts from ep011, ep018, and ep046 but now in 3D
- gl_PointSize in a vertex shader controls per-particle size. gl_PointCoord in a fragment shader gives UV coordinates within each point sprite -- use it to draw circles, rings, or custom shapes. These are ShaderMaterial features not available through PointsMaterial alone
- Emitter pattern: pre-allocate a pool, track age and lifespan per particle, recycle dead particles by resetting their position to the emission point. Color shifts from hot (young) to cool (old) based on the age/lifespan ratio. No allocation, no garbage collection -- just array overwrites
- InstancedMesh renders thousands of copies of a geometry with per-instance transforms. Real 3D shapes (tetrahedra, cones, cubes) instead of flat sprites. More expensive than Points but gives actual depth, rotation, and silhouette. Good for 5,000-50,000 shaped particles
- Galaxy exercise: half a million stars with spiral arm structure from parametric equations, per-vertex color and size via ShaderMaterial, additive blending for dense core glow. Runs at 60fps because static Points geometry is just vertex data -- the GPU handles it effortlessly
Cheers! Thanks for reading.