Building a 3D particle head hero effect with WebGL
A 3D point cloud head effect converts a mesh model into tens of thousands of individually animated particles, rendered via THREE.Points with custom GLSL shaders, additive blending, and bloom post-processing. The core pipeline is straightforward: load a head model, sample its surface with MeshSurfaceSampler, build a BufferGeometry with custom per-particle attributes, then render everything through a ShaderMaterial that handles glow, dissolve, and mouse interaction entirely on the GPU. This technique powers hero sections on award-winning sites and can comfortably handle 100K–500K particles at 60fps in modern browsers. Below is a complete implementation guide with working code for every layer of the stack.
Step 1: Convert a 3D head model into a particle system
The recommended approach uses Three.js's MeshSurfaceSampler, which generates uniformly distributed points across any mesh surface regardless of polygon density. Unlike direct vertex extraction (which is limited to the model's vertex count and produces uneven distributions), the sampler interpolates across triangle faces weighted by area.
```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
import { MeshSurfaceSampler } from 'three/addons/math/MeshSurfaceSampler.js';

const loader = new GLTFLoader();
const gltf = await loader.loadAsync('/models/head.glb');

// Find the mesh in the GLTF scene graph
let headMesh = null;
gltf.scene.traverse((child) => {
  if (child.isMesh) headMesh = child;
});

// Build sampler — O(n) build, O(log n) per sample
const sampler = new MeshSurfaceSampler(headMesh).build();

// Sample 60,000 evenly distributed points
const PARTICLE_COUNT = 60000;
const positions = new Float32Array(PARTICLE_COUNT * 3);
const normals = new Float32Array(PARTICLE_COUNT * 3);
const sizes = new Float32Array(PARTICLE_COUNT);
const randoms = new Float32Array(PARTICLE_COUNT);
const colors = new Float32Array(PARTICLE_COUNT * 3);

const tempPos = new THREE.Vector3();
const tempNorm = new THREE.Vector3();
const palette = [
  new THREE.Color('#4fc3f7'), new THREE.Color('#81d4fa'),
  new THREE.Color('#b3e5fc'), new THREE.Color('#e1f5fe'),
  new THREE.Color('#ffffff'),
];

for (let i = 0; i < PARTICLE_COUNT; i++) {
  sampler.sample(tempPos, tempNorm);
  positions[i * 3] = tempPos.x;
  positions[i * 3 + 1] = tempPos.y;
  positions[i * 3 + 2] = tempPos.z;
  normals[i * 3] = tempNorm.x;
  normals[i * 3 + 1] = tempNorm.y;
  normals[i * 3 + 2] = tempNorm.z;

  const color = palette[Math.floor(Math.random() * palette.length)];
  colors[i * 3] = color.r;
  colors[i * 3 + 1] = color.g;
  colors[i * 3 + 2] = color.b;

  sizes[i] = Math.random() * 1.5 + 0.5;
  randoms[i] = Math.random();
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('aNormal', new THREE.BufferAttribute(normals, 3));
geometry.setAttribute('aColor', new THREE.BufferAttribute(colors, 3));
geometry.setAttribute('aSize', new THREE.BufferAttribute(sizes, 1));
geometry.setAttribute('aRandom', new THREE.BufferAttribute(randoms, 1));
```

A 2K–10K polygon head model is the sweet spot — high enough for good surface detail distribution, small enough for fast loading (<500KB GLB). The sampler controls point density independently, so you can sample 100K points from a 5K-poly mesh. For model sources, Sketchfab offers numerous CC-licensed head models (search "human head base mesh"), and Three.js ships with test models like Nefertiti.glb. Before export, prepare the model in Blender: merge all parts into a single watertight mesh, remove interior geometry (eyes, teeth), apply all transforms (Ctrl+A), center the origin, and export as .glb.
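Since all five attributes are uploaded to the GPU once and never touched again, it is worth a quick back-of-envelope check on what the layout above costs in memory (a sketch; the figures follow directly from the attribute sizes in the code):

```javascript
// Bytes per particle for the attribute layout above:
// position (3) + aNormal (3) + aColor (3) + aSize (1) + aRandom (1) = 11 floats
const FLOATS_PER_PARTICLE = 11;
const BYTES_PER_PARTICLE = FLOATS_PER_PARTICLE * Float32Array.BYTES_PER_ELEMENT; // 44

function attributeBudget(count) {
  const bytes = count * BYTES_PER_PARTICLE;
  return { bytes, megabytes: bytes / (1024 * 1024) };
}

attributeBudget(60000).bytes;      // 2,640,000 bytes, ~2.5 MB uploaded once
attributeBudget(500000).megabytes; // ~21 MB at the 500K upper end
```

Even at the 500K ceiling the one-time upload stays modest; the ongoing per-frame cost is just the handful of uniforms.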
Step 2: GLSL shaders for glowing ethereal particles
The visual magic happens in two shaders. The vertex shader handles size attenuation, noise-based displacement, and dissolve masking. The fragment shader creates soft circular particles with radial color gradients.
Complete vertex shader
```glsl
// Simplex 3D Noise — Ian McEwan & Stefan Gustavson (MIT)
vec4 permute(vec4 x) { return mod(((x*34.0)+1.0)*x, 289.0); }
vec4 taylorInvSqrt(vec4 r) { return 1.79284291400159 - 0.85373472095314 * r; }

float snoise(vec3 v) {
  const vec2 C = vec2(1.0/6.0, 1.0/3.0);
  const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
  vec3 i = floor(v + dot(v, C.yyy));
  vec3 x0 = v - i + dot(i, C.xxx);
  vec3 g = step(x0.yzx, x0.xyz);
  vec3 l = 1.0 - g;
  vec3 i1 = min(g.xyz, l.zxy);
  vec3 i2 = max(g.xyz, l.zxy);
  vec3 x1 = x0 - i1 + C.xxx;
  vec3 x2 = x0 - i2 + 2.0*C.xxx;
  vec3 x3 = x0 - 1.0 + 3.0*C.xxx;
  i = mod(i, 289.0);
  vec4 p = permute(permute(permute(
            i.z + vec4(0.0, i1.z, i2.z, 1.0))
          + i.y + vec4(0.0, i1.y, i2.y, 1.0))
          + i.x + vec4(0.0, i1.x, i2.x, 1.0));
  float n_ = 1.0/7.0;
  vec3 ns = n_ * D.wyz - D.xzx;
  vec4 j = p - 49.0*floor(p*ns.z*ns.z);
  vec4 x_ = floor(j * ns.z);
  vec4 y_ = floor(j - 7.0*x_);
  vec4 x = x_*ns.x + ns.yyyy;
  vec4 y = y_*ns.x + ns.yyyy;
  vec4 h = 1.0 - abs(x) - abs(y);
  vec4 b0 = vec4(x.xy, y.xy);
  vec4 b1 = vec4(x.zw, y.zw);
  vec4 s0 = floor(b0)*2.0 + 1.0;
  vec4 s1 = floor(b1)*2.0 + 1.0;
  vec4 sh = -step(h, vec4(0.0));
  vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy;
  vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww;
  vec3 p0 = vec3(a0.xy, h.x);
  vec3 p1 = vec3(a0.zw, h.y);
  vec3 p2 = vec3(a1.xy, h.z);
  vec3 p3 = vec3(a1.zw, h.w);
  vec4 norm = taylorInvSqrt(vec4(dot(p0,p0),dot(p1,p1),dot(p2,p2),dot(p3,p3)));
  p0 *= norm.x; p1 *= norm.y; p2 *= norm.z; p3 *= norm.w;
  vec4 m = max(0.6 - vec4(dot(x0,x0),dot(x1,x1),dot(x2,x2),dot(x3,x3)), 0.0);
  m = m * m;
  return 42.0 * dot(m*m, vec4(dot(p0,x0),dot(p1,x1),dot(p2,x2),dot(p3,x3)));
}

uniform float uTime;
uniform float uPointSize;
uniform float uPixelRatio;
uniform vec3 uMouseWorld;
uniform float uMouseRadius;
uniform float uDissolve; // 0.0 = solid, 1.0 = dissolved

attribute float aSize;
attribute float aRandom;
attribute vec3 aColor;
attribute vec3 aNormal;

varying vec3 vColor;
varying float vAlpha;
varying float vNoise;

void main() {
  vec3 pos = position;

  // Gentle idle floating along surface normal
  pos += aNormal * sin(uTime * 0.5 + aRandom * 6.2831) * 0.005;

  // Noise-based scatter at periphery
  float distFromCenter = length(pos);
  float scatterMask = smoothstep(0.4, 1.0, distFromCenter / 2.0);
  float noiseVal = snoise(pos * 1.5 + uTime * 0.15);
  vNoise = noiseVal;

  vec3 noiseDir = vec3(
    snoise(pos * 1.2 + vec3(0.0, 0.0, uTime * 0.3)),
    snoise(pos * 1.2 + vec3(100.0, 0.0, uTime * 0.3)),
    snoise(pos * 1.2 + vec3(0.0, 100.0, uTime * 0.3))
  );
  pos += noiseDir * 0.8 * scatterMask * uDissolve;

  // Mouse repulsion — GPU-side displacement.
  // Note: pos is in object space while uMouseWorld is in world space; this
  // works while the particle group stays near the origin with modest rotation.
  // For exact behavior, transform uMouseWorld into object space on the CPU
  // before uploading the uniform.
  float mouseDist = distance(pos, uMouseWorld);
  float repulsion = 1.0 - smoothstep(0.0, uMouseRadius, mouseDist);
  vec3 repelDir = normalize(pos - uMouseWorld);
  pos += repelDir * repulsion * 0.5;

  vec4 mvPosition = modelViewMatrix * vec4(pos, 1.0);
  gl_Position = projectionMatrix * mvPosition;

  // Size attenuation with perspective
  gl_PointSize = uPointSize * aSize * uPixelRatio / -mvPosition.z;
  gl_PointSize *= mix(1.0, 0.3, scatterMask * uDissolve);

  // Color: cyan core → teal edges, white hotspots near center
  float colorNoise = snoise(pos * 2.0 + uTime * 0.1) * 0.5 + 0.5;
  vColor = mix(vec3(0.0, 0.95, 1.0), aColor, colorNoise * 0.5);
  vColor = mix(vec3(1.0), vColor, smoothstep(0.0, 0.3, distFromCenter / 2.0));

  // Twinkle + depth fade
  vAlpha = (0.5 + 0.5 * sin(uTime * 1.5 + aRandom * 6.2831))
         * smoothstep(0.0, 5.0, -mvPosition.z)
         * (1.0 - scatterMask * uDissolve * 0.7);
}
```

Complete fragment shader
```glsl
uniform float uDissolve;

varying vec3 vColor;
varying float vAlpha;
varying float vNoise;

void main() {
  // Soft circular particle from gl_PointCoord
  float dist = length(gl_PointCoord - 0.5);
  if (dist > 0.5) discard;

  // Dissolve: discard based on noise threshold
  float dissolveVal = vNoise * 0.5 + 0.5;
  if (dissolveVal < uDissolve * 0.8) discard;

  // Smooth radial falloff
  float softCircle = 1.0 - smoothstep(0.0, 0.5, dist);

  // White-hot center → colored edge per particle
  vec3 finalColor = mix(vec3(1.0), vColor, smoothstep(0.0, 0.4, dist));

  // Edge glow near dissolve boundary
  float edgeGlow = 1.0 - smoothstep(0.0, 0.15, dissolveVal - uDissolve * 0.8);
  finalColor += edgeGlow * vec3(0.3, 0.8, 1.0) * 0.5;

  float alpha = softCircle * vAlpha;
  gl_FragColor = vec4(finalColor * alpha, alpha);
}
```

Key shader techniques at play: gl_PointCoord provides UV coordinates within each point sprite, enabling per-pixel circular soft falloff without actual geometry. Additive blending (THREE.AdditiveBlending) is essential — overlapping particles sum their colors, creating natural bright spots at dense areas of the head. depthWrite: false prevents particles from occluding each other. The simplex noise function (from Ian McEwan and Stefan Gustavson, MIT licensed) drives both the dissolve pattern and the scatter displacement.
Mouse-tracking interaction with smooth interpolation
The interaction system has two layers: macro rotation of the entire head following the cursor, and micro displacement where individual particles near the mouse repel outward.
Smooth mouse follow with frame-rate-independent lerp
```javascript
const mouse = new THREE.Vector2();
const smoothMouse = new THREE.Vector2();
const mouseWorld = new THREE.Vector3();

// Invisible hit plane for raycasting mouse to world space
const hitPlane = new THREE.Mesh(
  new THREE.PlaneGeometry(20, 20),
  new THREE.MeshBasicMaterial({ visible: false })
);
scene.add(hitPlane);
const raycaster = new THREE.Raycaster();

window.addEventListener('mousemove', (e) => {
  mouse.x = (e.clientX / window.innerWidth) * 2 - 1;
  mouse.y = -(e.clientY / window.innerHeight) * 2 + 1;

  raycaster.setFromCamera(mouse, camera);
  const hits = raycaster.intersectObject(hitPlane);
  if (hits.length) mouseWorld.copy(hits[0].point);
});

const group = new THREE.Group();
group.add(particles);
scene.add(group);

const clock = new THREE.Clock();
let prev = 0;

function animate() {
  requestAnimationFrame(animate);
  const elapsed = clock.getElapsedTime();
  const dt = elapsed - prev;
  prev = elapsed;

  // Smooth follow, approximately frame-rate independent (damping × deltaTime)
  smoothMouse.x += (mouse.x - smoothMouse.x) * 5 * dt;
  smoothMouse.y += (mouse.y - smoothMouse.y) * 5 * dt;

  // Macro: rotate entire head group toward mouse
  group.rotation.y = smoothMouse.x * 0.3 + Math.sin(elapsed * 0.3) * 0.1;
  group.rotation.x = smoothMouse.y * 0.2;
  group.position.y = Math.sin(elapsed * 0.5) * 0.05; // idle breathing

  // Micro: pass mouse world position to vertex shader for per-particle repulsion
  material.uniforms.uTime.value = elapsed;
  material.uniforms.uMouseWorld.value.copy(mouseWorld);

  composer.render();
}

animate();
```

The damping factor of 5 multiplied by deltaTime produces near-identical motion across 60Hz and 144Hz displays. Lower values (2–3) create a dreamier, more delayed follow; higher values (8–10) feel snappier. The vertex shader handles per-particle displacement by computing distance(pos, uMouseWorld) and applying a smoothstep-based repulsion force — this runs entirely on the GPU with zero per-particle CPU iteration.
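The `lerp × dt` form is only approximately frame-rate independent; the exact version uses an exponential decay factor. A small self-contained comparison (pure math, no Three.js involved):

```javascript
// With factor f = 1 - e^(-k*dt), the smoothed value depends only on total
// elapsed time, not on how that time is sliced into frames.
function follow(steps, dt, k) {
  let x = 0;
  for (let i = 0; i < steps; i++) {
    x += (1 - x) * (1 - Math.exp(-k * dt)); // target fixed at 1
  }
  return x;
}

const k = 5;                           // same damping factor as above
const at60 = follow(60, 1 / 60, k);    // one simulated second at 60Hz
const at144 = follow(144, 1 / 144, k); // one simulated second at 144Hz
// Both converge to 1 - e^-5 ≈ 0.9933, identical at any frame rate.
```

With the plain `k * dt` factor the same test lands at roughly 0.9946 (60Hz) vs 0.9938 (144Hz): close, but the drift grows with lower damping values or longer eases. Swapping the factor into the animate loop is a one-line change.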
For more sophisticated interaction, the touch texture technique (from Bruno Imbrizi's Codrops tutorial) draws mouse positions onto an offscreen 256×256 canvas with gradual alpha fade, then passes it as a CanvasTexture uniform. The canvas fade provides built-in easing and trail effects without CPU per-particle computation.
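The core of that technique is a buffer that fades every frame and receives a fresh stamp at the cursor. Here the fade/stamp logic is sketched on a raw Float32Array so it runs anywhere; the real implementation draws radial-gradient circles into a 2D canvas context and uploads it via CanvasTexture. `TRAIL_SIZE`, the radius, and the fade constant are illustrative values:

```javascript
const TRAIL_SIZE = 256; // matches the 256×256 canvas from the tutorial
const trail = new Float32Array(TRAIL_SIZE * TRAIL_SIZE);

function updateTrail(mouseU, mouseV, radius = 8, fade = 0.95) {
  // 1) Fade the whole buffer: old stamps decay, forming the trail
  for (let i = 0; i < trail.length; i++) trail[i] *= fade;

  // 2) Stamp a soft disc at the current mouse position (UV in [0,1])
  const cx = Math.floor(mouseU * (TRAIL_SIZE - 1));
  const cy = Math.floor(mouseV * (TRAIL_SIZE - 1));
  for (let y = cy - radius; y <= cy + radius; y++) {
    for (let x = cx - radius; x <= cx + radius; x++) {
      if (x < 0 || y < 0 || x >= TRAIL_SIZE || y >= TRAIL_SIZE) continue;
      const d = Math.hypot(x - cx, y - cy) / radius;
      if (d > 1) continue;
      const idx = y * TRAIL_SIZE + x;
      trail[idx] = Math.min(1, trail[idx] + (1 - d)); // linear falloff to rim
    }
  }
}
```

In the vertex shader each particle samples this texture at its UV and displaces proportionally to the stored intensity; because the fade lives in the buffer itself, the easing and trail come for free.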
Bloom post-processing and the complete render pipeline
The ethereal glow comes from UnrealBloomPass, which applies a multi-pass Gaussian blur to bright pixels and composites the result back. For particle systems, set the threshold low (0.0–0.1) since additive-blended particles are often dim individually.
```javascript
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';
import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

const bloomPass = new UnrealBloomPass(
  new THREE.Vector2(window.innerWidth, window.innerHeight),
  1.2, // strength — intensity of glow
  0.6, // radius — spread of the halo
  0.1  // threshold — low to catch dim additive particles
);
composer.addPass(bloomPass);
composer.addPass(new OutputPass());

// Material setup — ties it all together
const material = new THREE.ShaderMaterial({
  uniforms: {
    uTime: { value: 0 },
    uPointSize: { value: 4.0 },
    uPixelRatio: { value: renderer.getPixelRatio() },
    uMouseWorld: { value: new THREE.Vector3() },
    uMouseRadius: { value: 1.5 },
    uDissolve: { value: 0.0 },
  },
  vertexShader: vertexSource,
  fragmentShader: fragmentSource,
  transparent: true,
  blending: THREE.AdditiveBlending,
  depthWrite: false,
  depthTest: true,
});

const particles = new THREE.Points(geometry, material);
```

For scenes with non-particle elements (text, UI), use selective bloom via a two-composer approach: render bloom-enabled objects (particles) to an offscreen target with other objects blacked out, then composite bloom + base scene in a final pass. This prevents bloom from bleeding into sharp UI elements. The Three.js selective bloom example demonstrates the pattern using THREE.Layers to tag which objects bloom.
| Bloom parameter | Recommended range | Effect |
|---|---|---|
| strength | 0.8–2.0 | Higher = more intense glow halo |
| radius | 0.3–0.8 | Higher = softer, wider spread |
| threshold | 0.0–0.1 | Must be low for additive particles |
Rendering bloom at half resolution can nearly double frame rate: composer.setSize(width/2, height/2). The pmndrs/postprocessing library automatically merges compatible effect passes and is measurably faster than Three.js's default EffectComposer for multi-pass pipelines.
Performance at scale: 50K to 500K particles
The critical principle is moving all animation to the vertex shader. Buffer attributes (position, aRandom, aNormal) are uploaded to the GPU once at initialization. Only lightweight uniforms (uTime, uMouse) update per frame — no needsUpdate = true on geometry, no per-particle CPU loops.
| Approach | Particle count at 60fps | Key bottleneck |
|---|---|---|
| CPU-updated attributes | 10K–50K | CPU→GPU data transfer each frame |
| GPU vertex shader (recommended) | 100K–500K | Fragment shader complexity |
| GPGPU / FBO ping-pong | 500K–1M | Texture read bandwidth |
| WebGPU compute shaders | 1M–10M | Future-proof, not yet universal |
For the 50K–200K range typical of hero sections, THREE.Points with a ShaderMaterial is the optimal choice. InstancedMesh offers per-particle rotation and 3D shapes but introduces per-instance overhead that hurts at high counts. The Three.js official webgl_buffergeometry_points example runs 500K points smoothly using this pattern.
Practical optimization checklist:
- Pre-allocate all `Float32Array` buffers at init; reuse `THREE.Vector3()` temp objects — never allocate in the render loop
- Set `renderer.setPixelRatio(Math.min(devicePixelRatio, 2))` to cap Retina rendering
- On mobile, reduce particle count to 20K–30K and use `mediump` precision in shaders (mobile GPUs process `mediump` at ~2× the speed of `highp`)
- Call `geometry.computeBoundingSphere()` and set `particles.frustumCulled = false` if GPU animation pushes particles beyond the original bounding volume
- Replace shader conditionals with `mix()` and `step()` — GPU branches are expensive
- Pack related uniforms into `vec4` to reduce WebGL uniform calls
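The first three checklist items can be folded into a single startup decision. A sketch, with `qualityProfile` as a hypothetical helper and the checklist's figures as thresholds:

```javascript
// Pick a particle budget, pixel ratio, and shader precision once at startup.
// Thresholds are the checklist's suggestions, not measured numbers.
function qualityProfile({ isMobile, devicePixelRatio }) {
  return {
    particleCount: isMobile ? 25000 : 60000,   // 20K–30K on mobile
    pixelRatio: Math.min(devicePixelRatio, 2), // cap Retina rendering
    precision: isMobile ? 'mediump' : 'highp', // material precision override
  };
}

const profile = qualityProfile({ isMobile: false, devicePixelRatio: 3 });
// profile.pixelRatio is 2; feed it to renderer.setPixelRatio(...)
```

The returned `precision` string can be set on the ShaderMaterial (Three.js materials accept a `precision` override of `"highp"`, `"mediump"`, or `"lowp"`), and `particleCount` feeds the sampling loop from Step 1.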
For particle counts beyond 500K, the GPGPU/FBO approach stores particle positions in a DataTexture and updates them via GPUComputationRenderer fragment shaders each frame. The Codrops "Dreamy Particle Effect" tutorial demonstrates this exact pipeline with MeshSurfaceSampler feeding initial positions into an FBO simulation.
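The data flow of that ping-pong update is easy to see with plain arrays standing in for the two floating-point render targets. In the real pipeline the loop body below is a fragment shader executed by GPUComputationRenderer, and the swap is a render-target exchange rather than an array swap:

```javascript
// Two buffers alternate roles each frame: read from one, write into the other.
const COUNT = 4;
let read = new Float32Array([0, 1, 2, 3]); // stands in for texture A
let write = new Float32Array(COUNT);       // stands in for texture B

function step(dt) {
  for (let i = 0; i < COUNT; i++) {
    write[i] = read[i] + 1 * dt;           // toy "simulation": drift at 1 unit/s
  }
  [read, write] = [write, read];           // ping-pong swap (no copying)
}

step(0.5);
step(0.5);
// read now holds [1, 2, 3, 4]: each value advanced by 1 over one second
```

The point of the pattern is that nothing is ever copied back to the CPU; the render material simply samples whichever target currently holds the latest positions.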
Tutorials, repos, and real-world references
The best resources for implementing this effect, ranked by direct relevance:
- Codrops: "Crafting a Dreamy Particle Effect with Three.js and GPGPU" (Dominik Fojcik, Dec 2024) — The most directly applicable tutorial. Uses `MeshSurfaceSampler` → GPGPU simulation → bloom, with raycaster mouse repulsion. Full GitHub repo at `github.com/DGFX/codrops-dreamy-particles`.
- Codrops: "Surface Sampling in Three.js" (Louis Hoebregts, 2021) — The definitive `MeshSurfaceSampler` tutorial with multiple CodePen demos.
- Codrops: "Interactive Particles with Three.js" (Bruno Imbrizi, 2019) — 57,600 instanced particles with the touch-texture mouse interaction technique. The gold standard for particle interactivity.
- Codrops: "FBO Particles" (Yuri Artiukh/akella, 2021) — Video walkthrough of FBO-based particle systems. Akella's YouTube channel has dozens of related creative coding sessions.
- Codrops: "Implementing a Dissolve Effect with Shaders and Particles" (Feb 2025) — Dissolving meshes into particles using `ShaderMaterial`.
- CodePen: "WebGL Particle Head" by Robert Bue (`codepen.io/robbue/pen/nearyy`) — The classic demo: loads an OBJ head, extracts vertices, renders as mouse-reactive particles.
- Maxime Heckel: "The Magical World of Particles with R3F and Shaders" — Comprehensive React Three Fiber particles guide covering BufferGeometry, FBO technique, curl noise, and 1M+ particles.
- Three.js Forum: "3D Point Cloud for My Head" — A developer captured their head with an iPhone TrueDepth camera, exported an OBJ, and rendered it as a point cloud.
- GitHub: `Kshitij978/Three.js-Point-cloud-morphing-effect` — Particles morphing between 3D shapes (Queen, Pawn, etc.) with GSAP + TypeScript.
- GitHub: `VedantSG123/R3FPointsFX` — R3F component that accepts an array of GLTF meshes and auto-generates particle systems with morphing transitions.
- Nicolas Barradeau's FBO Particles blog post (`barradeau.com/blog/?p=621`) — The foundational explanation of the GPGPU/FBO technique referenced by nearly every other resource.
Custom implementation vs. library alternatives
tsParticles/particles.js cannot create this effect. They are fundamentally 2D Canvas-based, cannot load 3D models, and hit performance walls at ~10K particles. They are designed for decorative 2D particle backgrounds (snow, confetti, connecting dots), not 3D point cloud rendering.
For React Three Fiber projects, three viable shortcut libraries exist. R3FPointsFX (github.com/VedantSG123/R3FPointsFX) is the fastest path — pass GLTF meshes and it auto-generates particles with morphing transitions and a custom shader API. r3f-particle-system (github.com/sampstrong/r3f-particle-system) provides declarative FBO-based simulation with MeshEmitter that samples from model surfaces. three.quarks offers a full VFX toolkit with mesh surface emission, color/size over lifetime, and force fields, plus an @react-three/quarks R3F wrapper.
The tradeoff is clear: libraries get you 80% of the effect in hours rather than days, but cap your customization. A custom ShaderMaterial implementation gives unlimited control over every particle's behavior, color, and animation — essential for matching a specific reference like Betawise. The performance ceiling is identical since R3F is a thin wrapper around Three.js. Choose custom GLSL for production hero sections where the effect is a signature visual element; choose libraries for prototyping or when the particle effect is secondary to other content.
Conclusion
The particle head effect is built on a surprisingly narrow technical stack: MeshSurfaceSampler for point generation, THREE.Points + ShaderMaterial for rendering, simplex noise for organic movement, and UnrealBloomPass for glow. Mouse interaction is best handled by passing a single vec3 uniform to the vertex shader rather than iterating particles on the CPU. The Codrops GPGPU tutorial and its companion GitHub repo provide the closest ready-made implementation to the Betawise-style effect. For a production hero section, budget 2–3 days for a custom implementation with full dissolve, mouse repulsion, and bloom — or a few hours using R3FPointsFX if you're in a React stack and willing to trade some visual uniqueness for speed.