The Technology Behind the Memorial: Three.js and WebGL Visualization
Building a memorial that honors 63,872 individual lives required innovative technology that could handle massive datasets while maintaining the dignity and respect each soul deserves. Here's how we built it.
The Technical Challenge
Creating an interactive memorial for over 63,000 individuals presents unique technical challenges. Each person needs to be represented individually, with their own data, position, and interactive capabilities. Traditional DOM-based rendering cannot handle that many interactive elements at once, which is why we turned to WebGL and Three.js.
The memorial needed to be:
- Performant: Smooth 60fps rendering with 63,000+ particles
- Interactive: Individual hover and click detection for each particle
- Accessible: Works across devices and browsers
- Respectful: Dignified presentation without exploitation
- Scalable: Ability to add more data as it becomes available
Core Technologies
Technology Stack
Frontend
- Three.js: 3D rendering and scene management
- React Three Fiber: React integration for Three.js
- Next.js 15: Full-stack React framework
- TypeScript: Type safety and developer experience
- Tailwind CSS: Utility-first styling
- Zustand: State management
Graphics & Audio
- WebGL: Hardware-accelerated graphics
- GLSL Shaders: Custom particle effects
- Web Audio API: Immersive audio experience
- Canvas API: 2D overlays and UI elements
Particle System Architecture
The heart of the memorial is a sophisticated particle system where each particle represents one person. This required careful optimization to maintain performance while preserving individual identity.
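The initialization loop below relies on two helpers, getPersonColor and getPersonSize, that are not shown. A minimal sketch of what they might look like, assuming each person record carries an age field; the palette and sizing rules here are purely illustrative, not the memorial's actual scheme:

```javascript
// Hypothetical helpers for the particle setup; the age-based palette and
// size multipliers are illustrative assumptions, not the real visual design.
function getPersonColor(person) {
  // Render children in a warmer tone, adults in a cooler one
  return person.age != null && person.age < 18
    ? { r: 1.0, g: 0.85, b: 0.6 }
    : { r: 0.6, g: 0.8, b: 1.0 }
}

function getPersonSize(person) {
  // Slightly enlarge children's particles so they stay visible at distance
  const base = 1.0
  return person.age != null && person.age < 18 ? base * 1.3 : base
}
```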
// Particle system initialization
const particleCount = data.length // 63,872+ particles
const positions = new Float32Array(particleCount * 3)
const colors = new Float32Array(particleCount * 3)
const sizes = new Float32Array(particleCount)

// Each particle gets individual properties
data.forEach((person, index) => {
  const i3 = index * 3

  // Position in 3D space
  positions[i3] = person.x
  positions[i3 + 1] = person.y
  positions[i3 + 2] = person.z

  // Individual coloring based on person data
  const color = getPersonColor(person)
  colors[i3] = color.r
  colors[i3 + 1] = color.g
  colors[i3 + 2] = color.b

  // Size based on age or other factors
  sizes[index] = getPersonSize(person)
})

Performance Optimizations
Rendering 63,000+ interactive particles at 60fps requires aggressive optimization:
GPU-Accelerated Rendering
All particle calculations happen on the GPU using custom GLSL shaders. This moves computation from the CPU to the graphics card, enabling smooth performance even with massive datasets.
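As an illustration, a vertex/fragment pair for distance-scaled, individually colored points might look like the following. The attribute names (size, aColor) and the scale constant are assumptions for the sketch, not the memorial's actual shaders:

```javascript
// Illustrative GLSL for a point-particle system. Attribute names and the
// 300.0 distance-scale factor are assumptions, not the production shaders.
const vertexShader = /* glsl */ `
  attribute float size;
  attribute vec3 aColor;
  varying vec3 vColor;
  void main() {
    vColor = aColor;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    // Scale point size with distance so nearby particles render larger
    gl_PointSize = size * (300.0 / -mvPosition.z);
    gl_Position = projectionMatrix * mvPosition;
  }
`

const fragmentShader = /* glsl */ `
  varying vec3 vColor;
  void main() {
    // Soft circular falloff instead of a hard square point
    float d = length(gl_PointCoord - vec2(0.5));
    if (d > 0.5) discard;
    gl_FragColor = vec4(vColor, 1.0 - d * 2.0);
  }
`
```

These would be handed to a `THREE.ShaderMaterial({ vertexShader, fragmentShader, transparent: true })`, with `size` and `aColor` supplied as buffer attributes on the geometry.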
Instanced Geometry
Instead of creating 63,000 individual objects, we use instanced geometry to render all particles in a single draw call, dramatically reducing draw-call and CPU overhead.
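In practice this means packing every particle's per-instance data into a handful of shared typed arrays that the GPU consumes in one call, rather than one object per person. A minimal sketch, with illustrative names:

```javascript
// Sketch: pack per-particle data into single typed arrays so the whole
// system can be uploaded as instanced buffer attributes and drawn once.
// Field names (x, y, z, size) are illustrative.
function buildInstanceBuffers(people) {
  const n = people.length
  const offsets = new Float32Array(n * 3) // per-instance position (xyz)
  const scales = new Float32Array(n)      // per-instance size

  people.forEach((p, i) => {
    offsets.set([p.x, p.y, p.z], i * 3)
    scales[i] = p.size
  })
  return { offsets, scales }
}
```

Each array then becomes one attribute on an `InstancedBufferGeometry` (or a `BufferGeometry` rendered as `THREE.Points`), so the renderer issues a single draw call for the entire memorial.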
Level of Detail (LOD)
Particles far from the camera use simplified rendering, while nearby particles get full detail. This ensures performance while maintaining visual quality where it matters most.
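The LOD policy can be as simple as bucketing particles by camera distance. A sketch of such a policy, with thresholds and tier names chosen purely for illustration:

```javascript
// Illustrative distance-based LOD policy; thresholds are assumptions.
function lodTier(distanceToCamera) {
  if (distanceToCamera < 50) return 'full'    // glow, soft edges, full sprite
  if (distanceToCamera < 200) return 'medium' // flat color, reduced size
  return 'far'                                // minimal single-pixel point
}
```

The tier would then select which shader path or point size a particle uses, recomputed as the camera moves.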
Interactive Features
Making 63,000+ particles individually interactive required innovative solutions:
// Raycasting for particle selection
const raycaster = new THREE.Raycaster()
// Hit radius (in world units) for point particles; tune to particle size
raycaster.params.Points.threshold = 1
const mouse = new THREE.Vector2()

function onMouseMove(event) {
  // Convert mouse coordinates to normalized device coordinates
  mouse.x = (event.clientX / window.innerWidth) * 2 - 1
  mouse.y = -(event.clientY / window.innerHeight) * 2 + 1

  // Cast ray from camera through mouse position
  raycaster.setFromCamera(mouse, camera)

  // Check intersection with particle system
  const intersects = raycaster.intersectObject(particleSystem)
  if (intersects.length > 0) {
    const particleIndex = intersects[0].index
    const person = data[particleIndex]
    showPersonInfo(person)
  }
}

Accessibility Considerations
Building an accessible memorial was crucial to ensure everyone can pay their respects:
- Keyboard Navigation: Full keyboard support for all interactions
- Screen Reader Support: ARIA labels and semantic HTML structure
- Performance Scaling: Automatic quality adjustment based on device capabilities
- Alternative Interfaces: Text-based fallbacks for users who can't access 3D content
- Multilingual Support: Names and interface in both Arabic and English
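The performance-scaling point above can be as simple as mapping coarse device signals (such as `navigator.deviceMemory`, `navigator.hardwareConcurrency`, and `devicePixelRatio`) to a particle budget. A sketch, with thresholds chosen purely for illustration:

```javascript
// Illustrative quality-scaling policy. The signals mirror real browser
// properties (navigator.deviceMemory, navigator.hardwareConcurrency,
// devicePixelRatio), but the thresholds and budgets are assumptions.
function particleBudget({ deviceMemoryGB = 4, cores = 4, dpr = 1 } = {}) {
  let budget = 64000 // full dataset on capable hardware
  if (deviceMemoryGB < 4 || cores < 4) budget = 32000
  if (dpr > 2) budget = Math.min(budget, 48000) // high-DPI costs fill rate
  return budget
}
```

Devices under the full budget would render a representative subset while keeping every individual reachable through search and the text-based fallback.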
Data Pipeline
The memorial connects to live data sources to ensure accuracy and completeness:
Data Flow
1. Source APIs: Tech for Palestine, Gaza Ministry of Health
2. Data Validation: Cross-reference and verify information
3. Processing: Clean, normalize, and structure data
4. Visualization: Transform into 3D coordinates and properties
5. Real-time Updates: Continuous synchronization with sources
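The processing step can be pictured as a small pure function applied to each incoming record. The field names below (en_name, ar_name) are assumptions for illustration, not the actual source schema:

```javascript
// Illustrative per-record normalization; input field names are assumed,
// not taken from the real upstream APIs.
function normalizeRecord(raw) {
  return {
    name: (raw.en_name || raw.name || '').trim(), // prefer English name field
    nameArabic: (raw.ar_name || '').trim(),
    age: Number.isFinite(raw.age) ? raw.age : null, // null when unknown
  }
}
```

Keeping this step pure makes it easy to re-run the whole pipeline whenever the upstream sources publish corrections.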
Future Enhancements
The memorial continues to evolve with new features and improvements:
- VR Support: Immersive virtual reality experience
- AI Narration: Personalized storytelling for each individual
- Family Connections: Visual links between related individuals
- Geographic Mapping: Location-based clustering and exploration
- Community Features: User-contributed memories and stories
Open Source
The Gaza Souls Memorial is built with open source technologies and we believe in transparency. The codebase demonstrates how technology can be used to create meaningful, respectful memorials that honor human dignity.