How I Load 30+ Videos Without Compromising Performance
Loading a couple videos on a page is easy. Loading 30+ is where browsers start reminding you they’re not a streaming engine.
My first attempt was “simple”: render a grid of <video> tags, lazy-load them, and host the files on a CDN. The CDN helped with bandwidth, but the page still froze once enough videos tried to play.
The bottleneck wasn’t downloading — it was decode + GPU upload + scheduling 30 animations at 60fps.
So I switched from “load everything” to a resource-managed approach.
The real issue: decoding, not networking
Even if videos load instantly from a CDN, the browser still has to:
- decode frames (CPU)
- upload textures (GPU)
- composite continuously updating elements
- keep memory under control
If you let 30 videos play at once, you’ll get stutters, dropped frames, and scroll jank.
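You can actually measure this pressure: `HTMLVideoElement.getVideoPlaybackQuality()` reports how many frames the browser decoded versus dropped. A tiny helper (my own sketch, `dropRatio` is a hypothetical name, not part of any library) turns those counters into a drop rate you can log while tuning:

```typescript
// Hypothetical helper: fraction of decoded frames the browser failed to present.
// In the browser you'd feed it from the real API:
//   const q = videoEl.getVideoPlaybackQuality();
//   dropRatio(q.droppedVideoFrames, q.totalVideoFrames);
export function dropRatio(dropped: number, total: number): number {
  return total === 0 ? 0 : dropped / total;
}

// Example: 45 dropped out of 900 total frames => 5% drop rate.
console.log(dropRatio(45, 900)); // 0.05
```

Anything consistently above a few percent while scrolling is a sign too many videos are decoding at once.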
The fix is straightforward:
Limit concurrent playback. Only the videos that matter most should decode.
1) A tiny play manager: cap concurrent videos
Instead of calling video.play() everywhere, I route all playback through a small global manager that enforces a hard limit (e.g. 3 videos at once). When a new video becomes important, it pauses the lowest-priority one.
Here’s the core idea (not the whole file, just the important blocks):
```ts
type Entry = { id: string; el: HTMLVideoElement; priority: number };

const playing = new Map<string, Entry>();

export async function requestPlay(
  id: string,
  el: HTMLVideoElement,
  priority: number,
  limit = 3,
) {
  if (!el) return;

  // If we're at the cap, pause the lowest-priority video.
  if (!playing.has(id) && playing.size >= limit) {
    const loser = [...playing.values()].sort(
      (a, b) => a.priority - b.priority,
    )[0];
    loser.el.pause();
    playing.delete(loser.id);
  }

  playing.set(id, { id, el, priority });
  try {
    await el.play();
  } catch {
    // Autoplay can fail; free the slot so a video that isn't actually
    // playing doesn't hold it, and let user interaction retry.
    playing.delete(id);
  }
}

export function release(id: string) {
  playing.delete(id);
}
```

This single change eliminates the “30 decoders spinning at once” problem.
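The cap-and-evict behavior is easy to sanity-check outside the browser by swapping the video element for a plain pause callback. This is a minimal DOM-free sketch of the same policy (`Slot` and `SlotManager` are hypothetical names, not the production file):

```typescript
// DOM-free version of the cap-and-evict policy, for testing the logic.
type Slot = { id: string; priority: number; pause: () => void };

class SlotManager {
  private playing = new Map<string, Slot>();
  constructor(private limit = 3) {}

  request(slot: Slot) {
    if (!this.playing.has(slot.id) && this.playing.size >= this.limit) {
      // At the cap: evict the lowest-priority slot.
      const loser = [...this.playing.values()].sort(
        (a, b) => a.priority - b.priority,
      )[0];
      loser.pause();
      this.playing.delete(loser.id);
    }
    this.playing.set(slot.id, slot);
  }

  release(id: string) {
    this.playing.delete(id);
  }

  get active(): string[] {
    return [...this.playing.keys()];
  }
}

// Four requests against a cap of 3: the lowest-priority slot ("a") is paused.
const paused: string[] = [];
const mgr = new SlotManager(3);
for (const [id, priority] of [["a", 1], ["b", 2], ["c", 3]] as const) {
  mgr.request({ id, priority, pause: () => paused.push(id) });
}
mgr.request({ id: "d", priority: 5, pause: () => paused.push("d") });

console.log(mgr.active); // ["b", "c", "d"]
console.log(paused);     // ["a"]
```

Same invariant as the real manager: no matter how many requests come in, at most `limit` slots are ever active.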
2) Split “attach source” from “play”
A common mistake is: intersection => attach source + play immediately.
That creates a “play storm” when you scroll and many elements intersect around the same time.
Instead, I treat this as two phases:
- Near viewport: attach `<source>` (start fetching)
- Mostly visible: request playback through the manager
3) The LazyVideo component: staged loading + capped playback
Below are the key blocks from my LazyVideo.tsx.
States: source attachment, visibility, loaded fade
```ts
const [shouldAttachSource, setShouldAttachSource] = useState(false);
const [shouldPlay, setShouldPlay] = useState(false);
const [isLoaded, setIsLoaded] = useState(false);
```

IntersectionObserver: “near” vs “visible”
```ts
useEffect(() => {
  const el = ref.current;
  if (!el) return;

  const io = new IntersectionObserver(
    ([entry]) => {
      const ratio = entry.intersectionRatio;

      // Near viewport => start network
      if (entry.isIntersecting) setShouldAttachSource(true);

      // Mostly visible => eligible to play
      setShouldPlay(ratio >= 0.6);

      // Out of view => pause + release slot
      if (!entry.isIntersecting) {
        el.pause();
        release(id);
      }
    },
    { rootMargin: "300px", threshold: [0, 0.1, 0.6] },
  );

  io.observe(el);
  return () => io.disconnect();
}, [id]);
```

Playback routing through the manager
```ts
useEffect(() => {
  const el = ref.current;
  if (!el) return;

  if (shouldPlay) {
    // Priority can be smarter (size, center-ness, etc). Start simple.
    requestPlay(id, el, 1, 3);
  } else {
    el.pause();
    release(id);
  }
}, [shouldPlay, id]);
```

Render: base64 blur placeholder + controlled sources
```tsx
return (
  <div
    className={cn(
      "relative overflow-hidden aspect-video rounded-3xl border",
      className,
    )}
  >
    {/* Instant placeholder (perceived performance) */}
    <img
      src={blurDataURL}
      alt=""
      aria-hidden
      className={cn(
        "absolute inset-0 w-full h-full object-cover scale-110 blur-xl transition-opacity duration-500",
        isLoaded ? "opacity-0" : "opacity-100",
      )}
    />
    <video
      ref={ref}
      loop
      muted
      playsInline
      preload="none"
      onLoadedData={() => setIsLoaded(true)}
      className="w-full h-full object-cover"
    >
      {shouldAttachSource && (
        <>
          <source
            src={`${CDN}${src.replace(/\.mp4$/, ".webm")}`}
            type="video/webm"
          />
          <source src={`${CDN}${src}`} type="video/mp4" />
        </>
      )}
    </video>
  </div>
);
```

Why this works
- CDN reduces latency + bandwidth pressure.
- preload="none" prevents the browser from eagerly doing work for 30 videos.
- Two-phase IO logic prevents “attach+play everything” spikes.
- Play manager cap keeps decode and GPU work bounded.
- Blur base64 previews make it feel instant even when the video isn’t ready.
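One refinement left as a comment in the playback effect is a smarter priority than the constant 1. My assumption of one reasonable scheme (a hypothetical `centerPriority` helper, not part of the original component) is to score each video by how close it sits to the viewport's vertical center, using numbers from `getBoundingClientRect()` and `window.innerHeight`:

```typescript
// Hypothetical priority score: 1 at the viewport's vertical center,
// falling toward 0 at the edges. In the browser the inputs come from
// el.getBoundingClientRect().top / .height and window.innerHeight.
export function centerPriority(
  rectTop: number,
  rectHeight: number,
  viewportHeight: number,
): number {
  const elCenter = rectTop + rectHeight / 2;
  const vpCenter = viewportHeight / 2;
  const dist = Math.abs(elCenter - vpCenter);
  // Clamp so far-offscreen elements never go negative.
  return Math.max(0, 1 - dist / vpCenter);
}

// A video centered in a 1000px-tall viewport scores 1:
console.log(centerPriority(400, 200, 1000)); // 1
// One near the top edge scores lower:
console.log(centerPriority(150, 200, 1000)); // 0.5
```

Feeding that score into `requestPlay` means the manager evicts edge videos first, which matches what the user is actually looking at.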
The takeaway
The trick isn’t “make videos smaller” (though that helps). The trick is: bound the work.
Once you stop 30 videos from decoding at once, performance becomes predictable — and scaling past 30 stops being scary.
Thanks to Soren for the base64 blur inspiration/writeup.