Case study

Disney+ Hotstar

Streaming video & live sports

Overview

Hotstar (now Disney+ Hotstar in many regions) is known for record-breaking concurrent streams during Indian Premier League (IPL) and Cricket World Cup matches. Video delivery is CDN-heavy, but live sports add real-time overlays, ad insertion, and DRM on top.

The learning angle: read-heavy, cache-friendly architectures still hit hard limits when millions of viewers tune in at the same moment.

Technical problems at scale

CDN edge caching and origin shield

Popular live events need massive edge capacity, signed URLs, and tiered caching (an origin shield) so origins are not stampeded. Segment-based video (HLS/DASH) changes the caching strategy: manifests refresh every few seconds and need short TTLs, while media segments are immutable and can be cached indefinitely.
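The signed-URL idea can be sketched with a generic HMAC token scheme. This is an illustrative shape, not any particular CDN's token format (CloudFront, Akamai, etc. each define their own); the key and parameter names here are assumptions.

```python
import hashlib
import hmac
import time

SECRET = b"edge-signing-key"  # hypothetical shared key, distributed to edges

def sign_segment_url(path, ttl_s=60, now=None):
    """Append an expiry and HMAC token so edges can validate requests
    locally, without calling back to the origin."""
    now = time.time() if now is None else now
    expires = int(now + ttl_s)
    msg = f"{path}?expires={expires}".encode()
    token = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&token={token}"

def verify(url, now=None):
    """Edge-side check: reject expired or tampered URLs."""
    now = time.time() if now is None else now
    path, _, query = url.partition("?")
    params = dict(kv.split("=") for kv in query.split("&"))
    if int(params["expires"]) < now:
        return False  # link expired
    expected = hmac.new(SECRET, f"{path}?expires={params['expires']}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["token"])
```

Because verification needs only the shared secret, every edge node can reject abusive or expired links without adding origin load.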

DRM and device fragmentation

Widevine, FairPlay, and PlayReady differ by browser and TV. Playback pipelines must negotiate licenses, rotate keys, and degrade gracefully on old devices.
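The platform-to-DRM mapping can be sketched as a simple lookup, plus the common (but assumption-laden) policy of capping resolution when the DRM is not hardware-backed. The platform names and resolution caps here are illustrative, not Hotstar's actual entitlement rules.

```python
# Which DRM system each platform family typically supports.
DRM_BY_PLATFORM = {
    "chrome": "widevine",
    "firefox": "widevine",
    "android": "widevine",
    "safari": "fairplay",
    "ios": "fairplay",
    "edge": "playready",
    "xbox": "playready",
}

def pick_drm(platform):
    drm = DRM_BY_PLATFORM.get(platform.lower())
    if drm is None:
        raise ValueError(f"no known DRM for platform {platform!r}")
    return drm

def max_resolution(drm, security_level):
    # Hardware-backed levels (Widevine L1, PlayReady SL3000) unlock HD;
    # software-only DRM falls back to SD. Caps are illustrative.
    hardware = {("widevine", "L1"), ("playready", "SL3000")}
    return "1080p" if (drm, security_level) in hardware else "480p"
```

Old devices that only report software-level DRM then degrade gracefully to a lower rung of the ladder instead of failing playback outright.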

Live latency vs buffer stability

Lower latency means smaller buffers and greater sensitivity to jitter. Adaptive bitrate (ABR) algorithms trade quality switches against rebuffering risk: a classic control problem.
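The control problem can be sketched with a minimal buffer-based rule (in the spirit of the well-known BBA approach): map buffer occupancy linearly onto the bitrate ladder. The ladder values and thresholds below are illustrative, not any service's real configuration.

```python
LADDER_KBPS = [300, 750, 1500, 3000, 6000]  # lowest to highest rung

def choose_bitrate(buffer_s, reservoir_s=5.0, cushion_s=20.0):
    """Buffer-based ABR: more buffered seconds -> higher bitrate."""
    if buffer_s <= reservoir_s:
        # Danger zone: protect against rebuffering at all costs.
        return LADDER_KBPS[0]
    if buffer_s >= reservoir_s + cushion_s:
        # Plenty of buffer: safe to stream the top rung.
        return LADDER_KBPS[-1]
    # In between: linear map from the cushion onto the ladder.
    frac = (buffer_s - reservoir_s) / cushion_s
    idx = int(frac * (len(LADDER_KBPS) - 1))
    return LADDER_KBPS[idx]
```

Shrinking the buffer for low-latency live play squeezes the cushion, which is exactly why latency and stability pull against each other.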

Personalized rails under load

Homepages and “more like this” rails still call recommendation services. At peak, teams may precompute or simplify rails to protect core playback.

Systems & patterns you will hear about

  • Global CDNs
  • HLS / DASH packaging
  • DRM license servers
  • Kafka for telemetry
  • Transcoding pipelines
  • ABR & QoE metrics

Case-study angles

Estimate requests per second at the edge if 30M users poll playlist manifests every few seconds. Why might you push updates over WebSockets instead?
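A back-of-envelope version of this estimate, assuming a 6-second poll interval (a typical HLS target-duration-driven refresh):

```python
# 30M concurrent viewers, each refreshing a live manifest every 6 s.
viewers = 30_000_000
poll_interval_s = 6
rps = viewers / poll_interval_s
print(f"{rps:,.0f} manifest requests/second at the edge")  # 5,000,000
```

Five million requests per second just for tiny manifest files is the motivation for push-based delivery: a WebSocket (or similar) channel sends one update fan-out per segment instead of millions of redundant polls.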

Design a fallback ladder: if personalized home fails, what static rail do you serve without breaking the app?
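One way to sketch such a ladder: each rung is cheaper and more static than the last, so the home screen always renders something playable. The function names and rail contents here are hypothetical; the failure in `personalized_rails` is simulated for the demo.

```python
import logging

def personalized_rails(user_id):
    # Rung 1: per-user recommendations; may time out under peak load.
    raise TimeoutError("recs overloaded")  # simulated failure

def trending_rails():
    # Rung 2: precomputed per-region rails, served from cache.
    return ["Trending Now", "Live Cricket"]

STATIC_RAILS = ["Editor's Picks"]  # Rung 3: baked in / CDN-cached, never fails

def home_rails(user_id):
    for source in (lambda: personalized_rails(user_id), trending_rails):
        try:
            return source()
        except Exception as exc:
            logging.warning("rail source failed, falling back: %s", exc)
    return STATIC_RAILS
```

The key design choice is that every rung degrades content quality, never availability: a user at peak still gets a browsable home screen and a path into playback.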