Streaming architecture
The DebConf video team streaming setup gives users access to HLS (HTTP Live Streaming) and DASH (Dynamic Adaptive Streaming over HTTP) streams, both of which are easy to embed for web-based consumption. The streams themselves are pushed from the rooms to the backends over RTMP.
Our streaming infrastructure is split into two layers: streaming backends receive an H.264-encoded stream from each room via RTMP and generate HLS (HTTP Live Streaming) streams served over HTTPS, as well as, optionally, DASH streams using other codecs (such as AV1). Streaming frontends are currently simple caching HTTPS reverse proxies, allowing load distribution and geographic redirections that reduce the latency of video segment downloads for clients.
All TLS certificates (for HTTPS) are obtained either from Let’s Encrypt or by generating self-signed certificates. To accommodate geographic redirections and other DNS manipulations when using Let’s Encrypt, all of the challenges are centralized on the streaming backend host, and all frontends redirect challenge requests there.
Streaming backend
Streaming backends are nginx instances, using the nginx RTMP module to listen to RTMP feeds pushed by the rooms. For DebConf, only one backend is used.
The backend is configured with two RTMP applications (stream and show), as well as an HTTPS virtual host.
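A minimal skeleton of how such a backend could be laid out in nginx; this is a sketch, with names and paths that are illustrative rather than the team’s actual configuration:

```nginx
rtmp {
    server {
        listen 1935;

        # Receives the feed pushed by each room (detailed below).
        application stream {
            live on;
        }

        # Receives the per-quality variants and produces the HLS output.
        application show {
            live on;
        }
    }
}

http {
    server {
        listen 443 ssl;
        # Serves the HLS playlists and segments, the DASH files, and the
        # centralized ACME challenge data (see the following sections).
    }
}
```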
The stream RTMP application
The videoteam-stream script in each room pushes the feeds to the stream RTMP application, at the URL rtmp://<backend>:1935/stream/<room>. This application has three purposes (sketched in the configuration excerpt after this list):
It dumps the incoming stream to disk, as a last-ditch backup.
It runs ffmpeg to downscale the incoming stream to lower-bandwidth variants. This ffmpeg instance pushes its synchronized downscaled streams to the show RTMP application at rtmp://<backend>:1935/show/<room>_<quality>. The original stream is also pushed unchanged to rtmp://<backend>:1935/show/<room>_src.
It runs ffmpeg a second time, generating multiple streams and muxing them together with the -f dash output muxer, which writes a DASH manifest and chunks into a local directory.
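A sketch of what the stream application could look like: record* and exec_push are real nginx RTMP module directives, but the paths, variant names, and ffmpeg flags below are illustrative assumptions.

```nginx
application stream {
    live on;

    # 1. Dump the incoming stream to disk as a last-ditch backup.
    record all;
    record_path /srv/video/backup;   # hypothetical path
    record_unique on;

    # 2. Downscale into synchronized lower-bandwidth variants and push
    #    them, plus the unchanged source, to the show application.
    exec_push ffmpeg -i rtmp://localhost/stream/$name
        -c:v libx264 -vf scale=-2:480 -c:a aac -b:a 128k
            -f flv rtmp://localhost/show/${name}_480p
        -c:v copy -c:a copy
            -f flv rtmp://localhost/show/${name}_src;

    # 3. A second ffmpeg muxes DASH output into a local directory
    #    (codec choice illustrative; the directory must exist).
    exec_push ffmpeg -i rtmp://localhost/stream/$name
        -map 0:v -map 0:a -c:v libsvtav1 -c:a libopus
        -f dash /srv/video/dash/$name/stream.mpd;
}
```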
The show RTMP application
The show RTMP application generates the client-oriented adaptive-bandwidth HLS playlists [1], which are ultimately served through the backend’s HTTPS virtual host:
https://<backend>/live/<room>.m3u8
    The main adaptive HLS playlist, referencing all of the following playlists with the bandwidth settings taken from the configuration.
https://<backend>/live/<room>_src.m3u8
    The HLS playlist for the “source quality” stream.
https://<backend>/live/<room>_<quality>.m3u8
    The HLS playlists for the downscaled streams.
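With the nginx RTMP module, this could look roughly like the following; hls on and hls_variant are real directives, while the paths, fragment lengths, and bandwidth figures are assumptions:

```nginx
application show {
    live on;

    # Write HLS playlists and segments where the HTTPS virtual host
    # serves /live/ from (path illustrative).
    hls on;
    hls_path /srv/video/live;
    hls_fragment 4s;
    hls_playlist_length 60s;

    # The suffix of the pushed stream name selects the variant: nginx
    # then writes <room>.m3u8 as the master playlist referencing
    # <room>_src.m3u8 and <room>_480p.m3u8 (bandwidths illustrative).
    hls_variant _480p BANDWIDTH=1000000;
    hls_variant _src  BANDWIDTH=4000000;
}
```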
The streaming backend HTTPS virtual host
The streaming backend virtual host mostly serves the live HLS data under the /live/ directory. It also serves the centralized .well-known directory used by the frontends to answer the Let’s Encrypt challenges. Finally, it serves the DASH manifests and chunk files under the /dash relative URL, where the manifest for each room can be found at /dash/<room>/stream.mpd.
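Put together, the virtual host could be sketched as follows; the server name, paths, and the CORS header are assumptions:

```nginx
server {
    listen 443 ssl;
    server_name <backend>;               # placeholder

    # Live HLS playlists and segments written by the show application.
    location /live/ {
        root /srv/video;                 # serves /srv/video/live/...
        add_header Access-Control-Allow-Origin *;
    }

    # DASH manifests and chunks written by the second ffmpeg instance,
    # e.g. /dash/<room>/stream.mpd.
    location /dash/ {
        root /srv/video;
    }

    # Centralized ACME challenge data; the frontends redirect here.
    location /.well-known/acme-challenge/ {
        root /srv/acme;                  # path illustrative
    }
}
```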
Streaming frontend
Streaming frontends are nginx instances as well, with a single HTTPS virtual host. As of now, these nginx instances only perform caching of the HLS playlists and segments for use by clients, using the nginx proxy module for the /live/ directory.
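A caching frontend of this kind could be sketched as follows; the cache sizing and validity times are assumptions, and <backend> is the placeholder used throughout this page:

```nginx
proxy_cache_path /var/cache/nginx/live keys_zone=live:10m max_size=1g;

server {
    listen 443 ssl;

    location /live/ {
        proxy_pass https://<backend>;    # placeholder
        proxy_cache live;
        # Playlists change every few seconds while segments are
        # immutable; a short validity keeps clients near the live edge
        # without hammering the backend (values illustrative).
        proxy_cache_valid 200 2s;
        proxy_cache_lock on;             # collapse concurrent misses
    }
}
```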
Geographic redirections
To reduce the latency of client connections to the streams, our streaming frontends are geographically distributed.
Frontends are expected to have a virtual host that answers for all possible domain names under the “live” subdomain, allowing on-the-fly DNS rearrangements (see the sketch after this list):
af (Africa)
an (Antarctica)
as (Asia)
eu (Europe)
na (North America)
oc (Oceania)
sa (South America)
local (Clients local to the conference venue)
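For instance, a frontend virtual host could declare all of these names at once; the sketch below uses live.debconf.org as the live domain, following the examples further down this page:

```nginx
# One vhost answering for every geographic name, so DNS records can be
# repointed to any frontend without reconfiguring it.
server_name af.live.debconf.org an.live.debconf.org as.live.debconf.org
            eu.live.debconf.org na.live.debconf.org oc.live.debconf.org
            sa.live.debconf.org local.live.debconf.org;
```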
When using Let’s Encrypt to generate certificates, challenge data in the .well-known directory is centralized on the backend host, and all frontends redirect to it, so that they can generate certificates for all domains regardless of the current DNS configuration.
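On the frontends, this can be as simple as one redirecting location block (a sketch; <backend> is the placeholder used above):

```nginx
# Send all ACME challenge requests to the backend, which holds the
# centralized challenge data.
location /.well-known/acme-challenge/ {
    return 302 https://<backend>$request_uri;
}
```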
To support geographically-aware redirection of clients, all streaming frontends expose two specific (sets of) HTTPS endpoints:
https://<current-frontend>/local-server
    This single endpoint returns a text/plain response containing the GeoIP continent code, or local for clients within the registered local networks. This allows, for example, the JavaScript (JS) player embedded into the DebConf page to generate a geographically-aware URI (generating that URI is a video team modification of the JS player). This workaround is needed because the player does not support the redirection described below.
https://<current-frontend>/redir/<uri>
    Redirects clients (that is, players, e.g. vlc) to https://<code>.<live-domain>/<uri>, i.e. the proper geographic frontend detected for their IP address. For example, https://local.live.debconf.org/redir/<room_name>.m3u8 could redirect to https://eu.live.debconf.org/live/<room_name>.m3u8. To bypass the redirection, use the latter URL directly.
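A sketch of how these two endpoints could be implemented. This assumes the ngx_http_geoip2 module with a MaxMind database; the database path, venue network range, and fallback code are all assumptions:

```nginx
# Resolve the continent code for the client address.
geoip2 /var/lib/GeoIP/GeoLite2-Country.mmdb {
    $geoip2_continent_code continent code;
}

# Lowercase the code so it matches the frontend subdomains.
map $geoip2_continent_code $geo_code {
    default eu;          # fallback, illustrative
    AF af;
    AN an;
    AS as;
    EU eu;
    NA na;
    OC oc;
    SA sa;
}

# Clients on the registered venue networks are treated as "local"
# (network range hypothetical).
geo $venue {
    default     "";
    10.0.0.0/8  local;
}

# Prefer the venue override when it is set.
map $venue $frontend_code {
    ""      $geo_code;
    default $venue;
}

# Inside the frontend server block:
location = /local-server {
    default_type text/plain;
    return 200 $frontend_code;
}

# Per the example above, the redirect target lives under /live/.
location ~ ^/redir/(?<target>.+)$ {
    return 302 https://$frontend_code.live.debconf.org/live/$target;
}
```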