Adaptive Bitrate Streaming using MPEG-DASH

ABR, DASH, and how to use them

Sharad Bhat
4 min read · Mar 4, 2021

What is ABR?

ABR, or Adaptive Bitrate streaming, is a technique for streaming media that efficiently delivers the best quality version the user's available bandwidth can support.

Why ABR?

The main problem with streaming a single video file is buffering. If a user has poor Internet bandwidth, the player has to pause and wait for more of the file to download before playback can continue. This is frustrating for users, and an organization can lose potential customers because of it.

How does ABR fix this?

With ABR, multiple versions of the video are created at varying levels of quality (e.g., 360p, 480p, 720p, 1080p).

Each version is then divided up into chunks to allow for easy switching between the different qualities.
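At its core, the player's job is a selection problem: given the measured bandwidth and the available renditions, pick the highest bitrate that fits before fetching the next chunk. Here is a simplified sketch in JavaScript (the rendition list mirrors the encodings used later in this article; the 0.8 safety factor is an illustrative choice, not from any particular player):

```javascript
// Illustrative bitrate ladder (bits per second), sorted ascending.
const renditions = [
  { height: 360, bitrate: 400_000 },
  { height: 540, bitrate: 800_000 },
  { height: 720, bitrate: 1_500_000 },
];

// Pick the highest-bitrate rendition that fits within the measured
// bandwidth, leaving some headroom (the 0.8 factor is an arbitrary
// safety margin). Falls back to the lowest rendition if nothing fits.
function selectRendition(measuredBandwidth, ladder = renditions) {
  const usable = measuredBandwidth * 0.8;
  const fitting = ladder.filter((r) => r.bitrate <= usable);
  return fitting.length > 0 ? fitting[fitting.length - 1] : ladder[0];
}
```

Real players (including Shaka Player, used below) run a loop like this per chunk, which is what makes mid-stream quality switching possible.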

[Figure: ABR quality-switching graph. By Daseddon — Own work, CC BY-SA 3.0]

What is MPEG-DASH?

MPEG-DASH stands for Dynamic Adaptive Streaming over HTTP and is an implementation of ABR. It is currently the international standard for streaming multimedia content, developed by the Moving Picture Experts Group (MPEG) to solve the complexities of streaming media to many different devices with a single unified standard.

How do I go about working with DASH?

In my repository, QuickDASH, I developed an end-to-end proof of concept for streaming video using DASH.

Step 1 — Prerequisites

Download Node.js, FFmpeg, and Shaka Packager.

Step 2 — Encoding

Starting from a sample video (e.g., sample.mp4), we create multiple versions by encoding it at different resolutions and bitrates.

#!/bin/sh

# 720p
ffmpeg -y -i src/sample.mp4 -c:a aac -ac 2 -ab 256k \
-ar 48000 -c:v libx264 -x264opts \
"keyint=24:min-keyint=24:no-scenecut" \
-b:v 1500k -maxrate 1500k -bufsize 1000k \
-vf "scale=-1:720" src/sample_720.mp4

# 540p
ffmpeg -y -i src/sample.mp4 -c:a aac -ac 2 -ab 128k \
-ar 44100 -c:v libx264 -x264opts \
"keyint=24:min-keyint=24:no-scenecut" \
-b:v 800k -maxrate 800k -bufsize 500k \
-vf "scale=-1:540" src/sample_540.mp4

# 360p
ffmpeg -y -i src/sample.mp4 -c:a aac -ac 2 -ab 64k \
-ar 22050 -c:v libx264 -x264opts \
"keyint=24:min-keyint=24:no-scenecut" \
-b:v 400k -maxrate 400k -bufsize 400k \
-vf "scale=-1:360" src/sample_360.mp4

After this step, you will have four videos in your src/ folder: the original sample video and three generated files encoded at different bitrates.
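Since the three ffmpeg commands differ only in a handful of parameters, the ladder can also be expressed as data and the commands generated programmatically, which avoids copy-paste drift as you add renditions. A sketch in Node (the table values mirror the shell script above; the helper name is my own):

```javascript
// Encoding ladder matching the shell script above.
const ladder = [
  { height: 720, videoBitrate: '1500k', bufSize: '1000k', audioBitrate: '256k', sampleRate: 48000 },
  { height: 540, videoBitrate: '800k',  bufSize: '500k',  audioBitrate: '128k', sampleRate: 44100 },
  { height: 360, videoBitrate: '400k',  bufSize: '400k',  audioBitrate: '64k',  sampleRate: 22050 },
];

// Build one ffmpeg command string per rung of the ladder.
function buildCommands(input, rungs) {
  return rungs.map((r) =>
    `ffmpeg -y -i ${input} -c:a aac -ac 2 -ab ${r.audioBitrate} ` +
    `-ar ${r.sampleRate} -c:v libx264 ` +
    `-x264opts "keyint=24:min-keyint=24:no-scenecut" ` +
    `-b:v ${r.videoBitrate} -maxrate ${r.videoBitrate} -bufsize ${r.bufSize} ` +
    `-vf "scale=-1:${r.height}" ${input.replace('.mp4', `_${r.height}.mp4`)}`
  );
}
```

Note that all three renditions share the same keyframe interval (`keyint=24:min-keyint=24:no-scenecut`); keeping keyframes aligned across renditions is what lets the player switch quality cleanly at segment boundaries.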

Step 3 — Generating the MPD file

The MPD, or Media Presentation Description, file describes the available streams and the bandwidth each one requires.

For this, we will use Shaka Packager by Google.

Run this shell script in the parent directory of the src/ folder.

#!/bin/sh

# Shaka Packager
shaka-packager \
input=src/sample_720.mp4,stream=audio,output=dest/sample_720_audio.mp4 \
input=src/sample_720.mp4,stream=video,output=dest/sample_720_video.mp4 \
input=src/sample_540.mp4,stream=audio,output=dest/sample_540_audio.mp4 \
input=src/sample_540.mp4,stream=video,output=dest/sample_540_video.mp4 \
input=src/sample_360.mp4,stream=audio,output=dest/sample_360_audio.mp4 \
input=src/sample_360.mp4,stream=video,output=dest/sample_360_video.mp4 \
--profile on-demand \
--mpd_output sample-manifest-full.mpd \
--min_buffer_time 3 \
--segment_duration 3

After this step, the dest/ folder will contain six files: one video file and one audio file for each of the three encodings. The parent directory will contain a manifest file called sample-manifest-full.mpd.

The sample-manifest-full.mpd file will look something like this.

[Figure: contents of sample-manifest-full.mpd]
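For orientation, a stripped-down MPD has roughly the shape below. This is an illustrative sketch, not the exact output of Shaka Packager — real manifests carry many more attributes, and the codec strings here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     minBufferTime="PT3S"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
  <Period>
    <!-- One AdaptationSet per content type; one Representation per encoding.
         The bandwidth attribute is what the player uses to choose a stream. -->
    <AdaptationSet contentType="video">
      <Representation id="720" bandwidth="1500000" height="720">
        <BaseURL>dest/sample_720_video.mp4</BaseURL>
      </Representation>
      <Representation id="540" bandwidth="800000" height="540">
        <BaseURL>dest/sample_540_video.mp4</BaseURL>
      </Representation>
      <Representation id="360" bandwidth="400000" height="360">
        <BaseURL>dest/sample_360_video.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
    <AdaptationSet contentType="audio">
      <Representation id="audio" bandwidth="256000">
        <BaseURL>dest/sample_720_audio.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```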

Step 4 — Express.js Server (glossing over Node.js basics)

Create a file called server.js and add the following code.

const express = require('express')
const path = require('path')

const app = express()

// Default to port 80 unless PORT is set in the environment
const port = process.env.PORT || 80

// Serve the packaged media and manifest as static files
app.use(express.static(path.join(__dirname, 'public')))

// Serve the player page
app.get('/', (req, res) => {
  res.sendFile(path.join(__dirname, 'index.html'))
})

app.listen(port, () => {
  console.log('Server started at port ' + port)
})

Step 5 — Client Video Player

We will be using the Shaka Player.

This file is a bit lengthy, so I’m just going to point you to the index.html file in the repository.
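The heart of that file boils down to three things: load Shaka Player, attach it to a video element, and point it at the manifest from Step 3. A minimal sketch of what such a page can look like (the CDN URL and element id are illustrative — see the repository's index.html for the real version):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Shaka Player from a CDN; pin a specific version in practice -->
  <script src="https://cdn.jsdelivr.net/npm/shaka-player@latest/dist/shaka-player.compiled.js"></script>
</head>
<body>
  <video id="video" controls autoplay></video>
  <script>
    async function init() {
      const video = document.getElementById('video');
      const player = new shaka.Player(video);
      try {
        // Load the manifest generated in Step 3. From here, Shaka
        // handles bandwidth estimation and rendition switching.
        await player.load('sample-manifest-full.mpd');
      } catch (e) {
        console.error('Error loading manifest', e);
      }
    }
    document.addEventListener('DOMContentLoaded', init);
  </script>
</body>
</html>
```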

You’re done!

Now, to run your project, just run the `npm run start` command.

You can then head on over to http://localhost:80 to stream your video (if port 80 requires elevated privileges on your machine, set the PORT environment variable to something else, e.g. PORT=8080). Play around with the network throttling option in the developer console to see DASH in action, requesting different bitrate streams as the available bandwidth changes.

Links and References

https://github.com/sharadbhat/QuickDASH

https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming

https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP

https://github.com/google/shaka-packager

https://github.com/google/shaka-player

https://developer.mozilla.org/en-US/docs/Web/Guide/Audio_and_video_delivery/Setting_up_adaptive_streaming_media_sources
