Adaptive Video Streaming and HLS file structure

02 / Jul / 2016 by Shivam Khandelwal

Ever wondered how multiple resolutions of a video are available on YouTube or some other website when you uploaded only a single file? The process is called ‘video transcoding’ or ‘video encoding’. In layman’s terms, video encoding is the process of converting a video from one format to another so that it can be viewed across different devices, for example converting a .mp4 (MPEG-4) video to .avi (Audio Video Interleave) format. Unlike the early days of the internet, when speeds were limited and websites mostly hosted static HTML content, today the majority of websites deliver their content as video. Club this with the mobile revolution, where we use screens of different sizes on tablets, mobiles and PCs, and with irregular mobile internet speeds, and we get :-

Challenge :- The user wants the video to play seamlessly on different screen sizes and at different internet speeds, which for obvious reasons won’t be possible if all you have is an HD video and the user is playing it on a 2G network.

Solution :- Adaptive Streaming of videos

Adaptive Streaming is a technique used to stream multimedia content over HTTP instead of the older RTP protocols, which cannot support large distributed networks. The basic idea behind adaptive streaming is to generate multiple versions of the multimedia file at different bitrates and resolutions and then choose one of them according to the user’s bandwidth, screen size and various other factors. More commonly today, those different bitrate versions are split into multiple chunks that are played according to the user’s bandwidth, and the player can switch from one version to another as conditions change. Common protocols based on this idea are MPEG-DASH, Adobe HTTP Dynamic Streaming, Apple’s HLS, etc.

[Image: adaptive streaming overview]
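
To make the switching idea concrete, here is a minimal Python sketch of the selection rule: for every chunk, pick the highest-bitrate version that fits the bandwidth observed at that moment. The bitrates are taken from the sample playlist later in this post; measure_bandwidth and the version names are made-up stand-ins for however a real player estimates throughput and labels its streams.

[python]
# A hypothetical bitrate ladder: one entry per encoded version of the same video.
# Bitrates come from the sample main.m3u8 later in this post; the rest is illustrative.
VERSIONS = [
    {"name": "240p", "bitrate": 213000},
    {"name": "360p", "bitrate": 445000},
    {"name": "480p", "bitrate": 998000},
    {"name": "720p", "bitrate": 1896000},
]

def pick_version(available_bandwidth_bps):
    """Highest-bitrate version that fits the available bandwidth (lowest one as a fallback)."""
    playable = [v for v in VERSIONS if v["bitrate"] <= available_bandwidth_bps]
    return playable[-1] if playable else VERSIONS[0]

def play(total_chunks, measure_bandwidth):
    """Re-evaluate the bandwidth before every chunk, so the version can change mid-playback."""
    for chunk_no in range(1, total_chunks + 1):
        version = pick_version(measure_bandwidth())
        print("chunk %d: fetching the %s variant" % (chunk_no, version["name"]))

# Fake bandwidth measurements that drop midway through playback and then recover.
bandwidths = iter([2000000, 2000000, 400000, 400000, 1200000])
play(5, lambda: next(bandwidths))
[/python]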

Apple’s HLS (HTTP Live Streaming) :-

HLS is an HTTP-based protocol developed by Apple that works by segmenting video files into multiple chunks in MPEG2-TS format and maintaining an index file that lists all the chunks. Optionally, you can have a master playlist file that lists all the index files corresponding to the different resolutions. A folder containing files in HLS format may look like this :-

[Image: contents of an HLS folder, with main.m3u8, the per-resolution .m3u8 index files and the numbered .ts segments]

Structure and Significance of files :-

1. MPEG2-TS (.ts) chunks :- These are the actual video files that contain the multimedia content. There are multiple chunks for each of the available resolutions; for example, for resolutions 720p, 480p, 360p and 240p, the first chunk of each stream could be named 720_1.ts, 480_1.ts, 360_1.ts and 240_1.ts, all carrying the same content at different bitrates.

2. Stream/Resolution-specific index files :- The files you see (720.m3u8, 480.m3u8, etc.) are index files that list the .ts chunks of a specific stream, so 720.m3u8 will have an index of the chunks that are of resolution 720p, and so on. Let’s look at the format of one such file.

[js]
#EXTM3U
#EXT-X-TARGETDURATION:9
#EXT-X-VERSION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:9,
720_1.ts
#EXTINF:9,
720_2.ts
#EXTINF:8,
720_3.ts
#EXT-X-ENDLIST
[/js]

Understanding terms used in this file :-

  • #EXTM3U :- Specifies that this file is an HLS playlist
  • #EXT-X-TARGETDURATION :- Specifies the maximum duration of any chunk/segment
  • #EXT-X-VERSION :- Version of the HLS format used
  • #EXT-X-PLAYLIST-TYPE :- Defines whether the current playlist is for a VOD or a live stream
  • #EXTINF :- Duration of the segment that follows (used to calculate the total duration of the video; see the sketch after this list)
  • 720_1.ts, 720_2.ts, 720_3.ts :- Chunks of video
  • #EXT-X-ENDLIST :- Marks the end of the playlist (absent while the playlist belongs to a stream that is still live)
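
To see how a player could use these tags, here is a rough Python sketch (not a full M3U8 parser, just enough for the sample playlist above) that pairs each #EXTINF duration with the segment URI that follows it and adds the durations up to get the total length of the video:

[python]
def parse_media_playlist(text):
    """Collect (duration, segment_uri) pairs from a simple VOD media playlist."""
    segments = []
    duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:9," -> 9.0; the duration applies to the segment on the next line
            duration = float(line[len("#EXTINF:"):].rstrip(",").split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((duration, line))
    return segments

# The 720p playlist shown above, pasted in as a string for the sake of a runnable example.
playlist_720 = """#EXTM3U
#EXT-X-TARGETDURATION:9
#EXT-X-VERSION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:9,
720_1.ts
#EXTINF:9,
720_2.ts
#EXTINF:8,
720_3.ts
#EXT-X-ENDLIST"""

segments = parse_media_playlist(playlist_720)
print(segments)                                # [(9.0, '720_1.ts'), (9.0, '720_2.ts'), (8.0, '720_3.ts')]
print(sum(d for d, _ in segments), "seconds")  # 26.0 seconds
[/python]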

3. main.m3u8 :- This is an optional file and is only required when you have multiple resolutions for the same video. It contains details about each individual index file, along with metadata such as the resolution, bandwidth and other details specific to each stream.

Sample

[js]
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-STREAM-INF:BANDWIDTH=1896000,AVERAGE-BANDWIDTH=1649000,RESOLUTION=1280x720,CLOSED-CAPTIONS=NONE,CODECS="avc1.4d001e,mp4a.40.2"
720.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=998000,AVERAGE-BANDWIDTH=868000,RESOLUTION=640x480,CLOSED-CAPTIONS=NONE,CODECS="avc1.4d001e,mp4a.40.5"
480.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=445000,AVERAGE-BANDWIDTH=387000,RESOLUTION=480x360,CLOSED-CAPTIONS=NONE,CODECS="avc1.4d001e,mp4a.40.5"
360.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=213000,AVERAGE-BANDWIDTH=186000,RESOLUTION=360x240,CLOSED-CAPTIONS=NONE,CODECS="avc1.4d001e,mp4a.40.5"
240.m3u8
[/js]

Understanding terms of main.m3u8 :-

  • #EXTM3U :- Specifies that this file is an HLS playlist
  • #EXT-X-VERSION :- Version of the HLS format used
  • #EXT-X-STREAM-INF:BANDWIDTH=213000,AVERAGE-BANDWIDTH=186000,RESOLUTION=360x240,CLOSED-CAPTIONS=NONE,CODECS="avc1.4d001e,mp4a.40.5"
    240.m3u8 :- Defines the metadata about one stream: the peak and average bandwidth required to play it, its resolution, closed-caption availability and codecs, followed by the URI of that stream’s index file (here 240.m3u8)

Workflow :-  The client first requests main.m3u8, which has all the metadata about the available streams, and according to its bandwidth and requirements it then requests a specific .m3u8 file. That file contains the list of .ts segments, each preceded by an #EXTINF tag specifying the duration of the segment that follows; the client can find the total duration of the video by adding up all the #EXTINF values, and it simply plays the segments in the order they appear in the list.
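
Here is a similar hedged sketch of the selection step, operating on the playlist text directly rather than making real HTTP requests: it reads the BANDWIDTH attribute of every #EXT-X-STREAM-INF entry in main.m3u8 and picks the best variant the client’s measured bandwidth can sustain.

[python]
import re

def parse_master_playlist(text):
    """Collect (bandwidth, variant_uri) pairs from a master playlist."""
    variants = []
    bandwidth = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Negative lookbehind so AVERAGE-BANDWIDTH is not matched by mistake.
            bandwidth = int(re.search(r"(?<!-)BANDWIDTH=(\d+)", line).group(1))
        elif line and not line.startswith("#"):
            variants.append((bandwidth, line))
    return variants

def pick_variant(variants, available_bandwidth_bps):
    """Highest-bandwidth variant the client can sustain, else the lowest one available."""
    playable = [v for v in variants if v[0] <= available_bandwidth_bps]
    return max(playable) if playable else min(variants)

# Abridged copy of the main.m3u8 shown above (CLOSED-CAPTIONS left out for brevity).
master = """#EXTM3U
#EXT-X-VERSION:4
#EXT-X-STREAM-INF:BANDWIDTH=1896000,AVERAGE-BANDWIDTH=1649000,RESOLUTION=1280x720,CODECS="avc1.4d001e,mp4a.40.2"
720.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=998000,AVERAGE-BANDWIDTH=868000,RESOLUTION=640x480,CODECS="avc1.4d001e,mp4a.40.5"
480.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=445000,AVERAGE-BANDWIDTH=387000,RESOLUTION=480x360,CODECS="avc1.4d001e,mp4a.40.5"
360.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=213000,AVERAGE-BANDWIDTH=186000,RESOLUTION=360x240,CODECS="avc1.4d001e,mp4a.40.5"
240.m3u8"""

print(pick_variant(parse_master_playlist(master), 1000000))  # (998000, '480.m3u8')
print(pick_variant(parse_master_playlist(master), 150000))   # (213000, '240.m3u8')
[/python]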

Now that you have understood the structure and significance of each part of an HLS-encoded file, you may want to know how to create one. Here is a blog by Anil Agrawal on the same; go on and try it out.

See you next time.

 

 
