Note: this demo and the upcoming tutorial will only work in modern Chrome, and in Firefox with MSE enabled in its config. Internet Explorer is not supported.

Preparing a WebM video file

The first step is to create a correctly clustered WebM video file. The clusters of the WebM file should be aligned so that the first frame of each cluster is an intra-frame, meaning that the video within a cluster can be decoded using only the data contained in that cluster. Every other frame is a delta from the frame before it, in a chain that reaches back to the most recent intra-frame.

The most straightforward way to create a WebM video file from your video source is to use FFmpeg, an open-source, cross-platform video encoding library. For the best browser support you should use the VP8 video codec and the Vorbis audio codec.

You need to compile FFmpeg with these options:

--enable-libvpx --enable-libvorbis

When generating your WebM file you should specify your audio and video codecs, for example:

ffmpeg -i <input-file> -c:v libvpx -c:a libvorbis <output-file.webm>

Unfortunately FFmpeg often doesn't generate WebM files with correctly aligned clusters. If your file is broken (test it at the bottom of this article) you should be able to fix it with acolwell's Media Source Extension Tools.

Build the tools (Go required) as described in the Git repository and run mse_webm_remuxer to fix the clusters, for example:

./mse_webm_remuxer example.webm fixedExample.webm

You can test your generated video file using our Simple Media Source Player example at the bottom of this article.

There's also a correctly clustered WebM video file for testing at:

http://edge-assets.wirewax.com/blog/vidData/example.webm

Building the basic player

Let’s keep it simple

We're going to keep it dead simple in this first part: we'll download an entire video file in one piece, hold it in memory as a data array and then attach it to the video element using the MSE objects. There'll be no chunking, no buffering and no fancy adaptive streaming until parts 2 and 3. This should let you confirm that the WebM rendition you created above has worked out.

The code

Let's start by creating a simple JavaScript object in a jQuery ready event, give it the ability to show the user its state, and check for Media Source compatibility:

$(function () {
    var BasicPlayer = function () {
        var self = this;
        this.initiate = function (sourceFile) {
            if (!window.MediaSource || !MediaSource.isTypeSupported('video/webm; codecs="vp8,vorbis"')) {
                self.setState("Your browser is not supported");
                return;
            }
        }
        this.setState = function (state) {
            $('#state-display').html(state);
        }
    }
    var basicPlayer = new BasicPlayer();
    window.updatePlayer = function () {
        var sourceFile = $('#source-file').val();
        basicPlayer.initiate(sourceFile);
    }
    updatePlayer();
});

The key component of a Media Source Extensions player is the MediaSource object. It needs to be created and attached to a video element's source using URL.createObjectURL. The MediaSource then fires a sourceopen event, at which point a SourceBuffer object can be attached.

We'll now extend our initiate method to create a MediaSource object and associate it with a detached video element. A sourceopen listener is added to the MediaSource, which fires once the video element has been attached to the DOM:

this.initiate = function (sourceFile) {
    if (!window.MediaSource || !MediaSource.isTypeSupported('video/webm; codecs="vp8,vorbis"')) {
        self.setState("Your browser is not supported");
        return;
    }
    self.clearUp();
    self.sourceFile = sourceFile;
    self.setState("Creating media source");
    //create the video element
    self.videoElement = $('<video controls></video>')[0];
    //create the media source
    self.mediaSource = new MediaSource();
    self.mediaSource.addEventListener('sourceopen', function () {
        self.setState("Creating source buffer");
        //when the media source is opened create the source buffer
        self.createSourceBuffer();
    }, false);
    //attach the media source to the video element and append it to the DOM
    self.videoElement.src = window.URL.createObjectURL(self.mediaSource);
    $('#basic-player').append($(self.videoElement));
}

this.clearUp = function () {
    if (self.videoElement) {
        //clear down any resources from the previous video embed if it exists
        $(self.videoElement).remove();
        delete self.mediaSource;
        delete self.sourceBuffer;
    }
}

We also need a clearUp method so the video can be restarted.

The SourceBuffer object now needs to be created and attached to the MediaSource using the MediaSource.addSourceBuffer method, which takes a MIME type string specifying the container format and codecs.

This source buffer then takes the video data in the form of a Typed Array using the SourceBuffer.appendBuffer method. You should check that the source buffer is not in the updating state before appending data. The video data can be obtained using a standard XMLHttpRequest.

this.createSourceBuffer = function () {
    self.sourceBuffer = self.mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"');
    self.sourceBuffer.addEventListener('updateend', function () {
        self.setState("Ready");
    }, false);
    var xhr = new XMLHttpRequest();
    xhr.open('GET', self.sourceFile, true);
    xhr.responseType = 'arraybuffer';
    xhr.onload = function (e) {
        if (xhr.status !== 200) {
            self.setState("Failed to download video data");
            self.clearUp();
        } else {
            var arr = new Uint8Array(xhr.response);
            if (!self.sourceBuffer.updating) {
                self.setState("Appending video data to buffer");
                self.sourceBuffer.appendBuffer(arr);
            } else {
                self.setState("Source Buffer failed to update");
            }
        }
    };
    xhr.onerror = function () {
        self.setState("Failed to download video data");
        self.clearUp();
    };
    xhr.send();
    self.setState("Downloading video data");
}

In the next article we'll use the clusters of the WebM file to break it into chunks using HTTP Range requests and feed the parts into the buffer as we want them buffered.
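As a taste of that approach, here is a minimal sketch (the byteRanges helper is hypothetical, not part of this tutorial's player) showing how the Range header values for such chunked downloads could be computed:

```javascript
//Sketch: compute the 'bytes=start-end' Range header values needed to
//download a file of totalBytes in fixed-size chunks of chunkBytes.
//Range offsets are zero-based and the end offset is inclusive.
function byteRanges(totalBytes, chunkBytes) {
    var ranges = [];
    for (var start = 0; start < totalBytes; start += chunkBytes) {
        var end = Math.min(start + chunkBytes, totalBytes) - 1;
        ranges.push('bytes=' + start + '-' + end);
    }
    return ranges;
}

//each value would then be sent on a standard XMLHttpRequest, e.g.
//xhr.setRequestHeader('Range', byteRanges(total, chunk)[i]);
```

In the real player the chunk boundaries will come from the WebM cluster offsets rather than a fixed size, so each appended chunk starts on an intra-frame.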