HTML5 Live Video Streaming via WebSockets

When I built my Instant Webcam App, I was searching for solutions to stream live video from the iPhone's camera to browsers. There were none.

When it comes to (live) streaming video with HTML5, the situation is pretty dire. HTML5 video currently has no formalized support for streaming whatsoever. Safari supports the awkward HTTP Live Streaming, and there's the upcoming Media Source Extensions standard as well as MPEG-DASH. But all of these solutions divide the video into short segments, each of which the browser downloads individually. This introduces a minimum lag of 5 seconds.

So here's a totally different solution that works in any modern browser: Firefox, Chrome, Safari, Mobile Safari, Chrome for Android and even Internet Explorer 10.



The demo above is a recording from our office in Darmstadt, Germany. For a live streaming example, please check the free iOS app instead.

It's quite backwards, uses outdated technology and doesn't support audio at the moment. But it works. Surprisingly well.

The camera video is encoded to MPEG by ffmpeg on a local machine and then sent to a public webserver via HTTP. On the webserver, a tiny nodejs script simply distributes the MPEG stream via WebSockets to all connected browsers. Each browser then decodes the MPEG stream in JavaScript and renders the decoded pictures into a Canvas element.
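The relay at the heart of this setup is only a handful of lines of logic: keep track of connected clients and forward every incoming chunk to all of them. Here is a minimal sketch of that broadcast step — the names and structure are illustrative, not the actual stream-server.js code:

```javascript
// Minimal sketch of the relay's core logic: track connected
// WebSocket clients and forward every incoming MPEG chunk to
// all of them. Illustrative only; see stream-server.js for
// the real implementation.
function createRelay() {
  const clients = new Set();
  return {
    // Register a newly connected WebSocket client.
    addClient(socket) {
      clients.add(socket);
      socket.on('close', () => clients.delete(socket));
    },
    // Called for every chunk of MPEG data received via HTTP.
    broadcast(chunk) {
      for (const socket of clients) {
        socket.send(chunk);
      }
    },
    clientCount() {
      return clients.size;
    }
  };
}
```

Because the stream is just a flat sequence of MPEG data, the server never needs to parse it; it only shovels bytes from the one HTTP ingest connection to the many WebSocket connections.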

You can even use a Raspberry Pi to stream the video. It's a bit on the slow side, but in my tests it had no problem encoding 320x240 video on the fly at 30fps. To my knowledge, this makes it the best video streaming solution for the Raspberry Pi right now.

Here's how to set this up. First, get a current version of ffmpeg. Up-to-date packages are available at deb-multimedia. If you are on Linux, your webcam should be available at /dev/video0 or /dev/video1. On OS X or Windows you may be able to feed ffmpeg through VLC somehow.

Make sure you have nodejs installed on the server through which you want to distribute the stream. Get the stream-server.js script from jsmpeg.

Now install its dependency, the ws WebSocket package, and start the server with a password of your choosing. This password ensures that no one can hijack the video stream:

npm install ws
node stream-server.js yourpassword

You should see the following output when the server is running correctly:

Listening for MPEG Stream on http://127.0.0.1:8082/<secret>/<width>/<height>
Awaiting WebSocket connections on ws://127.0.0.1:8084/
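When a browser connects, the server first sends a small header carrying magic bytes plus the width and height from the ingest URL, so the client knows the video dimensions before the first frame arrives. A sketch of how such a header can be built — the 8-byte layout ('jsmp' magic, then width and height as 16-bit big-endian integers) is assumed from jsmpeg's stream-server.js, so verify it against the version you downloaded:

```javascript
// Build the 8-byte stream header the jsmpeg client expects:
// 4 magic bytes 'jsmp', then width and height as 16-bit
// big-endian unsigned integers. Layout assumed from jsmpeg's
// stream-server.js; check it against your copy.
function buildStreamHeader(width, height) {
  const header = Buffer.alloc(8);
  header.write('jsmp');            // magic bytes at offset 0..3
  header.writeUInt16BE(width, 4);  // width at offset 4..5
  header.writeUInt16BE(height, 6); // height at offset 6..7
  return header;
}
```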

With the nodejs script started on the server, you can now start ffmpeg on the local machine and point it to the domain and port where the nodejs script is running:

ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
    -b 800k -r 30 http://example.com:8082/yourpassword/640/480/

This starts capturing the webcam video at 640x480 and encodes an MPEG video at 30fps with a bitrate of 800kbit/s. The encoded video is then sent to the specified host and port via HTTP. Make sure to provide the correct secret as specified when starting stream-server.js. The width and height parameters in the destination URL also have to be set correctly; otherwise the stream server has no way to figure out the dimensions.
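The server can derive everything it needs — authentication and dimensions — from that URL path alone. A small sketch of the parsing and secret check (the function name and return shape are mine, not from stream-server.js):

```javascript
// Parse an ingest URL path of the form /<secret>/<width>/<height>/
// and verify the secret. Returns the dimensions on success, or
// null if the path is malformed or the secret doesn't match.
// Illustrative only; stream-server.js does its own (similar) parsing.
function parseIngestPath(path, expectedSecret) {
  const parts = path.split('/').filter((p) => p.length > 0);
  if (parts.length < 3 || parts[0] !== expectedSecret) {
    return null;
  }
  const width = parseInt(parts[1], 10);
  const height = parseInt(parts[2], 10);
  if (!Number.isFinite(width) || !Number.isFinite(height)) {
    return null;
  }
  return { width, height };
}
```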

On the Raspberry Pi you will probably have to turn down the resolution to 320x240 to still be able to encode with 30fps.

To view the stream, get stream-example.html and jsmpg.js from the jsmpeg project. Change the WebSocket URL in stream-example.html to that of your server and open the page in your favorite browser.
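The essence of that example page is just a few lines: open a WebSocket to the stream server and hand it to the decoder together with a canvas to draw on. Roughly like this — but this is browser-only code, and the exact constructor name and options depend on the version of jsmpg.js you downloaded, so treat it as a sketch and check stream-example.html itself:

```javascript
// Browser-side sketch: connect to the WebSocket relay and let the
// jsmpg.js decoder render into a canvas. Constructor name and
// options assumed from the jsmpeg project's stream-example.html;
// verify against your copy.
var canvas = document.getElementById('videoCanvas');
var client = new WebSocket('ws://example.com:8084/');
var player = new jsmpeg(client, { canvas: canvas });
```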

If everything works, you should be able to see a smooth camera video with less than 100ms lag. Quite nice for such hackery and a humble MPEG decoder in JS.

Again, for an easier-to-use solution, check out the Instant Webcam App.