As mentioned in my previous blog post, I am working on bringing Phonon, the best multimedia abstraction library from KDE, to QML and Qt Quick.

Now I’d like to go into more detail about what the point of all this is and how it is going to rock our world.

Before Qt Quick we had widgets (imagine them as boxes). These were easy for Phonon to use, because every box on just about every operating system has some way to draw video content onto it. This was therefore nicely implemented in the frameworks underneath Phonon, so only the high-level Qt integration had to be done. With Qt Quick we don’t have proper boxes anymore, not in the way the operating systems like them anyway. Consequently we need to do some work to get from where we are to Qt Quick.

How will this work, you might ask. It is actually very simple. In Qt Quick, rather than having a random box to draw your video frames on, you have declarative items, which in essence are also boxes, but unknown to the operating system. In Qt Quick those boxes are rendered using Qt-internal technology. Right now that is a software rasterization approach or similar magic supported by QPainter; in Qt 5 this will all be done using OpenGL, enabling much more awesome and complex applications, as essentially the whole user interface will be rendered using GL.

Even though Qt 5 will do this vastly differently, the concept of making video playback happen with Qt Quick is the same in both cases: you get your video frame, hand it to your Qt Quick item, and there you draw the frame depending on available capabilities and whatnot.

Ok, clearly there are various degrees of complexity involved on both ends (audio/video sync? drawing speed? …). I will blog about some of the more interesting programming challenges next week.

Anyhow, people who know me a bit will notice that this post is unusually long, and clearly I would not blog that much blah without offering something to try 😉 So, get the Phonon QML branch iteration1 and take it for a test drive if you want. There is an audio and a video demo in demos/qml (just make sure you have audio.wav or video.ogv in the demo folders).

Iteration1 actually uses only technology that is already available (namely stock Phonon QObjects for media and audio, and VideoDataOutput from experimental for video). Future iterations will get rid of the dependency on VideoDataOutput, as the QML branch moves parts of the magic into the Phonon backends to allow for more ways to optimize the entire drawing process. In fact, my local branches already have something that draws without the overhead of VideoDataOutput 😉

If you want to watch development at large you might want to check out the main qml branch.

I’ll blog soon about the architecture (once I have figured it out better ^^), meanwhile you can take a look at this picture (for Phonon GStreamer).