Having covered the open source graphics software community for several years, I was immediately intrigued when I learned about the ColorHug project. ColorHug is a spare-time effort by Red Hat's Richard Hughes to build an open source monitor calibration device. I am still resigned to the fact that most people's eyes will glaze over whenever you utter the phrase "color management," but with ColorHug, at least interested parties have a simple and affordable option — and as with any good open source project, there are hints at more things still to come.

Color me good

For the benefit of the uninitiated, the ColorHug is a tristimulus colorimeter. That means it can measure the amount of RGB light hitting its sensor; by displaying a carefully-chosen set of color swatches on the screen, it can characterize the display's output and generate an ICC profile. You can then load the ICC profile into the OS color management system, and enjoy balanced output. The ColorHug sports a 64-pixel light sensor, and like other colorimeters uses filters to read the red, green, and blue values separately. Unlike many other colorimeters, the ColorHug also converts the readings into a standardized XYZ color value in the firmware, and can correct the output to account for the different illumination characteristics (e.g., varying backlights and primary colors) found in different display types.
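The XYZ conversion is worth a closer look. A tristimulus reading becomes device-independent by multiplying the raw RGB values through a 3x3 calibration matrix. The sketch below shows the shape of that computation; note that it uses the standard sRGB (D65) RGB-to-XYZ matrix purely for illustration, while the ColorHug's firmware applies a device-specific matrix of the same form.

```python
# Sketch of the conversion a tristimulus colorimeter performs: raw RGB
# sensor readings are mapped to device-independent XYZ values through a
# 3x3 calibration matrix. The matrix below is the standard sRGB (D65)
# RGB-to-XYZ matrix, used here for illustration only; the ColorHug's
# firmware uses a device-specific matrix of the same shape.

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rgb_to_xyz(rgb, matrix=SRGB_TO_XYZ):
    """Map linear RGB readings (each 0.0-1.0) to XYZ tristimulus values."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in matrix)

# Pure white (1, 1, 1) maps to roughly (0.9505, 1.0000, 1.0890),
# the D65 white point -- a handy sanity check for any such matrix.
x, y, z = rgb_to_xyz((1.0, 1.0, 1.0))
```

Feeding pure white through the matrix and checking for the expected white point is a quick way to convince yourself the matrix is oriented correctly.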

Unlike essentially every other colorimeter, the ColorHug is supported only under Linux — it ships with a Fedora-based live CD to perform the calibration, although the resulting ICC profile can be used by any operating system. That aspect of the project receives little attention, but it is significant. There are other colorimeters supported by open source software projects like Argyll (which handles underlying calibration tasks), but none with official Linux support. On top of that, the existing Windows and OS X client software is usually restrictive, imposing limits on the number of copies that can be installed, requiring email registration, and so on. Hughes made a point of designing the ColorHug to be fast, accurate hardware at a low price point (currently £60), but the software support is equally significant. The device even gets free firmware updates, which could make for some interesting developments down the road.

How it works

Physically the ColorHug is a small black puck about the size of a matchbox, with a sensor aperture on one side and a mini-USB port on the end. It comes with a mini-to-fullsize USB cable and the aforementioned live CD. Hughes is also the author and maintainer of GNOME Color Manager (g-c-m) and colord, so that software is used by the live CD to detect attached displays and colorimeters, as well as to step the user through the process of creating a display profile. G-c-m should automatically recognize your displays and list them in its main window; to start creating a profile you simply select one and click the "Create Profile" button.

The application will ask you to name your new profile (although it maintains a time-sorted list of the resulting profiles in the window, too, so the name is primarily for your benefit) and select the accuracy at which to analyze the monitor. The approximate measurement times listed — up to 20 minutes for the most thorough — are very conservative when it comes to the ColorHug; I cannot think of any reason to create a low-accuracy profile other than curiosity. I tested the ColorHug against a Pantone Huey (a common proprietary colorimeter), and creating an accurate profile took well under 10 minutes for each device, with a slight edge going to the ColorHug.

Whatever accuracy you choose, when you click the "Start" button the application launches into a sequence of color patches displayed in the center of the screen. You must have the ColorHug centered on the display area and keep it still for the duration of the process. That proved to be the trickiest part of the formula for me; the USB cable is on the thick side, and the ColorHug weighs so little that the cable pulls it out of position if you don't take care to wedge it in place with a nearby object. Even then, profiling a laptop is easier because you can maneuver the display to lie flat. Back when CRTs were the norm, proprietary colorimeters often used suction cups to stick to the front of the monitor. That is clearly a bad idea for LCDs, but some form of clip to hang the device in place against a vertical surface might be a nice addition in the future.

[Calibration example: "before" (left) and "after" (right)]

After a few minutes of silent color-swatch juggling, g-c-m announces that it is finished, and you can instantly apply the freshly baked ICC profile to your display. This is the point where you might notice trouble. Early versions of the software asked you to select the white point of the monitor during the first step, and the wording encouraged you to select the daylight-balanced D65, but that resulted in profiles skewed sharply into the red part of the spectrum. The correct choice was to use the display's "Native" white point, and updated versions of the software and live CD now choose that setting automatically.

The other tricky bit is that it might be hard to find the actual ICC profile files if you are running the live CD. They are located in ~/.local/share/icc/, so you need to copy them to another storage location in order to load them into your normal desktop environment and reap the benefits. Even then, you may find the newly-adjusted profile looks different from what you expect. The truth is that most LCDs are factory-set to a very cool, blue white point because that appears brighter to the human eye. But brighter is not accurate; I immediately noticed that the Fedora live CD's aquatic background image had a lot more color variation when I applied the new ICC profile; without it, the entire ocean scene was deep blue. Your eyes will adjust to the change. More importantly, if you profile multiple displays, your eyes will not have to adjust when you switch between them. You can see "before" and "after" photos above ("before" is on the left). Of course, how much of a difference you see depends on your own display as well, but hopefully it is at least clear that the grays are much bluer in the "before" shot.
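Since the live CD session is ephemeral, a few lines of scripting can rescue the profiles before shutdown. This is only a sketch of the copy step, not a ColorHug tool, and the destination mount point is an assumption; substitute wherever your USB stick actually lives.

```python
import os
import shutil

def copy_profiles(src=None, dest="/run/media/liveuser/USBSTICK"):
    """Copy any ICC profiles from the live session to removable storage.

    The default dest is a hypothetical USB mount point, not a real path
    guaranteed by the live CD.
    """
    src = src or os.path.expanduser("~/.local/share/icc")
    os.makedirs(dest, exist_ok=True)
    copied = []
    for name in sorted(os.listdir(src)):
        # g-c-m writes .icc files; .icm is the same format under another name
        if name.endswith((".icc", ".icm")):
            shutil.copy2(os.path.join(src, name), dest)
            copied.append(name)
    return copied
```

Once copied, the profiles can be imported into colord, Oyranos, or any other color management system on your regular desktop.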

Using the live CD is actually pretty important; Hughes updates the client code and utilities frequently and his updated ISOs are the only officially guaranteed path to making all of the pieces work together. Your distribution might not package the most recent version of Argyll or other dependencies for several months.

Auxiliary bits

Display profiling is the ColorHug's primary purpose, but the project has other interesting infrastructure built in as well. For instance, the device is firmware-upgradeable: the live CD comes with a GUI firmware updater that checks the project web site for a new release and allows you to download and flash it. The first time I tried that, it did not work. Fortunately, the ColorHug is designed with a separate firmware bootloader precisely to prevent a bad flash from bricking the device. The ColorHug site explains how to recover using command line utilities included on the live CD, and I am happy to say that they worked.

You can also use the ColorHug to calculate a CCMX color transformation matrix, if you have access to a separate spectrophotometer. A spectrophotometer is a different class of calibration device from a colorimeter; rather than using a light sensor to measure RGB values, it measures the amount of light output by a display across the full visual spectrum, in tiny increments. They are pretty expensive in practice, but if you can wrestle one away from a friend, you can use it to create a CCMX matrix that can be shared with the rest of the community. These matrices are most valuable for non-standard display types; most LCD screens work fine with the standard sRGB matrix shipped in the ColorHug. But mobile phones, TVs, and other devices do differ, and the project is interested in collecting as many good samples as it can, which will then be included in future firmware updates. You can grab and install new CCMXes with the colorhug-ccmx tool included on the CD.
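To make the CCMX idea concrete, here is a hedged sketch (not Argyll's actual algorithm) of how such a matrix can be derived: measure the same patches with both instruments and solve for the 3x3 matrix that maps the colorimeter's XYZ readings onto the spectrophotometer's. With exactly three independent patches the system has an exact solution; real tools measure many patches and use a least-squares fit.

```python
# Illustrative sketch of what a CCMX represents: a 3x3 matrix M chosen
# so that M * (colorimeter XYZ) matches the reference spectrophotometer
# XYZ. This exact-solution version needs three linearly independent
# patches; Argyll's real tooling fits many patches by least squares.

def mat_inverse_3x3(a):
    """Invert a 3x3 matrix via the adjugate (cofactor) formula."""
    (a00, a01, a02), (a10, a11, a12), (a20, a21, a22) = a
    det = (a00 * (a11 * a22 - a12 * a21)
         - a01 * (a10 * a22 - a12 * a20)
         + a02 * (a10 * a21 - a11 * a20))
    adj = [
        [a11*a22 - a12*a21, a02*a21 - a01*a22, a01*a12 - a02*a11],
        [a12*a20 - a10*a22, a00*a22 - a02*a20, a02*a10 - a00*a12],
        [a10*a21 - a11*a20, a01*a20 - a00*a21, a00*a11 - a01*a10],
    ]
    return [[v / det for v in row] for row in adj]

def mat_mul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def solve_ccmx(colorimeter, spectro):
    """Find M with M @ C = S, where the per-patch XYZ readings form the
    columns of C (colorimeter) and S (spectrophotometer)."""
    c_cols = [[colorimeter[j][i] for j in range(3)] for i in range(3)]
    s_cols = [[spectro[j][i] for j in range(3)] for i in range(3)]
    return mat_mul(s_cols, mat_inverse_3x3(c_cols))
```

Applying the resulting matrix to future colorimeter readings corrects for the display type's particular backlight and primaries, which is exactly the role the installed CCMX files play.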

On the ColorHug mailing list, there was a thread about using other colorimeters to generate CCMX data. The process relies on Argyll, and author Graeme Gill confirmed that the Argyll tools can create a matrix from two colorimeters. I asked Hughes what the relative value of such a colorimeter-to-colorimeter matrix is, and he likened it to trying to adjust your speedometer by driving alongside another car: not totally useless, but definitely not reliable enough to recommend.

Finally, the G-c-m build on the live CD includes tools for examining ICC profiles themselves. In the version of the CD I tested, they qualify as interesting but not rigorous. For example, you can show a 3D rendering of each profile as a solid space, but you cannot overlay two profiles to compare them. Perhaps that will improve in the future; the other major Linux color management system, Oyranos, has a very nice profile inspection tool called ICC Examin, so there is hope.

Speaking of Oyranos, just because colord is used by the live CD, that does not mean that the ColorHug's ICC profiles are usable only with colord on a GNOME system. The profiles are bound to the hardware, whichever software is used to load them. Oyranos developer Kai-Uwe Behrmann is also working on a set of enhancements to ICC file metadata via the OpenICC project which should make interoperability better in the future, but you can simply install the profiles on an Oyranos system and take advantage of them now. You can also submit them to the taxi service, which collects user-generated ICC profiles in an effort to build a more reliable set of real-world options for users who cannot generate their own profiles. You can even use the profiles in Mac OS X and Windows if you multi-boot.

The future

Automatically installing the profiles created by the live CD is a possible enhancement that has come up on the ColorHug mailing list, although for now it is not high-priority. More interesting is the possibility of extending the hardware's usefulness — such as to CRT displays and projectors. Supporting CRTs requires different color-measurement timing, due to that hardware's refresh rate.

Projectors are another beast altogether. There are proprietary colorimeters intended for use with projectors, and technically the ColorHug hardware can be used with them as well, but it is tricky. There is an aperture disk in front of the ColorHug light sensor; removing it allows the sensor to see a larger sample area, which would be useful for projectors, but the details of placing and orienting the device have yet to be worked out. Removing the aperture disk would also make the device useful for profiling the ambient light in a room, with the same caveat. Hughes and Gill have discussed the issues on the mailing list (including whether or not a light diffuser would be required), and it is possible that future versions of the software will support these or other new functions.

Looking even further out, Hughes has floated the idea of building an open source spectrophotometer as a sequel or companion piece to the ColorHug. There was interest in the idea on the mailing list, but most thought such a device would be useful only if it was designed to help profile reflective light — in other words, printers. Not all spectrophotometers do so; the hardware challenge is significantly steeper, because in addition to making precise measurements at specific wavelengths, a reflective light spectrometer also needs to provide a precise illumination source, which is a tricky prospect for any project. Hughes indicated that such a device was a 2013-or-later project, and it would cost at least four or five times as much as the ColorHug.

Of course, the ColorHug proves that there is at least some market for open source color-measurement hardware, which few people would have predicted before the project launched. In fact, according to Hughes' talk at Libre Graphics Meeting 2012, he can barely keep up with demand for the devices. The tale of designing, manufacturing, and iterating the devices is interesting for anyone looking for real-world information on open hardware projects; hopefully slides or a recording of the session will be available soon. As an open hardware device, you can download the gEDA schematics and the firmware for the ColorHug from the same repository that hosts the client software. But for everyone who is merely interested in getting good results from a monitor, the ColorHug is a good buy on its own. It would probably be a good buy even if it was closed source: it is fast, inexpensive, and accurate. But the fact that you can tweak and modify it only adds to the value.


Projects the size of LibreOffice tend to get a little unwieldy; the size of the code is such that even seemingly trivial tasks like removing dead code can take a long time. Considering the sheer size of the project and the fact that its copyright ownership is distributed, it would be natural to doubt the sanity of anybody proposing to simultaneously move 1.5 years' worth of work to a new base and adopt a new license. But that is just what LibreOffice has in mind.

LibreOffice is based on the original OpenOffice.org project; it inherited its current LGPLv3 license from there. The project is clearly happy with copyleft licensing, but it occasionally shows signs of wanting a bit more flexibility than LGPLv3 provides. LibreOffice has, since the beginning, accepted all changes under a dual license, mixing LGPLv3 with version 2 of the Mozilla Public License. But the LibreOffice project does not own the original OpenOffice.org code, so it must distribute its work under LGPLv3, essentially dropping the MPLv2 license from the changes that have been made since the project began.

Something interesting happened a while after LibreOffice launched, though: Oracle donated the code to the Apache Software Foundation (ASF) which, after some work, has released it under the Apache License. That is a permissive license, which is not something LibreOffice is interested in. But the Apache License is compatible with MPLv2; a derived product containing code under both licenses can be distributed as an MPLv2-licensed whole. Thus, LibreOffice has concluded:

With the relicensing of the original OpenOffice.org code-base to Apache License 2.0 by Oracle, we're now able to incrementally rebase our own code on top of that to offer a choice of licensing that does not only include LGPLv3 but also any of GPLv3.0+, LGPLv3.0+, and the AGPLv3.0+ that the MPLv2+ allows. This will also allow us to incorporate any useful improvements that are made available under that license from time to time.

In other words, LibreOffice intends to toss out its original OpenOffice.org base and move its changes over to the newly released, Apache-licensed version; the result will be a LibreOffice that can be distributed under the MPL, which the project sees as being worthwhile:

As we compete with our own code, licensed under an unhelpfully permissive license, the MPL provides some advantages around attracting commercial vendors, distribution in both Apple and Microsoft app-stores, and as our Android and iPhone ports advance in tablets and mobile devices.

Getting there will be an interesting process, though. The code given to the ASF is substantially the same as the code LibreOffice started with, but, as some Apache contributors have taken pains to point out, that code is not available under the Apache License. Contrary to the text quoted above, Oracle did not relicense the code; that code was, instead, given to the ASF, which was given the freedom to release it under a new license. The only code that is actually available under the Apache License is that found in the Apache OpenOffice 3.4 release. Anything that has not been officially released by Apache has not truly been relicensed.

Indeed, it has been claimed that code found in the Apache repositories—and, thus, currently distributed by Apache—is not necessarily available under the Apache License. So LibreOffice will have to start with the official release, which has had a lot of components carved out of it. Much of that code has been replaced with permissively-licensed code, but not all of it. There are also several chunks of code that Apache OpenOffice may relicense and integrate into a future release, but that has not happened yet. LibreOffice developers have requested help in clarifying the license status of that code, but answering those queries has not been a high priority for Apache OpenOffice, especially while the latter was working flat-out to make its own first release.

So the relicensing of LibreOffice is not going to be a quick task. The project must start with, and limit itself to, code that is known to be available under the Apache license. Then it will be necessary to rebase the large number of changes made over the lifetime of the LibreOffice project onto the new release; the project thinks that will be relatively easy, because Apache OpenOffice has, thus far, not made a lot of changes:

It is worth pointing out that only around 6% of the files in the Apache OpenOffice incubator project have any code change beyond changing the license headers. A rather substantial proportion of those changes are highly localised, or are inclusions of code not generated by the Apache project itself.

There remains the task of filling in all of the parts that Apache OpenOffice simply removed from the code base or has not yet gotten around to releasing under the Apache License. How easy that will be is not entirely clear; it may well be that, even after rebasing the project, LibreOffice may not be able to make a fully MPL-licensed release for some time. There are also little details like getting the license headers right, but they seem minor by comparison.

In summary, the LibreOffice project appears to be setting itself up for a fair amount of work. License changes are hard, especially when one does not necessarily have the cooperation of the copyright holders; it is only possible this time around through the surprising combination of the Apache relicensing and the LibreOffice project's prescient-seeming decision to require LGPL/MPL dual licensing on all contributions. The reward from all of this work, the project hopes, will be more flexible licensing that helps it to compete with the project upon which it is rebasing. All told, the interesting interplay between these two projects seems destined to continue for the foreseeable future.


To a large extent, BusyBox has been the "poster child" for GPL enforcement, at least in the US. That may now be changing with the announcement that the Software Freedom Conservancy (SFC) has signed up multiple Linux kernel and Samba developers whose copyrights can be used in license compliance efforts. That expands the scope of license enforcement activities while also removing the need to use the controversial GPLv2 "death penalty" threat to ensure compliance with the kernel's license—SFC should now be able to pursue kernel enforcement directly.

Copyright holders step up

There are three parts to the announcement. First off, the Samba project, which is an SFC member project, has moved its existing compliance efforts over to SFC. Secondly, a new "GPL Compliance Project for Linux Developers" has been created for Linux kernel copyright holders to allow their contributions to be used in enforcing the GPL on that code. So far, seven copyright holders have joined the project. Lastly, several other SFC member projects (Evergreen, Inkscape, Mercurial, Sugar Labs, and Wine) have requested that the organization handle any compliance issues for their code.

The kernel copyright holders that have joined the project are—somewhat curiously—not listed, other than Matthew Garrett, who was the first. In an email exchange with SFC executive director Bradley M. Kuhn, he said that he asked Garrett to "officially speak on behalf of this program", but that the others are welcome to speak up if they wish. Essentially, Kuhn is shielding those developers from having to handle questions if they don't want to, which is part of the mission of SFC: "we take care of boring work for Free Software projects so the developers can focus on developing the code".

Furthermore, Kuhn recognizes that GPL compliance activities are sometimes controversial in the community. So, he is happy to deflect any criticism in his direction:

I'm aware this issue can be divisive for some (I don't really understand why myself, but I see it's there), so I don't want developers to feel they all have to stand up and take the influx of FUD on this issue. That's my job to do on their behalf.

Similarly, Kuhn asked Jeremy Allison to be the contact person for the Samba copyright holders, who were also unnamed. There were nine of those before the announcement, but Kuhn said that another stepped forward after it went out. Part of the idea behind the announcement is to do just that: attract more copyright holders, particularly kernel copyright holders, to the compliance efforts. Kuhn also expects that some of the copyright holders may make themselves known in "comment threads, blogs and the like".

Copyright coverage

One question that arises when only a subset of the copyrights in a body of code are being used is whether those copyrights are sufficient to be used for compliance purposes. For Samba, there is little question in Kuhn's mind that Allison's contributions would be sufficient by themselves: "Jeremy's copyrights *alone* are so extensively spread across the many Samba codebases for multiple decades that it's probably impossible to have a working Samba binary without including very substantial portions of Jeremy's copyright works". But he is happy to have other Samba developers on board "to show their solidarity and so that they could give input into the process of the compliance activity".

On the kernel side, Garrett himself noted that his copyrights might not be useful for enforcement activities: "Most of the past work I've done is in bits of the kernel that are rarely present in infringing devices, and most of my recent work is owned by my employer rather than me." But that was an off-the-cuff blog comment that Kuhn said "painted a substantially bleaker outlook than the reality" once SFC started investigating Garrett's copyrights. Beyond that, Kuhn said, the addition of the other copyright holders has further strengthened SFC's hand:

Conservancy is confident that those developers who signed up for compliance activity on their Linux copyrights hold enough copyright that the code can't, for example, be trivially written out of Linux by someone who wants to avoid working with Conservancy on a compliance matter related to Linux.

Kuhn said that SFC is now going through the laborious process of registering the copyrights of all of those involved. The basic methodology is "going through git logs, finding commits by a particular developer, verifying their copyright claim to that commit, and then preparing the source code in a form that the copyright office can grok", he said. That will take some time, but, once it's done, those registrations will be searchable by those interested at the US copyright office.
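That git-archaeology step can be approximated in a few lines. The sketch below is not Conservancy's actual tooling, just an illustration of the first pass Kuhn describes: collecting one developer's commits from a repository's history so their copyright claims can be verified one by one.

```python
import subprocess

def commits_by_author(repo_path, author_email):
    """Return (sha, subject) pairs for commits authored by the given email.

    A rough first pass at a copyright survey: each returned commit would
    still need its authorship and copyright claim verified by hand.
    """
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--author=" + author_email,
         "--pretty=format:%H\t%s"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(line.split("\t", 1)) for line in out.splitlines() if line]
```

In practice the survey would also need to account for commits whose copyright belongs to an employer rather than the author, which is one reason the verification step cannot be fully automated.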

But seven copyright holders is only a tiny fraction of the thousands of contributors to the Linux kernel over the years. To understand the scope of the copyrights that can be enforced, we would need to know just who has signed up with SFC. Until the copyright registration process is completed—or a lawsuit filed—we won't really have a good feel for that, it seems.

Compliance, not litigation

When asked about any litigation on the horizon, both Kuhn and Allison were quick to point out that SFC only uses lawsuits as a last resort. The policy has always been to file suit only "when a company simply refuses to respond to repeated requests from Conservancy about compliance problems", Kuhn said, while Allison called lawsuits "a 'LISTEN TO ME !!!!' action" for those who are non-responsive.

In fact, Kuhn said, "the average is less than one lawsuit every three years", though SFC has been accused of being "overly litigious" by some. By way of contrast, SFC deals with more than a hundred minor compliance issues (e.g. some kind of compliance question) each year. In addition, situations that "required complex negotiation" number in the twenties each year, he said. There is also a certain amount of educational work that SFC does, including Kuhn speaking publicly about the issue at various conferences "which (hopefully) reaches a wide audience and (again, hopefully) raises the level of understanding" about compliance for companies beyond those SFC interacts with directly.

The announcement essentially puts companies on warning that there are "cops on the beat", Allison said:

They are low-key cops, who would rather give you a stern talking to and get you to fix your driving on the quiet than drag you publicly into court, but it does mean that if you're using the code in these projects you do have to at least *think* about compliance. You can't just ignore it, there are consequences.

Plans

SFC will be consulting with the copyright holders to determine the compliance steps that are to be taken. The organization is not looking to go its own way, but rather to work with the stakeholders to find a path that is acceptable to all. As the announcement put it:

Conservancy's new effort will be run in a collaborative manner with the project developers. All copyright holders involved will have the opportunity to give input and guidance on Conservancy's strategy in dealing with compliance issues. Thus, all Conservancy's compliance matters will have full support of relevant copyright holders.

That attitude may be helpful in bringing in other developers that might be leery of just appointing SFC as their copyright agent. By making it clear that they will be consulted and are able to help define the strategy and tactics, SFC is clearly trying to alleviate fears in the hopes of bringing in more copyright holders.

The addition of kernel and Samba copyrights (and, likely to a lesser degree, those of the other SFC member projects) will clearly expand the number and kinds of license compliance offenders the organization can target. But Kuhn does not see SFC spending any more time on compliance than it has in the past. It is third on the list of SFC activities as reported in its 2010 Form 990 [PDF], and Kuhn does not see that changing as SFC "doesn't want license compliance to be our sole or even primary focus".

That said, there are a number of things that need to be done. Kuhn said that SFC will be giving the non-responsive companies in its queue "another go-around" in hopes that the announcement will spur them to start engaging. That queue, which numbers around 250 violations, will also need to be re-prioritized based on product age and availability, he said.

Compliance is an important issue for our communities. It is not just that some are completely ignoring the—pretty minimal in the grand scheme—requirements that we put on our freely available code, but also that violators are getting an advantage over their compliant competitors. As Allison put it: "The people who do think about compliance are subsidising their competitors who don't. I think that's unfair."

With this announcement, SFC now has more weapons in its arsenal so that it can reduce that problem, while still working within the community it serves. Compliance may be controversial in some circles, but choosing a license that places restrictions on distribution—as the GPL does—doesn't really make sense unless it is enforced. Those who would rather not see SFC (or others) push compliance on companies might be best served by more permissive licenses.
