Welcome to the first LWN.net Weekly Edition for 2017. We are entering a new year characterized by more than the usual amount of uncertainty at many levels. One thing that is sure, though, is that it will be an active and interesting year for the Linux and free-software communities. Many years ago, we started an ill-advised tradition of trying to make some predictions for the coming year. Without further ado, we'll continue this tradition into 2017.

The discussion on eliminating Debian maintainers faded out with no clear conclusions. But, over the years, it has become increasingly clear that the single-maintainer model used in much of the free-software community creates a great many single points of failure, some of which are guaranteed to fail over any reasonable period of time. Thus, we have seen interest in group maintainership models grow, and that process will continue into 2017. We will certainly not see the end of single-maintainer fiefdoms in 2017, but it seems likely that there will be fewer of them.

Another important single point of failure is Android. It has brought a lot of freedom to the mobile device world, but it is still a company-controlled project that is not entirely free and, by some measures at least, is becoming less free over time. A shift of emphasis at Google could easily push Android more in the proprietary direction. Meanwhile, the end of CyanogenMod has, temporarily, brought about the loss of our most successful community-oriented Android derivative.

The good news is that the efforts to bring vendor kernels closer to the mainline will bear some fruit this year, making it easier to run systems that, if not fully free, are more free than before. Lineage OS, rising from the ashes of CyanogenMod, should help to ensure the availability of alternative Android builds. But it seems likely that efforts to provide free software at the higher levels of the stack (microG, for example) will languish.

The security mess will only get worse, as it becomes clear that vast numbers of severely insecure systems have been deployed, many in important infrastructural roles. Those who have been paying attention have understood the scope of this problem for some time; the rest of the world will catch up quickly. By the end of the year, expect to see attempts at legislative solutions in various countries, many of which will be less than helpful at addressing the real problems.

With luck, the free-software community will have a somewhat better security story by the end of the year, as understanding of the gravity of the problem sinks in. Fuzz-testing and hardening efforts will increase, and some notable progress will be made. But it is certain to be far short of what is needed.

The disintermediation of traditional Linux distributors will continue, with more software being distributed via containerization mechanisms, language-specific repositories, and more. But it will also become clear that this process has a cost; distributors perform a useful service by providing an integrated system, and some users will find that they miss that service. Episodes like the Node.js left-pad fiasco will highlight the risks of performing one's own software integration from multiple repositories under varied management.

We should see another ruling in the appealed VMware GPL-infringement lawsuit. Whether that ruling will bring any joy or any clarity in the area of derived works and the GPL is rather less certain.

The problems of protocol ossification and widely deployed ancient kernels will push more big content providers toward protocols like QUIC, which runs TCP-like transport over UDP. Such protocols allow moving a lot of networking logic into user space. There are some advantages to such schemes, including the ability to deploy protocol enhancements more quickly and to push network intelligence back to the endpoints. But they will also facilitate the creation of private low-level protocols whose implementations need not be free software. We could be seeing the beginning of the end of the era in which widely used protocols had free reference implementations.

Much of the kernel's core memory-management and filesystem code was designed around a fundamental truth of computing: the latency associated with I/O operations is orders of magnitude larger than memory-access latency. As solid-state memory devices replace traditional storage, that "truth" is increasingly untrue. Fundamental changes will be required for the kernel to perform well on near-future hardware. Those changes will not (for the most part) be made this year, but our understanding of what needs to be done will be much improved by the end of the year.

Containers will be big. Really big. You'll be fed up with hearing about containers by the end of the year. You heard it here first.

2016 was a year that saw many triumphs by divisive forces that would set people against one another. Those forces will certainly not rest in 2017, but they will also be more strongly confronted by those with a vision of a more integrated and cooperative world. The free-software community has, over the last few decades, shown that it is possible to bring together people from all over the planet to work toward shared, long-term goals. We have our own inclusiveness problems to solve and, hopefully, will make progress on those this year. But, if we work at it, we can also serve as an example of how a vast number of people and companies, while working toward their own goals, can build something that serves everybody and makes the world a better place.

As always, the usual disclaimers apply to everything written here. Attributing any level of trustworthiness to your editor's predictions is hazardous at best, if not downright foolhardy. The safest prediction of all is that much of the above will be proved wrong.

With 2017, LWN begins its 20th year of publication, something that we certainly did not predict in 1997 when the idea of starting a Linux-oriented newsletter was being discussed. Let us start this year by wishing the best to all of our readers — newcomers along with those who have been with us the whole time. While we thought, 20 years ago, that Linux was bound for success, few of us could have predicted the world we find ourselves in now. It is probably safe to say that Linux and free software will continue to go from strength to strength, and that 2017 will be another successful year. We look forward to being there with you.


This article is an introduction to the world of free and open-source applications for symbolic mathematics. These are programs that assist the researcher or student through their ability to manipulate mathematical expressions, rather than just make numerical calculations. I'll give an overview of two large computer algebra packages available for Linux, and a briefer sampling of some of the more specialized tools aimed at particular branches of mathematics.

This category of software is traditionally called a "computer algebra system", but that description can be misleading. These systems can find analytic solutions to algebraic and differential equations; solve integrals; sum infinite series; and generally carry out nearly any kind of mathematical manipulation that can be imagined. At the least, symbolic mathematics software can replace the bulky handbooks of mathematical information that have been lugged by generations of graduate students.

Over decades, mathematicians have honed these programs, encoding within them the accumulated mathematical knowledge of centuries: information about special functions, for example, that's so difficult (for some of us) to remember. They have learned to reduce such things as algebraic simplification and calculating derivatives to patterns of symbol manipulation ripe for automation. The earliest of these systems, developed in the 1960s, were based on Lisp, the obvious choice at the time, but development of later systems used a variety of languages.

Fortunately, most of the best of this software is free and open source, which allows us to look under the hood and examine or alter the algorithms employed.

Maxima

The ancestor of all symbolic mathematics systems is Macsyma [PDF]. It began as an academic project at MIT in the 1960s, but it was eventually licensed to Symbolics (a company founded by former MIT people), which sold it throughout the 1980s. A fork of an earlier version became Maxima, which remains the predominant free-software symbolic mathematics solution. Maxima is actively developed, and has attained a high degree of sophistication and completeness; source is licensed under the GNU GPLv2.

On Ubuntu, at least, the packaged version is quite close to the latest development version; if your distribution lags, you can build it from the sources in the Git repository on SourceForge. If you intend to do any real work with the program you should also install the "maxima-share" package (after installing Maxima itself). This will install a large number of extra mathematical libraries that are used transparently by Maxima and make it far more capable.

The traditional way to use Maxima is in a terminal. Just type "maxima", and you will be presented with an interactive command prompt. You can enter mathematical expressions in the customary computerized dialect (x^2 for x squared, etc.), followed by a semicolon, and Maxima will respond with a simplified or "solved" version of the expression. If Maxima can't do anything with the expression, it usually just repeats it: tell it "foo" and it will respond with "foo". Readline support allows you to recall previous inputs and edit them, which is particularly convenient for the type of exploration that Maxima seems to inspire, at least for this user.

But if you tell it:

integrate(%e^(-x^2), x);

Maxima will respond with something that looks like:

sqrt(%pi) erf(x)
----------------
       2

This example is here to illustrate three things: First, special numbers such as π and e are represented, in input and output, as %pi and %e. If you say, for example, %e, Maxima will just say %e back; to see the numerical value of the constant, say float(%e).
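A short session with these constants might look roughly like this (output abbreviated; Maxima centers and pads its actual output):

```
(%i1) %pi;
(%o1) %pi
(%i2) float(%pi);
(%o2) 3.141592653589793
```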

Second, mathematical expressions in the output are, by default, rendered in an ASCII approximation of mathematical notation, making their structure easier to grasp than the computerese that we are obliged to use for the input. You can enter tex(%) to get the result in a form ready for pasting directly into a TeX or LaTeX document. (The symbol % refers to the immediately preceding output; each input and output is numbered, and can be referred to directly, as %o6 for output number six, %i6 for input number six, etc.)

Third, there is a wealth of mathematical knowledge baked into Maxima. Notice the erf in the numerator of the answer: this refers to the error function, well known to statisticians. Maxima will provide the results of integrals and the solutions of differential equations in terms of the special functions that it knows about, when appropriate.

If you'd like to know what that error function looks like, you just need to say:

plot2d(%, [x, -2, 2]);

That will pop up a plot like the one at the right.

Remember, the % notation refers to the last result returned. The three terms in square brackets are the variable on the horizontal axis and its range. Maxima uses gnuplot for its plotting. If you're familiar with gnuplot, you can add options to the plot command to tweak the plot's appearance, or set global options that will apply to every plot in the session. If you have a high-resolution screen you may want to apply these global options to make the plots easier to see:

set_plot_option( [gnuplot_preamble, "set termoption font 'courier,24'; set termoption lw 4"]);

In fact, these options were used for the example plots in this article.

Maxima also provides an interface to gnuplot's 3D plotting commands. Here's a simple example of a surface plot:

plot3d(sin(x)*cos(y), [x, 0, 2*%pi], [y, 0, 2*%pi]);

By setting the gnuplot preamble, either globally or per plot, you can access contour, parametric, or any of gnuplot's other plotting modes. The plots use the x11 gnuplot terminal. When you interact with them, you are interacting directly with the gnuplot subsystem, so you can use the mouse to zoom 2D and 3D plots and rotate 3D surfaces.

Using Maxima at the terminal is convenient because it starts instantly, is responsive, and there is nothing extra to install. However, the console interface has some disadvantages: the ASCII output is not easy on the eyes, especially after a long session; there is only one plot window, which gets reused for each plot; and there is no convenient record of your session.

There are a handful of alternative interfaces that solve one or more of these problems. None of them are ideal, nor as capable as the solution that we'll see in the next section; but I'll briefly describe them here, to provide an idea of what's available, and in case any one of them hits a sweet spot for the reader.

Two slightly more graphical interfaces to Maxima are Xmaxima and wxMaxima. The first of these is based on Tk and the second on wxWidgets. They both allow plots to be embedded with the input and output, and allow for various options to save the session, which is useful for creating notebooks or documents from Maxima explorations. Neither one seems to provide true typeset output, although wxMaxima can use the jsMath fonts to make the results somewhat more attractive. wxMaxima can serve as a gentler introduction to Maxima for those who prefer to get started without frequent trips to the manual, as it bristles with menus and dialogs that expose some, but far from all, of the program's options.

As we saw above, Maxima knows how to create TeX versions of its output. Therefore it should be possible to simply run TeX behind the scenes and display math that looks like real math. There are at least two interfaces that follow this strategy. The WYSIWYG editing platform TeXmacs can interface with Maxima and display typeset output, but this is probably mainly of interest to those who are already using TeXmacs. Perhaps of more general interest, especially to Emacs users, is the imaxima mode of that editor, where you can embed plots and fully typeset output directly into the editing buffer. A readline-like functionality is simulated by typing M-p instead of up-arrow.

Installation of imaxima can be an adventure. Several files need to be placed in the correct locations, and variables set in the .emacs configuration file. On Ubuntu this can all be done automatically by installing the maxima-emacs package. The downside is that this package will pull in large chunks of TeX Live. Since the typical Maxima user is likely to have already installed a more up-to-date version of TeX Live than most Linux distributions' package managers provide, this creates some redundancy, but it does save time. Alternatives for those who already have TeX installed are to download the Maxima source, which includes the most critical required files, or to find recent versions of the imaxima mode files on the web. Either way, smooth operation is likely to require some customization in your .emacs file; some imaxima settings are exposed through the Emacs user options interface. See the imaxima link above to get started.

The next figure shows part of a Maxima session in Emacs using the imaxima interface. The regular plot commands can be used as in the terminal, and separate gnuplot windows will pop up. To get embedded graphs, use the wxplot2d() and wxplot3d() analogues. With the use of comments, for which Maxima uses C-style syntax, the Emacs buffer can become a notebook in the Jupyter style (in fact there is also a Maxima kernel for Jupyter). The notebook can be converted into HTML or LaTeX with the Emacs commands imaxima-to-html and imaxima-latex, respectively, making it a convenient way to generate lecture notes or parts of papers. The HTML export is serviceable out of the box, but the current state of LaTeX export fails to create correct graphics-insertion commands, requiring some extra ad hoc post-processing steps.

The Emacs session depicted also illustrates a few additional Maxima features, such as the use of infinity, sums, and defining functions.
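As a taste of those features, a terminal session exercising function definition and an infinite sum might look roughly like this (output abbreviated; the simpsum flag asks Maxima to simplify the sum):

```
(%i1) f(n) := 1/n^2;
(%o1) f(n) := 1/n^2
(%i2) sum(f(n), n, 1, inf), simpsum;
(%o2) %pi^2/6
```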

In the space available, I can't even scratch the surface of all the math that Maxima can do. It has grown by accretion over the decades, and continues to grow, as mathematicians take advantage of its extensible nature to teach it the secrets of their discipline's sundry specialties. Maxima is especially congenial to the Lisp programmer, who can drop into its Lisp subsystem at the interactive prompt and make contact with the internal representation of its mathematical expressions.

Sage

In 2006 a single developer, Ondrej Certik, started a project called SymPy, which is a symbolic mathematics program, like Maxima, written entirely in Python. It grew quickly, and is now a mature project with scores of contributors. SymPy can be used from a specially wrapped IPython, providing an experience similar to Maxima in the terminal, or as a library. It would deserve its own section in this article were it not easier to discuss it in the context of Sage.
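As a small illustration of using SymPy as a library, here is a sketch that reproduces the Gaussian integral shown in the Maxima section above (the symbol names are my own choices):

```python
from sympy import symbols, exp, integrate, sqrt, pi, erf

x = symbols('x')

# The antiderivative of e^(-x^2), expressed via the error function,
# mirroring Maxima's sqrt(%pi)*erf(x)/2 result.
antiderivative = integrate(exp(-x**2), x)
print(antiderivative)  # sqrt(pi)*erf(x)/2
```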

Sage is an enormous system for symbolic and numerical mathematics. It is unique in presenting a unified interface to 90 distinct components. Behind the scenes, Sage will automatically use the appropriate component to perform the calculation or manipulation that the user desires; or, when there is more than one way to do something, Sage can be directed to use a particular library or algorithm.

Because of this, Sage can handle optimization problems, astronomical calculations, elliptic curves, number theory, interval arithmetic, networks, cryptography, statistics, Rubik's cubes, ray tracing, and much else that I don't understand. It can display 2D and interactive 3D graphics, including visualizations of molecular structures. It includes embedded versions of Python (with NumPy and many other Python libraries), R, SciPy, SymPy, Maxima, and many other subsystems. Sage even claims to be able to outsource computations to the Wolfram Alpha computational engine, but in my testing this did not work.

Naturally, Sage is a big download. The usual way to install it is to download a 1GB archive from headquarters and expand that to a 3GB directory tree. Start it by typing ./sage in a terminal from the SageMath directory. This approach is convenient, but may result in some redundant installations on your system. On Ubuntu, at least, there is also a PPA that will allow installation through its package manager.

Sage is based on Python rather than Lisp. It has two interfaces, each of which presents a version of Python with superpowers. After typing the sage command you will be faced with a wrapped IPython prompt, as when using SymPy. You can interact with Python normally, but you will find a few alterations in syntax; for example, ^ is used for exponentiation rather than **. You can perform symbolic math manipulations (and numerical calculations) from the prompt just as with Maxima, but with a somewhat different syntax (no semicolons required). There is no need to import any libraries, as everything has been set up within the special IPython environment. You can plot your results, but rather than gnuplot, Sage uses matplotlib for 2D graphics and the Jmol Java molecular viewer to put 3D graphs in a window where you can interact with them.
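A brief session at the Sage prompt might look something like this (a sketch; exact output formatting may vary between versions):

```
sage: 2^10          # ^ is exponentiation here, not XOR
1024
sage: integrate(e^(-x^2), x)
1/2*sqrt(pi)*erf(x)
sage: plot(erf(x), (x, -2, 2))
```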

Where Sage really shines is in its notebook interface. If you type notebook() at the IPython prompt, Sage starts a web server and connects it to a new tab in your browser. Here you can talk to Sage just as in the IPython interface, but the responses will be typeset by jsMath, which uses an embedded JavaScript version of the TeX mathematical typesetting algorithms. Thanks to AJAX and other JavaScript magic, interaction using the notebook is smooth; there are keyboard shortcuts for evaluation and for manipulating the "cells" into which the notebook is organized. Graphics are embedded into the page, and you can interact with 3D plots directly, zooming and spinning with the mouse. With a single click, the notebook can be converted into a neat HTML version suitable for printing or for use as a web page. You can even share the live notebook online.

The figure shows an extract from a Sage notebook in Firefox, while playing with the included graph theory packages. It gives some idea of the appearance, but cannot convey the fun of using Sage. This project has succeeded in gathering a carefully curated set of components and presenting them with a unified interface that is powerful and enjoyable to use.

Specialized packages

Maxima and Sage, powerful and wide-ranging as they are, are the general practitioners of the symbolic math world. Sometimes one needs to consult a specialist. Here I survey a few of the more narrowly focused mathematics systems, some of which are research projects, such as a system for symbolic computations in general relativity.

A related project is Cadabra, designed to help with the tensor manipulations and other math used in field theory. Cadabra is unusual in that the input language as well as the output are a subset of TeX. It's available as a reasonably up-to-date Ubuntu package.

While on the subject of physics, the FORM project is popular with particle theorists. Interaction is through text files that define the computation to be performed, and which can be processed in a multi-threaded, multi-processor, or sequential style.

Fermat, the personal project of Robert H. Lewis of Fordham University, specializes in polynomial and matrix algebra over the rational numbers and finite fields. It's the best in class for the specialized set of algebraic problems that it's designed for.

The CoCoA (Computations in Commutative Algebra) System specializes in polynomial systems and commutative algebra. It can employ Gröbner basis computations, unlike Fermat. It can also do map coloring and logic problems.

Macaulay2 concentrates on algebraic geometry and commutative algebra. It's been supported by the National Science Foundation since 1992. You communicate with Macaulay2 through its own specialized, interpreted language.

TRIP specializes in perturbation series computations, specially adapted to celestial mechanics. In development since 1988, TRIP does both numerical and symbolic work, and is integrated with gnuplot.

I hope this incomplete roundup of specialized mathematics packages provides a general impression of the range of activity in this field. For anyone with even a nascent interest in mathematics, systems like Sage, in particular, can make exploration of this world a great deal of fun.


In what is becoming its annual tradition, the darktable project released a new stable version of its image-editing system at the end of December. The new 2.2 release incorporates several new photo-correction features of note, including automatic repair of distorted perspectives and the ability to reconstruct highlights that are washed out in some color channels but not all—a type of overexposure that other editors can miss. There is a new image-warping tool that lets users edit image pixels (a first for darktable, which has historically focused on image-wide tasks like color correction). And there is at least one new tool that may prove intriguing even to users who prefer editing images in some other program: a utility for inspecting and editing color-mapping look-up tables.

Source code bundles are available for download through the project's GitHub repository and binary packages are already available for a wide variety of popular Linux distributions. Users of the 2.0 series should note, however, that opening existing darktable edit files with the 2.2 release will automatically migrate them to the newer format and render them subsequently unopenable with darktable 2.0.

New perspectives

For many users, the most useful new addition in this release will be the automatic perspective-correction tool. Imagine having an image like the one below, where straight lines appear curved due to distortion:

Using an algorithm from the Windows-only free-software program ShiftN, this new tool automatically detects off-vertical and off-horizontal lines in an image and computes the transformation needed to create a perfectly aligned, perpendicular output image.
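The underlying transformation is a projective one. As a hedged illustration (my own sketch, not ShiftN's or darktable's actual code), mapping a pixel coordinate through a 3×3 homography looks like this:

```python
import numpy as np

def warp_point(H, x, y):
    """Map one pixel coordinate through a 3x3 homography matrix H,
    the kind of projective transform a perspective-correction tool
    derives from the detected lines."""
    v = H @ np.array([x, y, 1.0])
    # Divide by the homogeneous coordinate to get back to 2D.
    return v[0] / v[2], v[1] / v[2]

# With the identity matrix, points are unchanged:
print(warp_point(np.eye(3), 10.0, 20.0))  # (10.0, 20.0)
```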

If the algorithm does erroneously mark lines that should not be used when transforming the photo, the user can simply deselect them before generating the correction.

The user can choose whether to apply vertical correction, horizontal correction, or a combination of both, and the parameters are adjustable (although only in truly pathological cases is the algorithm likely to need much adjustment). There are also parameters available for correcting rotation and for applying a lens shift if either of those functions is necessary to align or re-center the image.

Highlights in 2.2

Washed-out highlights in an image usually occur when the light hitting the photosensor meets or exceeds the maximum value that the sensor can measure; the pixel's value then gets clipped and detail is lost. But each pixel in a color image sensor consists of separate red, green, and blue monochrome sensors (at least, for the most common camera types), and it is possible that only one or two of those subpixels has been maxed out. Darktable deals with these cases in a different manner than the case where all three colors are washed out: it reconstructs luminance information from the non-overexposed color channels. That produces grayscale pixels that have at least some texture (as opposed to being flat fields of plain white). So, while the clipped region is still nearly white, it can still exhibit some gradation and reveal some shapes.
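To make the idea concrete, here is a toy sketch (my own simplification, not darktable's implementation) of replacing partially clipped pixels with a gray level estimated from the surviving channels:

```python
import numpy as np

CLIP = 1.0  # normalized sensor saturation level (an assumption)

def reconstruct_partial_clips(rgb):
    """Where some, but not all, channels of a pixel are clipped,
    substitute a grayscale value estimated from the unclipped
    channels, preserving some texture in the highlight."""
    out = rgb.copy()
    clipped = rgb >= CLIP
    # Pixels with at least one, but not all, channels clipped:
    partial = clipped.any(axis=-1) & ~clipped.all(axis=-1)
    ys, xs = np.nonzero(partial)
    for y, x in zip(ys, xs):
        survivors = rgb[y, x][rgb[y, x] < CLIP]
        out[y, x] = survivors.mean()  # gray, but not flat white
    return out
```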

This option is the "reconstruct in LCh" mode of darktable's existing highlight-reconstruction tool but, in prior releases, it simply did not work—despite how nice it sounds in theory. In the 2.2 release, "reconstruct in LCh" has been rewritten from scratch and now works as advertised. Moreover, darktable now offers an on-canvas indication of which subpixels are clipped and which are not: a checkerboard pattern utilizing red, green, and blue checks in the washed-out image region.

This clipping information can be computed from the original raw camera file, even before standard transformations like white balancing are performed. When this indicator is activated, it lets the user spot instances where more image data can be recovered from a washed-out region than might be expected. In the case where all subpixels are clipped, no detail can be recovered with the "reconstruct in LCh" mode. As was mentioned in our look at the 2.0 release, darktable's highlight-reconstruction tool also offers a way to fill in these entirely washed-out areas by copying colors from the surrounding pixels, but that option is something of a last resort. The now-working "reconstruct in LCh" mode is likely preferable.

Strictly speaking, of course, one should not take overexposed images in the first place (perhaps, instead, bracketing exposures if one is unsure of how to get the best shot), but "only take perfect pictures" is hardly practical advice. And overexposed pixels can result from image processing itself; any time the user applies exposure compensation in darktable or some other program, clipped highlights can be among the side effects.

Speaking of bracketing exposures, darktable 2.2 adds another tool designed to help with difficult-to-capture situations. In scenarios where the dynamic range of a scene is too wide to be captured in a single shot, the photographer can shoot multiple exposures (e.g., one to capture the highlights and one for the shadows). Those exposures can then be combined via darktable's new "exposure fusion" module. In essence, the two frames (or however many were taken) are stacked together, then the lighter portions of the darker image and the darker portions of the lighter image are blended to produce one image that looks more-or-less correct everywhere.
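The basic idea can be sketched in a few lines (a toy, Mertens-style weighting of my own devising, not darktable's module): weight each frame by how close its pixels sit to mid-gray, then take the weighted average.

```python
import numpy as np

def fuse_exposures(frames):
    """Toy exposure fusion: weight each frame by how well-exposed
    each pixel is (distance from mid-gray), then blend the frames."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    weights = [1.0 - np.abs(f - 0.5) for f in frames]
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, frames)) / total
```

A pixel that is well exposed in the darker frame thus contributes more than its blown-out counterpart in the lighter frame, and vice versa.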

This is the same technique found in the Enfuse utility, and similar approaches are available in other "high dynamic range" (HDR) image-processing tools like Luminance HDR (both of which LWN looked at briefly in 2012). Sadly, neither Enfuse nor Luminance HDR has made much progress toward a new stable release since 2012, so the addition of the exposure-fusion module in darktable is a welcome sight. It may be the better option in the long run anyway, since using a single-purpose application like Enfuse makes for a more complicated workflow than most users want. But all of the usual caveats about HDR images apply to darktable's new module: it is easy to overdo the blending process, creating distracting artifacts like halos or producing weird, washed-out-looking results. Proceed with caution.

Distort all the things

A key distinction between photo-manipulation programs like darktable and RawTherapee and raster image editors like GIMP is that the former are intentionally restricted to "post-production" image-tuning operations. That means they focus on making adjustments to the hue, saturation, and lightness of the pixels in the image as a whole (or in large portions of the whole), as opposed to painting or drawing with tools onto a canvas. Various names are trotted out for such applications: "raw photo editor" was one of the first; "photo workflow application" is more recent, although neither really seems to communicate the idea unambiguously.

On the plus side, "workflow" adjustments can easily be implemented as non-destructive transformations, and the operations used on any particular image can be saved in a compact file format. The downside is that, in real life, one eventually runs into some situations where on-canvas editing is required.

The darktable 2.2 release adds what could reasonably be considered the application's first on-canvas tool for image manipulation (with possible exceptions going to special cases like flipping and rotating). It is called Liquify and it lets the user deform the image by pinching, stretching, and warping pixels.

The name "Liquify" is seemingly borrowed from Adobe Photoshop, though similar functionality can also be found in GIMP's IWarp filter.

Darktable lets the user add a single-point warp node that will stretch, compress, and twist the surrounding pixels within a user-defined radius, or the user can connect several such nodes along a vector path and warp a larger portion of the image. To be sure, the Liquify tool will not suffice for every possible operation, but it is an important first step. If bending or reshaping one portion of a photograph is all that is needed, doing so in darktable is more convenient than exporting the image to a separate raster file and opening it with GIMP.

Darktable is looking up

Whereas the Liquify tool manipulates pixel positions in space, the vast majority of darktable's other functions manipulate pixel colors in some specified fashion: adjusting the white point, brightening the dark regions, shifting certain hues, and so on. The final new tool in the 2.2 release is another color-adjustment feature, but it is a feature that can best be thought of as a generic color-mapping function. The color look-up table (CLUT) tool supports remapping all of the colors in the image based on some predefined set of transformations.

Remappings of this type are how "film emulation" features work: when someone takes the time to measure the output of known sample images exposed on Kodachrome 64, Fuji Velvia 50, or some other film stock, they can produce a mapping that transforms a generic RGB image into something that looks more vintage. Photographer and GIMP team member Pat David, for example, has created several hundred open-source film-emulation CLUTs that are already usable in GIMP's G'MIC plugin and in RawTherapee. But there are other important CLUTs found in the wild, such as those that digital cameras apply when converting a raw image into a JPEG to be saved on the memory card. Being able to apply the same CLUT to a raw image in darktable would ensure that its output matches the camera's JPEG exactly.

This is indeed what the new tool in darktable can do—at least, if the CLUT of interest is available. At present, the process of creating those CLUTs is only beginning, with a handful of mappings available.

Creating a new CLUT can be a time-intensive task. Color spaces are three-dimensional constructs, whether those dimensions are red-green-blue, hue-saturation-lightness, or something more exotic; a full 3D-to-3D transformation thus involves looking at a lot of sample points. But the new darktable release adds a utility called darktable-chart that can assist with the process. To match a camera's output, the user will need to purchase a calibrated IT8 target and take a reference photo, but nothing stops interested parties from fiddling around to create interesting CLUT mappings of their own. In fact, the popularity of products like Instagram suggests that it is hard to stop people from fiddling with CLUT transformations.
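
The sampling burden is easy to see in a sketch: building a CLUT means evaluating a color transform at every point of the grid, so even a coarse eight-point grid requires 8³ = 512 samples. The toy "warming" transform below is invented for illustration and bears no relation to any real film profile or to darktable-chart's output format.

```python
# Illustrative only: a CLUT is built by sampling a color transform on a
# dense 3D grid, which is why so many measured points are needed.

def warm_tone(r, g, b):
    """A toy 'vintage' transform: lift reds, mute blues (not a real profile)."""
    return (min(r * 1.1, 1.0), g, b * 0.85)

def sample_clut(transform, n=8):
    """Evaluate the transform at every point of an n*n*n grid."""
    return [[[transform(ri / (n - 1), gi / (n - 1), bi / (n - 1))
              for bi in range(n)]
             for gi in range(n)]
            for ri in range(n)]

clut = sample_clut(warm_tone)
# Even a coarse 8-point grid needs 8**3 samples of the transform:
print(sum(len(row) for plane in clut for row in plane))  # 512
```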

In the long run, CLUT support will let darktable users easily apply a large variety of color styles to their photographs, whether those styles are designed to emulate film or not. But darktable-chart may prove useful in its own right, because—as David has mentioned on his blog—the existing process for creating a CLUT is complex.

From the shadows

While the aforementioned tools are the biggest changes in the new release, there are scores of smaller improvements to be found as well. The keyboard support introduced in version 2.0 is expanded upon: arrow keys can be used to adjust sliders and controls, and modifier keys can be used to change the rates of adjustment (Shift is a 10x increase in the adjustment increment, while Control is a 0.1x reduction in the increment).
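
The modifier behavior described above can be summarized in a few lines; this is a hypothetical sketch of the scaling rule, not darktable's actual code, and the function name is invented.

```python
# Sketch of the modifier-key scaling rule: Shift multiplies the slider
# increment by 10, Control divides it by 10.

def slider_step(base_step, shift=False, ctrl=False):
    """Return the effective adjustment increment for one arrow-key press."""
    if shift:
        return base_step * 10
    if ctrl:
        return base_step * 0.1
    return base_step

print(slider_step(0.05, shift=True))  # 0.5
```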

The script-friendly command-line version of darktable, darktable-cli, can now take an entire directory as its input argument. Watermarks generated by darktable can now include geolocation information. And, on Linux, the application now notifies the window manager when it has completed a batch operation, so that the user can be alerted even when another application has focus.

There is also initial support for undo and redo on image-adjustment operations. That may seem like a fundamental missing piece, but one must remember that darktable's adjustments are non-destructive and most can only be applied once (e.g., one would not adjust the white balance of a photograph twice). So, in most cases, undoing an adjustment was a simple matter of deactivating it or resetting it to its default parameters. Still, explicit undo and redo are a better fit for how most people think about editing images.
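
Because the edits are non-destructive, undo in this model restores an earlier snapshot of module parameters rather than reverting pixels. The following class is a simplified, hypothetical sketch of that idea; darktable's history stack is considerably more elaborate.

```python
# Sketch of undo/redo over non-destructive edits: each change snapshots the
# per-module parameters, and undo simply restores an earlier snapshot.

class EditHistory:
    def __init__(self, params):
        self.history = [dict(params)]  # snapshots of module settings
        self.redo_stack = []

    def apply(self, module, value):
        """Record a new parameter value; a fresh edit invalidates redo."""
        self.redo_stack.clear()
        snapshot = dict(self.history[-1])
        snapshot[module] = value
        self.history.append(snapshot)

    def undo(self):
        if len(self.history) > 1:
            self.redo_stack.append(self.history.pop())
        return self.history[-1]

    def redo(self):
        if self.redo_stack:
            self.history.append(self.redo_stack.pop())
        return self.history[-1]

h = EditHistory({"white_balance": 6500})
h.apply("white_balance", 5200)
print(h.undo())  # {'white_balance': 6500}
```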

At a lower level, the project has implemented initial support for some new photosensor subpixel patterns, including CYGM and RGBE arrays. OpenCL support has also been improved, with OpenCL now used to demosaic raw images, and the first builds for ARM devices are available. Finally, the program now uses Pango for text rendering, which will enable it to be used with right-to-left languages.

Interfacing

The new release boasts a lot of useful additions. But, if there is any criticism to be leveled at 2.2, it is that darktable still hides so much of its functionality behind user-interface mechanisms that are invisible during normal use of the program.

A good example is the substantial set of options in the perspective-correction tool that are only made available by holding down modifier keys when clicking on the "automatic fit" buttons. The existence of the modifiers is only discoverable by hovering the mouse over the buttons long enough to bring up the tooltip hints; for an application offering as many functions as darktable does, it is simply not feasible to expect users to memorize every combination of Shift, Control, and left- and right-clicking for every tool.

Other applications—for example, GIMP—find a way to present such options in a persistent message-bar area or in a field within the toolbox itself; there is no reason darktable cannot do the same. Instead, having to pause and hover the mouse over every tool before you can see how to use it is reminiscent of darktable's early days, when the filters and controls had no text labels and users had to guess at the meaning of cryptic icons. The application has come a long way since then; it would seem that it still has some distance to travel.
