
Towards being able to ignore $TERM

Tags: terminal interface portability libuncursed | Mon Aug 12 06:53:38 UTC 2013 | Written by Alex Smith

There are two main groups of interfaces for NetHack: graphical / "tiles" interfaces, which use a windowing system like X (on Linux), GDI (on Windows), or SDL (cross-platform) to render the interface; and "console" interfaces, which use a terminal to render the interface. Traditionally, console interfaces work by looking at the $TERM environment variable to detect the terminal, then at a file in a database (termcap or terminfo) to determine what codes are required to make the terminal display in certain ways. However, this has several problems.

The first problem is to do with watching and replaying games. It's quite common to use utilities such as ttyrec to record a game, or to broadcast it live via a terminal multiplexer (as on the well-known nethack.alt.org, or termcast.org, which is not limited to NetHack). This immediately leads to a problem: not all the people watching a game are necessarily going to be using the same type of terminal! As such, it's quite common for characters to be used that render correctly for one viewer, but not for another.

The other problem is to do with terminal configuration. For instance, NetHack 3.4.3 has an IBMgraphics mode that uses IBM extended (or "code page 437") characters to draw parts of the dungeon. Several players prefer this to the alternative drawing methods available, but NetHack 3.4.3 assumes that the terminal interprets extended characters as code page 437 by default, and most terminals don't (they typically use either Latin-1 or Unicode). The result is that players have to seek out guides on how to configure, say, gnome-terminal to use the correct encoding, and often have to settle for something "almost right": gnome-terminal does not have an option for code page 437, but it does have an option for the similar code page 850; and many players still think that the IBMgraphics display for a fountain is ¶ (rather than the correct ⌠), because it's one of the characters that differs between the two codepages. This is a ridiculous state of affairs, because gnome-terminal can be configured to interpret extended characters as code page 437 via terminal control codes!

There is a neat solution to both these problems. Instead of trying to choose control codes to send that are specific to one terminal and configuration, we can instead write a "polyglot", aiming to produce control codes that work on as many terminals as possible. The principles are the same as polyglots in programming; typically, you want each control code to either be understood the same way by each terminal you're aiming at, or else understood by some and ignored by others. (In order to produce codes that will be ignored, you can instead write codes that cancel themselves out.)

I wrote a terminal rendering library, libuncursed, that uses these principles, and it's currently in use for rendering here on nethack4.org. Here's a description of the problems I encountered trying to apply them, and what the solutions are:

Color

Nowadays, the vast majority of terminals that people play NetHack on support color, and the vast majority of players play in color. (There's one holdout on rec.games.roguelike.nethack who still has a terminal old enough that it not only fails to display color, but will fail to display anything at all if color codes are seen in the input; and likewise, there are some players who play in black and white as a challenge or to show off. Both sets of players can be accommodated via a toggle to turn color on and off.) The problem is that different terminals support different numbers of colors correctly: out of the color terminals in common use, possible values for color depth include 8, 15 (16 with a broken dark grey), 16, and 256. Likewise, in some terminals, the bright colors are tied to boldness (i.e. "bright blue" is equivalent to "bold + blue"); such terminals might or might not display the bright colors as bold.

From the point of view of actually writing a roguelike, it does not make much sense to use more than 16 colors; many players' terminals will be unable to render them. In order to provide a sensible fallback for the 8-color users, we'd want to use bold to distinguish the bright colors from the dark ones, in the same way as a 16-color terminal where boldness is tied to brightness. So, for instance, we'd want to render dark red as dark red on every terminal, but bright yellow would be rendered as:

bold brown on an 8-color terminal

bold yellow on a 16-color terminal with brightness tied to boldness

(non-bold) yellow on other 16-color terminals

(non-bold) yellow on 256-color terminals.

Experimentation shows that low-color-depth terminals tend to ignore the code for, say, "yellow" entirely, which gives an easy way to get the correct color: send "brown" first, then "yellow", and we'll get the correct color regardless of the color depth of the terminal. However, if we have an 8-color terminal, or a terminal with brightness tied to boldness, this is going to give us brown rather than yellow, and we need to set the "bold" flag too. Boldness is still meaningful on terminals with high color depths; we could just set it unconditionally (which is what NetHack 3.4.3 does), but then it would embolden bright colors even when there was no need, making the terminal harder to read. (It also prevents the use of bold to make certain monsters stand out more; making two monsters differ only in boldness would be a mistake, but it'd certainly be nice to be able to make the truly dangerous monsters more visible.)

As such, we need to use a polyglot trick. Although it's not clear how to turn bold off for the third type of terminal, we can turn it off for the 256-color terminals; the code for setting a color out of the palette of 256 is CSI 38;5;color m, with the important part being that the color is a parameter; that means that on a 256-color terminal, we can put any number we like there and still cancel out the code simply by setting the color we want afterwards. The obvious number to use there is 1, meaning "bold"; this gives us CSI 38;5;1m to set bold only on sub-256-color terminals. The problem here is that we're setting two other options as well (because the CSI m command allows multiple options to be set at once); 38 is "underline, default foreground" in some terminals (and ignored by others), and 5 is universally "blink on" (although many terminals refuse to render blink, or render it as a bright background).

We can work around the problem by adding more options. Assuming we know whether we want the character underlined or not, we can simply set the underline state by hand: 4 for underline on, 24 for underline off. Assuming we aren't insane enough to want blink (especially as it isn't portable), we can set that by hand too, with code 25. Combining this, we get CSI 38;5;1;24;25m as our code to turn bold on, except in terminals that support 256 colors (and quite possibly also change the foreground color, but this is OK, as we're about to change the foreground color anyway).

Finally, we just need to add the codes for the various possible foreground colors. In our example of yellow, this is 33 (brown) and 93 (yellow), giving us a final code of CSI 38;5;1;24;25;33;93m.
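To make the construction concrete, here's a sketch of how these foreground polyglots could be assembled (the helper name and the 0-15 color numbering are mine, not libuncursed's):

```python
# Foreground-color polyglots as described above.  Colors are numbered
# 0-7 (dark) and 8-15 (bright), in the usual ANSI order.

def fg_polyglot(color, underline=False):
    base = color % 8
    if color >= 8:
        # 38;5;1 sets bold on sub-256-color terminals only; the
        # explicit underline (4/24) and blink-off (25) codes cancel
        # the side effects of the 38 and the 5.
        parts = ["38", "5", "1", "4" if underline else "24", "25",
                 str(30 + base),   # dark fallback (e.g. 33 = brown)
                 str(90 + base)]   # bright color (e.g. 93 = yellow),
                                   # ignored by low-color-depth terminals
    else:
        # Dull colors: universally turn bold off (22), set the color.
        parts = ["22", str(30 + base)]
    return "\x1b[" + ";".join(parts) + "m"
```

For bright yellow (color 11), this reproduces the code derived in the text, ESC [ 38;5;1;24;25;33;93 m.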

For dull colors, it's much easier; we want to universally turn off bold (22) and set brown (33), giving us CSI 22;33m, which is interpreted the same way by all the terminals we care about. We can use a similar trick to render dark grey on high-color-depth terminals but dark blue on low-color-depth terminals (working around the bug with dark grey): CSI 22;34;90m should do the trick.

We can use a similar trick for background colors, as well. All the terminals I tested that supported foreground colors also supported background colors; this time, what we need to set for bright backgrounds on low-color terminals is not bold, but blink. This can be accomplished quite simply using CSI 48;5;5m; on terminals that don't support 256 colors, this sends a 48 (which is meaningless and so ignored) and turns blink on twice, whereas on terminals that do, it sets the background color to 5 (which will be overwritten by our subsequent setting of the background color to something else). Likewise, dark colors will use something like CSI 25;46m to turn blink off (regardless of whether it was off already or not) and change to a dark cyan background.
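As a sketch (the helper name and the 0-15 color numbering are mine), the background polyglots described above could be assembled like this:

```python
# Background-color polyglots as described above.  Colors are numbered
# 0-7 (dark) and 8-15 (bright), in the usual ANSI order.

def bg_polyglot(color):
    base = color % 8
    if color >= 8:
        # 48;5;5 turns blink on where 256 colors are unsupported, and
        # is cancelled by the explicit background codes that follow.
        return "\x1b[48;5;5;%d;%dm" % (40 + base, 100 + base)
    # Dark backgrounds: turn blink off (25), then set the color.
    return "\x1b[25;%dm" % (40 + base)
```

Dark cyan (color 6) reproduces the CSI 25;46m example from the text.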

An interesting question is whether it's possible to extend this to pick a specific color from the 256-color range where possible, falling back to 16 or 8 colors otherwise. The problem is that in the Linux console, the code for setting a foreground color out of 256 will turn color off altogether, and in both the foreground and background color codes, the color that's actually selected will be interpreted as a control code, and thus could potentially do anything. The codes used would thus probably have to depend on the exact color being selected.

One remaining problem is that some terminals, such as gnome-terminal, implement "default foreground" and "default background" colors that don't necessarily match anything in the color palette. The best solution here is probably to ensure that the foreground and background colors are always both set to explicit values, to avoid being caught out by the occasional crazy player with a white background.

(It should be noted that it's a lot simpler to just bold the bright colors regardless of the terminal; this leads to much shorter control codes and much more portable behaviour. This is the approach I used in libuncursed, because of compatibility problems that some terminals have with the polyglot approach.)

Color palette

There's one other issue involved with color: a designation like "dark blue", "brown", or even "black" can vary by terminal (I've seen terminals where black was actually quite a light grey when used as a background). It would be nice to work around this problem by portably setting the color palette; in practice, though, this typically ends up being a huge mess, because some terminals don't have a customizable palette, and there's no standard way of setting it. Worse, gnome-terminal resets its color palette to its defaults whenever you alt-tab away from the window, something that console applications have no control over and that shouldn't be relevant in the first place. (I've reported this bug to Ubuntu, who supplied my copy of gnome-terminal.)

Of course, these difficulties didn't stop me trying anyway. I've created a file that sets the palette of the Linux console, PuTTY, gnome-terminal, and xterm to the color palette used by my ttyrec-playing program jettyplay (I couldn't set the color palette used by screen for obvious reasons, but I did manage to make the file a no-op on screen); if you want a nice color palette while playing roguelikes, just cat this file to your terminal before playing (and if you're on gnome-terminal, don't let the window lose focus until you're finished…)

libuncursed also attempts to set the palette, using much the same techniques (although instead of trying to prevent the palette-setting codes echoing visibly on terminals that don't understand them, it just prints them immediately before clearing the screen). It actually took a lot of experimentation to find a usable palette in which the colors differed from each other as much as possible, because human eyes see some colors as much closer than other colors. (For instance, pretty much everyone can distinguish a red from a yellow, but many people have trouble distinguishing a green from a cyan.) For reference, here's the color palette I came up with (15 colors + black background):

Number  Description     HTML color code  256-color code
1       Dark red        #AF0000          124
2       Dark green      #008700          28
3       Brown           #AF5F00          130
4       Dark blue       #0000AF          19
5       Dark magenta    #870087          90
6       Dark cyan       #00AF87          36
7       Grey            #AFAFAF          145
8       Dark grey       #5F5F5F          59
9       Orange          #FF5F00          202
10      Bright green    #00FF00          46
11      Yellow          #FFFF00          226
12      Bright blue     #875FFF          99
13      Bright magenta  #FF5FAF          205
14      Bright cyan     #00D7FF          45
15      White           #FFFFFF          231

The hardest part of the color wheel to distinguish is the part between magenta and green (2, 4, 5, 6, 10, 12, 13, 14; that's 8 colors that have to fit into just over half the color wheel). It turns out to be very difficult to create a color between green and yellow that is easily distinguishable from both, so my greens are actual greens (luckily, dark and light green are pretty different from each other); this forced one of the cyans to be greenish, and the other bluish, to be able to easily tell those apart, and that forced the bright colors to skew around towards red (which is why the bright blue is a lilac). Luckily, there was extra room in the color wheel between lilac and orange for the pink that's being used for bright magenta; in fact, if I needed to add a 17th color, I'd probably try to add an extra shade of red.
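For reference, here's a sketch of how palette-setting codes for the table above could be generated. The OSC 4 form is understood by xterm and many compatibles, and the ESC ] P form by the Linux console; as noted above, though, whether any given terminal honors either is hit and miss, and the exact codes my palette file and libuncursed emit may differ from this.

```python
# Palette-setting codes for the table above.  xterm and many
# compatibles understand OSC 4 ; index ; rgb:RR/GG/BB BEL; the Linux
# console uses ESC ] P followed by a hex digit index and rrggbb.

PALETTE = {1: "AF0000", 2: "008700", 3: "AF5F00", 4: "0000AF",
           5: "870087", 6: "00AF87", 7: "AFAFAF", 8: "5F5F5F",
           9: "FF5F00", 10: "00FF00", 11: "FFFF00", 12: "875FFF",
           13: "FF5FAF", 14: "00D7FF", 15: "FFFFFF"}

def xterm_palette_code(index, rgb):
    return "\x1b]4;%d;rgb:%s/%s/%s\x07" % (index, rgb[0:2], rgb[2:4], rgb[4:6])

def linux_palette_code(index, rgb):
    return "\x1b]P%x%s" % (index, rgb.lower())
```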

Character Sets

NetHack players will probably be familiar with the terms DECgraphics, IBMgraphics, and (for NetHack 4) Unicode graphics. These don't just stand for different character sets, but for entirely different rendering methods (which have different sets of characters that they can produce, which is why they look different in-game). So which one should a roguelike aiming for terminal portability use?

DECgraphics is probably the simplest to get working, because rather than extending an existing character set the way most others do, it uses special codes to specify that characters should be interpreted as line-drawing characters. As such, there's no penalty for a terminal supporting DECgraphics at the same time as another set; there cannot possibly be any conflicts. So in most terminals, it will just work fine. The flip side is that in the terminals where it doesn't work, such as PuTTY, no amount of configuration will make it work; you just have to abandon it and use something else.

There's one other problem with DECgraphics. The way terminal control codes for character sets work is that a terminal tracks multiple character sets (at least two, G0 and G1; some terminals have a G2 and G3 as well); there are codes to change which character set is used to write characters, and codes to define what each character set means. As such, if you want to mix DECgraphics line-drawing characters with ordinary ASCII characters (which is basically always the case), there are two common approaches: either you define G0 as ASCII and G1 as DECgraphics permanently and switch between them, or else you use only G0 and redefine it as ASCII or DECgraphics according to which character you want to write. The problem is that neither approach works on all DECgraphics-capable terminals: the Linux console will redraw all existing G0 characters using the new character set if G0 is redefined, and some Mac OS X terminals have trouble switching between G0 and G1.
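As a sketch, the two approaches correspond to byte sequences like these (the designator and shift codes are the standard VT100 ones; the constant names are mine):

```python
# The two common approaches for mixing ASCII with DECgraphics line
# drawing, written out as escape-code strings.

# Approach 1: define G0 = ASCII and G1 = DECgraphics once, then
# switch between them with the one-byte shift codes.
SETUP = "\x1b(B\x1b)0"   # ESC ( B: G0 := ASCII;  ESC ) 0: G1 := DEC line drawing
TO_LINES = "\x0e"        # SO (0x0E): use G1 from now on
TO_ASCII = "\x0f"        # SI (0x0F): use G0 from now on

# Approach 2: use only G0, redefining it before each kind of character.
G0_LINES = "\x1b(0"      # ESC ( 0: G0 := DEC line drawing
G0_ASCII = "\x1b(B"      # ESC ( B: G0 := ASCII

# Under approach 1, "q" in the DEC set is a horizontal line:
hline = TO_LINES + "q" + TO_ASCII
```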

So what about IBMgraphics? Unlike DECgraphics, IBMgraphics works by extending the 7-bit ASCII character set to the 8-bit codepage 437 character set, with characters 32-126 as ASCII and 128-255 as codepage 437 (the others are terminal control codes). The problem is that, unlike with DECgraphics, terminals have a wide spread of interpretations of characters with the high bit set (for instance, interpreting them as Unicode is a common choice). There is a widely implemented terminal control code to set G0 to IBMgraphics, ESC ( U, but many terminals don't support it (in particular, xterm doesn't understand it; in fact, xterm does not support IBMgraphics at all in its default configuration). So when attempting to use IBMgraphics, it's best to set G0 to IBMgraphics and use it exclusively, but on many terminals the user will have to set the encoding manually (through the menus on many UNIX terminals, or with the command chcp 437 on Windows).

The remaining possibility I will consider is Unicode (specifically, UTF-8). This is gaining increasingly wide support nowadays; there's also a control code, ESC % G, that some terminal emulators (e.g. the Linux console and Jettyplay) support for explicitly marking the input as UTF-8 (whereas most terminals just assume this if they aren't told otherwise, and some terminals assume it even if they're told otherwise). Like IBMgraphics, UTF-8 uses bytes with the high bit set to encode non-ASCII characters; unlike IBMgraphics, UTF-8 uses multiple such bytes per character, giving a much larger character set.

There are two problems with the use of Unicode. The first is that, just as many terminals don't support anything but Unicode, many terminals don't support Unicode at all, and are likely to use either code page 437 or Latin-1 (which is unsuitable for roguelikes, having no line-drawing characters). The second is that there are several cases where the terminal supports Unicode, but the combination of the operating system on which it is installed and the fonts it is trying to use is incapable of rendering the characters, even though it knows which characters they are supposed to be. Solving the first problem while using Unicode graphics is impossible, but the second problem has workarounds; for instance, Microsoft publishes a list (Windows Glyph List 4) of Unicode characters that normally render correctly on Windows. Interestingly, this list contains all the characters in code page 437.

As such, NetHack 4's default configuration voluntarily limits itself to characters that exist in code page 437 (although it allows the user to specify Unicode characters beyond that in the configuration files, a user is unlikely to configure those unless they actually work on that user's terminal); and libuncursed uses either IBMgraphics or UTF-8, depending on which the terminal supports. It might at first seem impossible to determine which character set the terminal supports without looking at $TERM, but the vast majority of terminals (including all the Unicode terminals I tested) support a terminal control code to request the cursor position. This makes it possible to send a sequence of bytes like C2 A0 that moves the cursor a different distance in codepage 437 and in UTF-8, then determine the character set based on how far the cursor moves. (I chose this code because it's the first printable Unicode codepoint that moves the cursor a different distance from the number of bytes that represent it in UTF-8; as a bonus, it represents a non-breaking space, meaning that it doesn't even put weird characters on the screen.) If the terminal doesn't reply with a cursor location, it must be an old pre-Unicode terminal, and most likely uses IBMgraphics by default. (libuncursed then sends the codes to select the character set it just detected; partly in case it detected the character set incorrectly, but mostly so that people watching a recording or watching via a terminal multiplexer have a better chance of rendering the character set correctly.)
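The detection logic can be sketched as follows (the function names are mine, and this is a simplification; libuncursed's actual implementation differs). The cursor position request is CSI 6 n, and the reply has the form ESC [ row ; col R:

```python
import re

# Probe: put the cursor in column 1, print the two bytes C2 A0, then
# request a cursor position report (CSI 6 n).  A UTF-8 terminal prints
# one character (cursor ends in column 2); a codepage 437 terminal
# prints two (column 3).
PROBE = b"\r\xc2\xa0\x1b[6n"

def charset_from_report(report):
    """Interpret the cursor position report, ESC [ row ; col R.

    No reply at all suggests an old terminal that predates Unicode,
    which most likely uses IBMgraphics by default.
    """
    m = re.match(r"\x1b\[(\d+);(\d+)R", report or "")
    if not m:
        return "cp437"
    return "utf-8" if int(m.group(2)) == 2 else "cp437"
```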

Alternatively, it would be possible to use the polyglot approach for rendering, creating a sequence of bytes that renders correctly on both UTF-8 and IBMgraphics terminals. The idea is to send the codepage 437 code first, then two carriage returns (so that terminals trying to parse the codepage 437 code as Unicode will return to their original shift state), then a cursor motion code to put the cursor on the next character, then a C2 A0 and two backspaces, then the UTF-8 code, and finally a carriage return and a cursor motion code to put the cursor on the next character again. Let's take a ┌ in the leftmost column as an example (as would be used, for instance, by the frame NetHack 4 draws around the screen on large terminals). The byte sequence is DA 0D 0D 1B 5B 31 43 C2 A0 08 08 E2 94 8C 0D 1B 5B 31 43:

IBMgraphics: ┌ ( DA = ┌ )

┌ ( 0D 0D = cursor to column 1)

┌ ( 1B 5B 31 43 = ESC [ 1 C = cursor right 1 column)

┌┬á ( C2 A0 = ┬á )

┌ ┬ á ( 08 08 = cursor left 2 columns)

┌Γöî ( E2 94 8C = Γöî )

┌ Γöî ( 0D = cursor to column 1)

┌ Γ öî ( 1B 5B 31 43 = ESC [ 1 C = cursor right 1 column)

UTF-8: � ( DA 0D is invalid UTF-8)

� ( 0D = cursor to column 1)

� ( 1B 5B 31 43 = ESC [ 1 C = cursor right 1 column)

� ( C2 A0 = nonbreaking space)

� ( 08 08 = cursor left 2 columns)

┌ ( E2 94 8C = ┌ )

┌ ( 0D = cursor to column 1)

┌ ( 1B 5B 31 43 = ESC [ 1 C = cursor right 1 column)

In both cases, we have our ┌ in column 1, as desired. This leaves some junk on the screen in the IBMgraphics case, but that will be overwritten as more characters are output (and eventually can be overwritten with spaces at the end of the line, which don't need fancy handling because they are ASCII characters).
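The recipe above can be checked mechanically; here's a sketch (the helper name is mine) that assembles the sequence for a character in the leftmost column and reproduces the bytes from the worked example:

```python
def polyglot_cell(cp437_byte, utf8_char):
    """Assemble the byte sequence that draws one character in the
    leftmost column on both codepage 437 and UTF-8 terminals."""
    right1 = b"\x1b[1C"                  # ESC [ 1 C: cursor right one column
    return (bytes([cp437_byte])          # the codepage 437 code
            + b"\r\r"                    # two carriage returns
            + right1                     # onto the next cell
            + b"\xc2\xa0"                # C2 A0 (non-breaking space in UTF-8)
            + b"\x08\x08"                # two backspaces
            + utf8_char.encode("utf-8")  # the UTF-8 code
            + b"\r" + right1)            # back to column 1, then right again

# The U+250C box corner reproduces the sequence from the text:
# DA 0D 0D 1B 5B 31 43 C2 A0 08 08 E2 94 8C 0D 1B 5B 31 43
```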

The main reason this approach isn't used in libuncursed is that it's incapable of getting very near to the right edge of the screen; the cursor motion tricks don't work if the cursor wraps. (A more minor reason is that sending 19 bytes for one character would bloat the size of recordings somewhat; while a valid criticism, the extra bandwidth would be minor, and the recordings would still compress well.)

Cursor motion

The remaining problem with portable terminal handling is the codes for actually moving the cursor. It's long been a tradition in terminal rendering to try to get these down to as few bytes as possible by using every obscure feature of the terminal, in order to save as much bandwidth as possible. That was likely a valid concern back when communication with terminals went over 2400-baud serial lines, but nowadays the extra portability is worth sacrificing a few bytes now and again.

I tested a wide range of terminals to see which control codes they supported; and as a result of the test, libuncursed only ever uses "clear screen", "cursor up/left/down/right distance", and "cursor to location row, column". (The main useful codes that turned out not to be portable were "cursor to column in current row", and "cursor distance up/down and to column 1"; the former can be expressed with only one extra byte via use of a carriage return before the code, and the latter with only one extra byte via use of a carriage return after the code.)
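As a sketch (the function names are mine), the two substitutions described above look like this:

```python
# Portable substitutes for the two useful-but-unportable codes, each
# costing one extra byte, as described above.

def to_column(col):
    """Move to a 1-based column in the current row: a carriage return
    plus "cursor right", instead of the unportable "cursor to column"."""
    return "\r" + ("\x1b[%dC" % (col - 1) if col > 1 else "")

def up_to_column_1(rows):
    """Move up some rows and to column 1: "cursor up" followed by a
    carriage return, instead of the unportable combined code."""
    return "\x1b[%dA\r" % rows
```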

libuncursed also attempts to save the existing terminal contents when it loads, and put them back when it exits, using the standard codes for doing that; those codes aren't portable, but there is no better option than trying them to see if they work.

Conclusions

It seems that the termcap/terminfo/curses method of terminal rendering is overkill nowadays; trying to gain portability via special-casing every terminal introduces problems whenever multiple people might need to look at the same sequence of bytes, and modern terminals are broadly very similar to each other. libuncursed serves as a demonstration that really, you only need two sets of codes (one for Unicode terminals, and one for non-Unicode terminals).