It turns out that there is a rather interesting story behind the 286 XENIX incompatibility with 386 and later processors. Here’s roughly what happened in chronological order.

In 1982, Intel released the iAPX 286 processor, later known as the 80286 or simply 286. This was a highly complex CPU for the time, with an intricate system of software privilege and security enforcement known as protected mode. The 286 used descriptors holding 48 bits (6 bytes) of information, but in order to keep the design simple, a full 64 bits (8 bytes) were taken up by each descriptor table entry; the last 16 bits were not used by the CPU and were documented as reserved.
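For illustration, the 8-byte 286 descriptor layout can be sketched in C roughly like this (the field names are mine; the layout follows Intel's 286 documentation):

```c
#include <stdint.h>

/* Sketch of an 80286 segment descriptor (8 bytes). Field names are
 * illustrative; the layout follows Intel's 286 documentation. */
struct desc286 {
    uint16_t limit;     /* segment limit, bits 0-15                 */
    uint16_t base_lo;   /* segment base, bits 0-15                  */
    uint8_t  base_hi;   /* segment base, bits 16-23                 */
    uint8_t  access;    /* present bit, DPL, segment type           */
    uint16_t reserved;  /* ignored by the 286; documented reserved  */
};
```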

Later in 1982, work was also underway on the 386, Intel’s 32-bit successor to the 286.

In late 1982 or in 1983, Microsoft ported XENIX (a variant of UNIX System III licensed from AT&T) to the Intel 286 processor. Intel and IBM were among the companies that licensed the 286 XENIX port from Microsoft. Although Intel had the XENIX source code, the porting was done entirely by Microsoft. Developers at Microsoft decided to store bookkeeping information in the “reserved” 16 bits of a descriptor table entry not used by the 286. This approach worked reliably on 286 processors.
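A hypothetical sketch of what that amounted to; the actual XENIX bookkeeping format is not documented here, so the “tag” and its meaning are invented:

```c
#include <stdint.h>

/* Hypothetical illustration of the XENIX trick: store OS bookkeeping
 * data in bytes 6-7 of an 8-byte descriptor, the word the 286 ignores.
 * The tag and its meaning are invented for illustration. */
static void stash_tag(uint8_t desc[8], uint16_t tag)
{
    desc[6] = (uint8_t)(tag & 0xFFu);  /* low byte into reserved word  */
    desc[7] = (uint8_t)(tag >> 8);     /* high byte into reserved word */
}
```

A 286 never reads those two bytes, so the trick was invisible in practice.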

Circa 1983, Intel’s design of the 386 became more fully fleshed out. The 386 used descriptor tables which were an extension of the 286 variant. Notably, the previously unused word was now fully employed: Intel used it to extend the segment base (from 24 to 32 bits) and to add new attribute bits.
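As a sketch, the 386’s reinterpretation of those two formerly reserved bytes (bit assignments per Intel’s 386 documentation; the macro names are mine) looks roughly like this:

```c
#include <stdint.h>

/* Sketch of how the 386 repurposed bytes 6-7 of a segment descriptor.
 * Names are illustrative; bit assignments follow Intel's 386 docs. */

/* Byte 6: limit bits 16-19 in the low nibble, plus new attribute bits. */
#define DESC386_LIMIT_HI_MASK  0x0Fu  /* segment limit, bits 16-19      */
#define DESC386_AVL            0x10u  /* available to system software   */
                                      /* bit 5: still reserved, zero    */
#define DESC386_DB             0x40u  /* default operand size (16/32)   */
#define DESC386_G              0x80u  /* granularity: limit in 4K units */

/* Byte 7: segment base, bits 24-31 (extending the base to 32 bits). */
static inline uint8_t desc386_base_top(const uint8_t desc[8])
{
    return desc[7];
}
```

Any stray data an OS left in those bytes would thus silently change a segment’s base, limit, or attributes on a 386.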

In late 1983, updated 286 documentation marked the last 16 descriptor bits as “reserved for iAPX 386, must be zero”.

In Summer 1984, Intel had the first A-step samples of the 386 and validation was in full swing. A logical validation target was 286 XENIX, since it was a product Intel sold and an operating system that utilized the 286’s protected mode.

Intel discovered that 286 XENIX did not work on the 386 at all. Engineers quickly realized that the problem was not with the 386 design or silicon, but rather with Microsoft’s choice to use the reserved descriptor word ignored by the 286; the 386 interpreted the data stored there in unexpected ways, which immediately crashed the OS.

Since Intel engineers had access to the XENIX source code, they were able to re-code the OS not to use the reserved descriptor word. Microsoft was reluctant to accept the patch, claiming that there was nothing wrong with their code.

The matter was quickly escalated and sometime in late Summer 1984, Intel engineers met with higher-ups at Microsoft. Bill Gates argued that he knew how to get the most out of hardware, that Intel should listen to Microsoft, and that the 386 needed to be changed. Intel argued that Microsoft should not have been using reserved areas; Intel also really didn’t want to change the 386 design at that point because A0-stepping engineering samples were already in customers’ hands. Microsoft pointed out that if Intel didn’t want programmers to use reserved areas, they shouldn’t have let them use those areas in the first place.

Intel did not change the 386 design but took heed of Microsoft’s admonition. It was a lesson learned—if you want to keep certain areas reserved, you must actively keep out intruders. Otherwise there is a risk that reserved bits can’t be repurposed in the future because changing their semantics would break too much existing software.

In late 1984, IBM XENIX 1.0 for the PC/AT was released. It contained the original Microsoft implementation which put data in a reserved descriptor word. Since there were no 386s available yet and IBM XENIX 1.0 was tied to specific PC/AT models anyway, this did not pose practical problems for IBM’s customers.

In October 1985, Intel released the 80386. The 386 was a somewhat sloppy design that allowed software to set certain bits documented as reserved. The 386 programming documentation explicitly listed the use of the last 16 bits of a descriptor table entry as a visible behavioral difference between the 286 and 386.

In April 1989, Intel announced the i486 processor. The i486 was essentially identical to the 386 in its treatment of reserved bits. It is likely that Intel wanted to make minimal changes compared to the 386, possibly in part because they could not be certain who might be incorrectly setting reserved bits.

No later than Summer 1990, Intel completed the P5 (later known as Pentium) external design specification. The Pentium design was significantly different from the 386/486; it took the 286 XENIX lesson to heart and actively prevented programmers from setting reserved bits in control registers or page tables. This was done by raising an exception when an attempt to set reserved bits was made. The strategy was adopted by follow-up CPUs, which raise exceptions whenever software tries to set a reserved bit in control registers, page tables, MSRs, etc. This approach makes controlled expansion of the architecture viable because existing software does not suddenly produce completely unexpected results on newer CPUs with additional control bits. The P5 documentation explicitly listed exceptions caused by setting reserved bits as a difference from the i486.
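To see why this works, here is a small software model of the policy (a sketch of my own, not actual CPU behavior; the reserved-bit mask is invented and does not correspond to any real control register):

```c
#include <stdint.h>

/* Software model of the P5-style policy: any attempt to set a bit the
 * architecture reserves is rejected outright. The mask is hypothetical,
 * not a real control register definition. */
#define CTL_RESERVED_MASK 0xFFFFF000u  /* bits 12-31 reserved (made up) */

/* Returns 0 on success; -1 models the CPU raising a #GP fault. */
static int write_control_reg(uint32_t *reg, uint32_t value)
{
    if (value & CTL_RESERVED_MASK)
        return -1;              /* reserved bit set: fault immediately */
    *reg = value;               /* only architecturally defined bits   */
    return 0;
}
```

Because software that sets a reserved bit faults immediately instead of silently running, a later CPU generation can assign meaning to that bit without breaking anything that previously worked.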

In early 2017, the story came out in an e-mail discussion with a former Intel validation engineer.