Similar changes have happened to other kinds of embedded systems. An IC that integrates multiple such features is called an SoC (System on Chip). Common examples are the nRF51 (BLE and MCU on a single core), the CC2640 (BLE with a separate MCU core), and the CC3200 (Wi-Fi with a separate MCU core).

This kind of consolidation results in two things:

Less hardware for embedded systems is developed in-house

Software and overall system complexity are greatly reduced

Performance

Embedded systems, being a part of the whole hardware industry, have enjoyed a steady growth in silicon performance. Proliferation of 32-bit microcontrollers was a part of that trend. A less-noticeable, but arguably more important part of that has been the switch from the 8051 core to ARM Cortex-M.

8051

The 8051, designed by Intel in the 1980s, has been a popular choice for microcontroller manufacturers, who used it as a basis for their own optimized variants. As a result, the industry had a wealth of 8051-flavored cores and no standardization across manufacturers.

Upgrading to a modern core design brought higher performance, which made it possible to do work that would previously have required a fully-fledged microprocessor, enabling a whole new range of applications. It also increased efficiency and thus lowered power consumption. The modern design brought other, less obvious advantages as well: better sleep modes, simpler memory interfacing, and overall ease of development.

Cortex-M

Cortex-M wasn’t the first 32-bit core on the embedded market, but over the years it has become an increasingly popular choice for microcontroller manufacturers, most of whom now offer at least one product family based on it, alongside older 8-bit and proprietary 32-bit lines.

The consolidation of the industry around Cortex-M is important for several reasons:

Silicon design reuse lowers MCU costs

Ability to use a wider range of software toolchains

Software Development Tools

Toolchains

As mentioned previously, for a long time most embedded systems were based on 8-bit microcontrollers with proprietary cores, many of which only supported the IAR or Keil toolchains or, at worst, only those supplied by the manufacturer. And since manufacturers were (and still are) hardware companies at their core, the quality of those tools was below the level accepted in the software world: they were closed-source, Windows-centric, heavily GUI-based, and very expensive (over $5k for a yearly license).

All of this changed with the arrival of ARM. While ARM does provide its own toolchain for the Cortex-M series, there are plenty of free options, the most popular of which are GCC-based.

Beyond Toolchains

Perhaps the most noticeable thing to happen to the industry is Arduino, which has served both as an entry point for people new to the field and as a useful prototyping tool.

Starting out as a toolchain (avr-gcc plus an IDE) for an 8-bit AVR and a basic development board, with a mission to enable “non-engineers to create digital projects”, it has grown into a wide ecosystem, inspired many similar ones (compatible or not), and spawned a whole industry around hobbyist electronics.

LLVM

Orthogonal to the ARM trend, another technology with the potential to bring modern software development practices to the embedded world is LLVM.

LLVM is a compiler infrastructure agnostic to the choice of language and target. At a high level, LLVM and a typical C toolchain follow the same compilation pipeline:

Source Code → Internal Representation → Binary

The difference lies in the choice of intermediate representation: GCC’s internal representations are closely tied to its front-ends and back-ends, while LLVM uses a well-defined, target-independent language (LLVM IR). This is a huge deal, because it makes the toolchain much more modular and allows different front-ends and back-ends to be combined (at least in theory).
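As a small illustration of the pipeline above (a sketch assuming the clang and llc tools from an LLVM installation are available; exact flags may vary between LLVM versions), a trivial C function can first be compiled to target-independent LLVM IR and then lowered to assembly for any supported back-end:

```c
/* add.c -- a minimal function for inspecting the LLVM pipeline.
 *
 * Front-end step: C source -> target-independent LLVM IR
 *   clang -S -emit-llvm add.c -o add.ll
 *
 * Back-end step: the same IR -> assembly for a chosen target,
 * e.g. a Cortex-M class core:
 *   llc -mtriple=thumbv7m-none-eabi add.ll -o add.s
 */
int add(int a, int b) {
    return a + b;
}
```

The intermediate add.ll file contains LLVM IR rather than machine instructions, which is exactly why the same front-end output can be fed to different back-ends.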