Microcontrollers will transform IoT, and our lives.

Ten years ago, Steve Jobs stood on stage with the iPhone and we got a glimpse of the future. A supercomputer in every pocket, everywhere in the world. Now there are almost as many smartphones as humans.¹

And yet, by the numbers, the smartphone pales in comparison to the hardware revolution that comes next: by 2025, there will be ten times more connected devices (excluding mobile phones) in the world than humans.

Technological progress is iterative, yet punctuated. Every so often, something big comes along and changes our relationship to the world: fire, metallurgy, the telegraph, the internet, and so on. Each had a profound effect on humankind’s relationship with the world and with each other. The smartphone changed just about every aspect of daily life; from realtime GPS directions to Instagram, it changed not just how we operate within our surroundings, but how we interact with each other.

And what’s coming next will have an unimaginably large effect upon our species.

The Connected Things/IoT Revolution

The mobile revolution gave us something beyond the phone experience: tiny, high-quality, commodity-cheap, low-energy, connected components, along with new processes and the industrial capacity to make lots of them. Every smartphone is a collection of interesting hardware, much of it designed specifically for the mobile use case; pulled out of that context, it can be used to build all sorts of interesting experiences beyond mobile phones.

Soon, it will be difficult to buy a new product without some kind of connected smarts. Controlling an appliance from a mobile phone will become a standard feature, and ubiquitous sensing and control will transform industrial automation, buildings, and our interaction with the world around us. This ubiquitous connected hardware transformation promises to be the biggest part of the computing revolution yet, with USD $1T/year being spent on IoT by 2020.³

And powering it, in large part, will be a new class of microcontrollers that confer a number of built-in advantages, including price, functionality/performance, and power efficiency.

The Microcontroller Renaissance

While mobile was shrinking everything and commoditizing hardware, something else was happening. Partly as a byproduct of those innovations, the microcontroller, a mainstay in simple embedded systems since the ’70s, was undergoing a major renaissance.

The microcontroller (often shortened to MCU) is a stand-alone mini computing chip that has everything it needs to run: typically flash memory for code/application storage and a small amount of RAM in which to execute.

Microcontrollers are built to control hardware.

Unlike other microprocessors, microcontrollers have their roots in embedded hardware, so they also have lots of General Purpose Input/Output (GPIO) pins with which to communicate with and control hardware peripherals. This includes digital I/O and associated protocols such as I2C, serial (UART), SPI, and CAN, as well as analog I/O for reading data from environmental sensors and other analog sources.
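Under the hood, driving GPIO mostly comes down to setting and clearing individual bits in memory-mapped registers. A minimal sketch of that bit logic, shown in Python with a plain integer standing in for the hardware register (real register addresses and pin mappings are chip-specific assumptions):

```python
# Bit-level GPIO logic, with an integer standing in for a
# memory-mapped output register (real addresses are chip-specific).
gpio_out = 0  # all pins start low

def set_pin(reg: int, pin: int) -> int:
    """Drive a pin high by setting its bit in the register value."""
    return reg | (1 << pin)

def clear_pin(reg: int, pin: int) -> int:
    """Drive a pin low by clearing its bit in the register value."""
    return reg & ~(1 << pin)

def read_pin(reg: int, pin: int) -> bool:
    """Read the current level of a pin from the register value."""
    return bool(reg & (1 << pin))

gpio_out = set_pin(gpio_out, 5)    # pin 5 high
gpio_out = set_pin(gpio_out, 2)    # pin 2 high
gpio_out = clear_pin(gpio_out, 5)  # pin 5 low again; only pin 2 remains set
```

On real silicon the same one-line bit operations write directly to a register address, which is a big part of why MCU hardware access is so fast.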

Many modern microcontrollers also have other interesting features such as hardware JPEG and cryptographic acceleration that extend their use cases into things like camera control or security products. Many even have display controllers and 2D graphics acceleration, allowing them to drive displays and accept touch screen input.

Also unlike general-purpose CPUs, microcontrollers are not designed to run heavyweight OSes such as Linux or Windows, which are built on a multi-user, multi-application, hardware-abstracted paradigm in which the OS is often the heaviest user of resources. Instead, microcontrollers are typically either OS-less, or they run a micro real-time operating system (µRTOS) in which there is just enough OS to support the application; the application is the main user of processing time and has full, direct access to the chip’s hardware. Direct hardware access also provides another advantage: real-time hardware communication. Unlike an application on a hardware-abstracted OS, one running on a µRTOS has fewer abstractions in the way and generally much lower access latency to connected hardware.
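The OS-less style mentioned above is often just a "superloop": the entire application is one loop that polls hardware and services tasks directly, with no scheduler underneath. A minimal sketch, with the hardware reads stubbed out (on a real MCU they would be register or bus accesses, and the loop would run forever):

```python
# Minimal OS-less "superloop" application structure.
# Hardware access is stubbed; on a real MCU these would be
# direct register reads/writes or I2C/SPI transactions.

def read_sensor() -> float:
    return 21.5  # stub: e.g. an ADC or I2C temperature read

def update_display(value: float) -> None:
    pass         # stub: push the value to an attached display

def main_loop(iterations: int) -> list:
    """Run the superloop a fixed number of times (forever on real hardware)."""
    readings = []
    for _ in range(iterations):
        value = read_sensor()    # 1. poll hardware directly
        readings.append(value)   # 2. do the application work
        update_display(value)    # 3. drive outputs, then loop again
    return readings
```

There is no scheduler, no context switching, and nothing between the application and the hardware, which is exactly where the low latency comes from.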

Microcontrollers are Already Everywhere

You may not know it, but if you’re reading this, you’re likely surrounded by microcontrollers. The average home in a developed nation has almost 10 times as many microcontrollers as CPUs, and if you’re sitting in a car, there are probably at least 30 around you.⁴

Nearly every modern appliance is powered by a microcontroller, but they’re also so cheap and simple to use that they’re in toys, remote controls, thermostats, and just about any other electronic gadget around.

In fact, microcontrollers are also what power the ubiquitous Arduino boards, which have become the go-to standard for tinkerers and hobbyists around the world.

And while microcontrollers have been embedded in everything from toys to cars for the better part of 40 years, it’s only in the last few years that microcontrollers have become truly useful for IoT; they’re not only powerful enough to run complex applications, but many have also gained built-in support for gateway connectivity such as Ethernet, WiFi, and BLE. Many have also opened up their memory buses to drive external flash and RAM, allowing them to be extended in interesting ways.

Microcontrollers are Replacing Legacy Technology

Microcontrollers also perform the functions of many legacy pieces of technology. Programmable Logic Controllers (PLCs), for instance, are being replaced en masse by microcontrollers, which can do everything a PLC can do while being cheaper and more flexible. Hardware Proportional-Integral-Derivative (PID) controllers were once a staple of appliances such as ovens, refrigerators, rice cookers, and HVAC systems for regulating things like temperature, and have now all but been replaced by MCUs in new designs.
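What a dedicated PID chip once did in hardware is now a few lines of code on an MCU. A minimal sketch of the standard discrete PID loop (the gains and the oven setpoint below are purely illustrative values, not tuned for any real appliance):

```python
class PID:
    """Minimal discrete PID controller (illustrative gains only)."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        """One control step: returns the drive level (e.g. heater power)."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. regulating an oven toward 180 degrees, sampling once per second
pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=180.0)
power = pid.update(measurement=20.0, dt=1.0)  # far below setpoint: large output
```

In an appliance, `update` would run on a timer and its output would drive a heater or compressor; the MCU replaces the whole dedicated controller board.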

Vs. Application Processors

Microcontrollers are often confused with another recently popularized class of chips known as application processors. Application processors came out of the mobile revolution as a lower-power but still high-performance alternative to desktop CPUs: enough horsepower to drive an entire mobile OS, while still running on a battery.

Application processors are often a System on a Chip (SoC), meaning that rather than being a single-core, single-purpose chip, they contain multiple cores with different functionality. In fact, many application processors are effectively nearly the entire motherboard of a low-end computer, shrunk down and put on a single, massive piece of silicon.

Single-Board Computers

Many common application processors are based on ARM’s Cortex-A core designs and are what power Single-Board Computers (SBCs) such as the Raspberry Pi, as well as many Android phones.

Because SoCs are tiny computers, they run, and in fact require, full operating systems such as Linux or Windows, and confer all the advantages and disadvantages of a regular computer.

And because hardware peripherals require the GPIO and protocol support found in MCUs, many SoCs embed one or more MCUs in their design. Microsoft’s Azure Sphere board, for instance, is a single-board computer built around the MediaTek MT3620 chip, which has two microcontroller cores inside it alongside its application processor!

All of that silicon and its supporting components require power, which brings up another thing: power efficiency.

Power Efficiency Means More than Battery Power

Microcontrollers, in comparison, treat low energy use as a primary design goal; it drives decisions in both the silicon and the software they run. Nearly all modern microcontroller architectures, and therefore the chips themselves, have power-saving features built in, such as advanced sleep modes. And while peripherals are usually the biggest consumers of power, a microcontroller itself might draw a milliwatt or less during operation⁵, compared to half a watt for the A53⁶ chip the Raspberry Pi uses. That means a Raspberry Pi can be expected to draw hundreds of times more power, even without peripherals, than a microcontroller.

And while the application processors on SBCs are trending toward higher power usage over time, microcontrollers are being tuned for ever-greater power efficiency. ST’s STM32F7, for instance, is roughly twice as powerful as its predecessor, the STM32F4, while using half the energy.

This power efficiency means that microcontrollers, even with sensors, can run for years on a small battery, or indefinitely by adding a small solar cell.
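The arithmetic behind "years on a small battery" is straightforward: a duty-cycled MCU spends almost all of its time in deep sleep, so the average current is dominated by the sleep draw. A back-of-the-envelope sketch using illustrative figures (sleep and active currents vary widely by chip, and in practice battery self-discharge caps the real-world lifetime):

```python
# Back-of-the-envelope battery life for a duty-cycled MCU sensor node.
# All figures below are illustrative assumptions, not a specific datasheet.
battery_mah = 2500.0       # e.g. a pair of AA cells
sleep_ua = 2.0             # deep-sleep draw, microamps
active_ma = 15.0           # draw while awake, measuring and transmitting
active_s_per_hour = 2.0    # awake for 2 seconds each hour

# Average current in milliamps over one hour of operation
avg_ma = (active_ma * active_s_per_hour
          + (sleep_ua / 1000.0) * (3600 - active_s_per_hour)) / 3600

hours = battery_mah / avg_ma
years = hours / (24 * 365)  # on paper, decades; self-discharge dominates first
```

The point of the sketch is that average draw lands around ten microamps, orders of magnitude below what a Linux-class board can idle at, which is what makes battery and solar deployments practical.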

And while not every IoT use case needs to run on a battery, the power efficiency of a microcontroller pays dividends in other ways.

Firstly, by using a fraction of the power that single-board computers do, microcontrollers have a significantly lower total cost of ownership; a saving that multiplies across installations of many devices. All that power saving also means that microcontrollers run much cooler than application processors, which not only reduces cooling complexity in designs but also makes them useful in places where cooling is difficult, such as in vacuum (like space).

That small power draw also means they can be powered by sources other than a wall plug, such as energy-harvesting technology. That means they can be placed virtually anywhere, vastly expanding their addressable use cases and delivering on the promise of putting IoT virtually everywhere.

Other Advantages over SBCs

There are very few IoT use cases where an application processor or SBC is actually necessary, largely because few tasks are genuinely constrained by the processing power of a microcontroller.

Machine vision, for example, runs remarkably well on microcontrollers. In Pete Warden’s fantastic post, Why the Future of Machine Learning is Tiny, he argues that microcontrollers are actually better suited to machine learning than the larger architectures found in SBCs. And he should know; he leads the TensorFlow team at Google that specializes in mobile and embedded machine learning applications.

User Interfaces (UIs) are another reason sometimes cited for using an application processor in IoT, but in truth, most modern microcontrollers have 2D graphics acceleration built in, and there are a number of microcontroller-targeted graphics libraries available.

One of the few places a single-board computer actually beats a microcontroller is in 3D graphics and HDMI output. But almost by definition, few IoT use cases require those kinds of kiosk features; when trying to draw a border around what IoT is, kiosks don’t readily come to mind. And with 2D graphics acceleration, native touch-screen input support, and broad support for commodity displays (as opposed to HDMI displays, which are likely computers themselves), microcontrollers really shine.

Cost — Their Killer Advantage

While the advantages of microcontrollers are many, there’s one standout feature that’s driving them to be the single most important part of the IoT revolution: price. 32-bit microcontrollers start at around $1 each at production quantities, and even a flagship microcontroller like the STM32F7 (ST’s top-end ARM chip) is under $10. And a new breed of microcontroller from Espressif Systems, the ESP32, has built-in WiFi and Bluetooth Low-Energy (BLE) and costs around $2.

In contrast, Cortex-A application processors generally run upwards of $25 each, and many cost more than $35 each!

And the price of an application processor is more than just the chip itself; application processors require far more external supporting components than their MCU counterparts. While the Bill of Materials (BOM) cost of the Raspberry Pi is proprietary, it’s well known that the board is sold at almost zero margin and that the main chip is subsidized and supplied at near cost; even so, the Raspberry Pi Compute Module (the embeddable version) costs around USD $45. Compare that to an embeddable ESP32-based board with integrated WiFi and BLE, which sells for around $9 and likely costs closer to $5 to make.

This means that you can buy or build 5 to 10 microcontroller-based IoT devices for the same price as one SoC device! And at the lower end of the microcontroller spectrum, the price break becomes even more apparent.
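Using the article’s ballpark prices, the per-budget device count works out simply (the budget figure below is an arbitrary example):

```python
# How many devices a fixed budget buys, using the article's ballpark prices.
budget = 450.0        # arbitrary example budget in USD
sbc_module = 45.0     # embeddable Raspberry Pi Compute Module, approx. price
esp32_board = 9.0     # embeddable ESP32 board with WiFi + BLE, approx. price

sbc_count = int(budget // sbc_module)    # SBC-class devices per budget
mcu_count = int(budget // esp32_board)   # MCU-class devices per budget
ratio = mcu_count / sbc_count            # nodes-per-dollar advantage
```

At these prices the ratio is 5x, and against a $1–2 bare MCU instead of a finished ESP32 board, the gap widens well past 10x.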

Primitive Developer Platforms — The One Disadvantage

The one major drawback of microcontrollers is that their developer platforms haven’t caught up to the hardware. Microcontroller development hasn’t changed much since the ’80s; most development is still done in low-level languages such as C/C++ or variants like Wiring (the Arduino IDE language). And while players like Particle have added over-the-air update support and LTE connectivity, their developer story hasn’t deviated much from Arduino’s.

Single-board computer platforms, in contrast, don’t have this limitation today; since they run full Linux variants or Windows IoT, developers can use most of their favorite tools for development.

This story is changing rapidly, however, with new platforms and tooling for microcontrollers on the horizon.

The Future is Tiny

The connected things revolution is on its way and will be powered by billions of tiny connected microcontrollers. They cover nearly every conceivable IoT use case, many of which can’t be practically supported by any other chip architecture, and they do so at a fraction of the total cost of ownership. And while microcontroller tooling may largely still be primitive, the field is changing fast in response to the growing demand for microcontroller solutions.