Thanks to the good folks at Vector Software, I was pointed to a conference recording on YouTube from the Google Test Automation Conference (GTAC) 2015 (YouTube video). The recording covers quite a few talks, but at around 4 hours 38 minutes in, Brian Gogan describes the testing used for the Chromecast product. This offers a very cool insight into how networked consumer systems are tested at Google. Brian labels the Chromecast an “Internet of Things” device*, and pitches his talk as being about IoT testing. While I might disagree with his definition of IoT, he is definitely right that the techniques presented are applicable to IoT systems, or at least to individual devices.

Most of the talk is really about the wide variety of physical test beds employed to test the WiFi connectivity of the Chromecast. It seems clear that WiFi is a real pain to get working reliably, and that we should be very thankful for the hard work that companies put into testing to make sure their WiFi-connected devices perform consistently. WiFi is a very contagious communications medium: it is really hard to isolate just one device, since WiFi is ubiquitous in any modern office. Thus, the test lab is built around various ways to isolate WiFi so that a device can be tested consistently.

Google has a few office-size isolation rooms where a person can work with a Chromecast and test equipment. An interesting point here was that just slowly opening the door lets in an increasing amount of outside WiFi noise, and they can see how the devices start to struggle to get a good connection as the noise increases! That might explain why WiFi at home sometimes seems worse than it should be – we have so many things talking WiFi now that it is a miracle that not everything jams everything else.

Another tool is the attenuation box, in which a single device can be placed to isolate it from the outside world, along with an antenna to feed in selected WiFi signals.

They even have a special two-unit communications test box from Spirent, the OctoBox, which was really cool. Here, you can have two WiFi devices communicate with each other while you control the signal strength and induce network errors, all in a completely controlled fashion. Incredibly cool stuff, but for obvious reasons there are only a few of those around.
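The kind of controlled-degradation testing such a box enables can be sketched roughly as follows. This is a minimal illustration only: `Attenuator` and `link_ok` are hypothetical stand-ins I made up, not Spirent's actual API, and the pass/fail model is a dummy threshold rather than a real streaming check.

```python
class Attenuator:
    """Hypothetical stand-in for a programmable RF attenuator (not a real API)."""
    def __init__(self):
        self.db = 0

    def set_attenuation(self, db):
        # A real driver would command the hardware here.
        self.db = db

def link_ok(attenuation_db, threshold_db=60):
    # Dummy stand-in for an actual end-to-end streaming check on the device:
    # pretend the link works until attenuation reaches the threshold.
    return attenuation_db < threshold_db

def find_failure_point(attenuator, max_db=90, step_db=5):
    """Sweep attenuation upward and report where the link first breaks."""
    for db in range(0, max_db + 1, step_db):
        attenuator.set_attenuation(db)
        if not link_ok(db):
            return db  # first attenuation level where the check failed
    return None  # link survived the whole sweep

print(find_failure_point(Attenuator()))  # prints 60 with the dummy model above
```

The point of the controlled box is that such a sweep is repeatable: the same attenuation schedule produces the same radio conditions every run, which is impossible with ambient office WiFi.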

Basic high-volume testing of the real hardware system is done by replacing the WiFi with wired Ethernet. Once that is achieved, it is possible to build large farms of devices that can be automated. With network input controlled, you also need to control the HDMI output (many tests require that something is connected that will “display” output to make the device happy). Finally, the test benches use some kind of special serial connection for debug, to monitor and provision the devices. The test lab contains hundreds of devices, from each generation and variant of the Chromecast hardware. The Google people are lucky in the sense that their hardware is comparatively cheap and built by a well-funded company. Thus, they can afford to have hundreds of units available in the test lab. Not everyone has that luxury.

To manage access to all those testing resources, an allocation and scheduling system is obviously needed. Gogan spent quite a bit of time on how lab allocation works – rather than allocating individual devices, you ask for an instance of a particular class of device. The test system can then find a free device that is also in working order for you. In more primitive systems, where you call up a specific device, you have to deal with the case where that device is busy, has crashed, or is in a state you don’t want it to be in. By leveraging the sheer volume of devices, the Google test team can provide a much better tester experience most of the time.
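The class-based allocation idea can be sketched in a few lines. This is my own minimal illustration, assuming hypothetical `Device` and `DevicePool` names – the real Google system is of course far more elaborate, with scheduling, queuing, and health monitoring.

```python
import threading

class Device:
    def __init__(self, name, device_class):
        self.name = name
        self.device_class = device_class  # e.g. "chromecast-gen2"
        self.healthy = True
        self.in_use = False

class DevicePool:
    """Hand out any free, healthy device of a requested class,
    rather than letting testers grab specific units by name."""
    def __init__(self, devices):
        self._devices = devices
        self._lock = threading.Lock()

    def allocate(self, device_class):
        with self._lock:
            for d in self._devices:
                if d.device_class == device_class and d.healthy and not d.in_use:
                    d.in_use = True
                    return d
        raise RuntimeError(f"no free {device_class} device available")

    def release(self, device, still_healthy=True):
        with self._lock:
            device.in_use = False
            device.healthy = still_healthy  # flaky units drop out of rotation

pool = DevicePool([Device("cc-01", "chromecast-gen1"),
                   Device("cc-02", "chromecast-gen2"),
                   Device("cc-03", "chromecast-gen2")])
dev = pool.allocate("chromecast-gen2")   # any free gen-2 unit will do
pool.release(dev, still_healthy=False)   # a crashed unit leaves the pool
```

The key property is that the tester never has to care which physical unit answered the request – a busy or broken device is simply skipped, which is exactly what the sheer volume of devices in the lab makes possible.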

Given my interest (Wind River blog posts: 1, 2) in testing IoT systems using simulation, I was a bit disappointed that Gogan did not say much about what they do with emulation in their testing. What he did say was highly relevant, though. They use emulation to get stable devices to test on, avoiding hardware flakiness, and to create repeatably flaky networks for testing software robustness. The tests are run in the Google cloud, since it is just software – presumably allowing dynamic scale-up and scale-down of tests as needed (which is a core benefit of cloud systems, as I have written about on the Wind River blog). It is worth noting that the emulated devices are also very useful for internal collaboration at Google – which is also a key cloud value. I guess it means that people at other Google services can use emulated devices to test how their services look on a Chromecast right now, without having to have actual Chromecast hardware physically available. This would also have been interesting to hear more about!

It would have been rather interesting to learn more about the level of emulation used at Google – is this just running on an Intel host with code compiled for the host, or running an ARM emulator like the one Google provides for Android? The Chromecast hardware is definitely ARM-based (a Marvell Armada 1500 Mini for gen 1 and an Armada 1500 Mini Plus for gen 2, based on the Cortex-A9 and Cortex-A7, respectively). If it is ARM emulation, how far is the hardware simulated? Just a processor plus dummy devices, or do they actually model the entire Marvell chip?

Overall, a very interesting presentation (among many good talks at GTAC – it is worth reviewing the event program and video to see if anything else catches your eye).

* About IoT and IIoT

Call me a curmudgeon, but to me Internet of Things, IoT, is supposed to be about invisible sensors and embedded systems. To me, it has always been about sensor networks, ever since I first came upon the idea in an EU research project (RUNES) back in 2005. It is not supposed to be about video players and smartphones. But it does seem that the IoT term is drifting quickly into the consumer goods area. And as a result, “serious” business is starting to brand itself Industrial IoT, or IIoT. This was clearly noticeable at the Embedded Conference Scandinavia in late 2015 – many people and product presentations used IIoT to emphasize that they work in industrial systems.

So in this sense, I guess Brian is right. But I reserve the right to be annoyed.