Automated testing is a key component of LinkedIn’s 3x3 strategy for releasing mobile applications. As we developed the new LinkedIn Android app, we found that our tests had a major problem: our testing environment was unreliable, so our tests failed intermittently. We needed a solution that would let us rely on our tests to tell us when there was a problem with the app, not with the testing environment. For this reason, we created and open sourced Test Butler, a reliable Android testing tool. LinkedIn runs over one million tests each day using Test Butler, and we believe it can benefit anyone running Android tests.

Android testing

UI testing on Android can be unstable for a variety of reasons: tests may depend on being run in a certain order, or on shared application state that isn’t reset between tests. These types of flakiness are largely up to app developers to address, but the more insidious issues can occur within the Android device itself.

If you’ve ever tried to run a large number of UI tests on Android, you may be familiar with some of the ways the emulator can be unreliable. Animations must be disabled so that Espresso tests can run reliably. Tests may fail because the emulator CPU randomly goes to sleep, WiFi turns off unexpectedly, or rogue accelerometer data causes the device to change orientation. If a system app crashes on the emulator in the background, the resulting crash or app-not-responding dialog will cause Espresso UI tests to fail. At LinkedIn, we’ve even seen cases where the lock screen on the emulator is randomly triggered, causing tests to fail. This inconsistent behavior was causing our developers to lose trust in our tests and question why we were even writing them in the first place.
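To give a concrete sense of the kind of manual device setup involved: without a tool to manage it, developers typically disable animations by hand on each emulator using the standard Android `settings` commands over `adb`. This is a sketch of that common workaround (it assumes a single connected emulator and requires no special permissions on stock emulator images), not a description of how Test Butler itself works:

```shell
# Disable the three system animation scales that interfere with
# Espresso's idle detection. Each must be set to 0 individually.
adb shell settings put global window_animation_scale 0
adb shell settings put global transition_animation_scale 0
adb shell settings put global animator_duration_scale 0

# Verify the change took effect (should print 0 for each setting).
adb shell settings get global window_animation_scale
```

These settings are per-device and do not survive wiping the emulator's data, so a fresh emulator needs them reapplied before every test run; automating steps like this across a fleet of emulators is part of what motivates a dedicated tool.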