By Mikhail Nakhimovich

(This is a guest post by Mikhail Nakhimovich. By day Mikhail Nakhimovich is an architect of the award winning New York Times Android App and by night he writes about Android and helps startups create performant, delightful apps with his team at Friendly Robot.)

It’s easy to make a tiny app fast. It’s more challenging when you need to deal with the complexities required by most apps: data caching, JSON parsing, immutability, dependency injection, and a slew of support libraries.

It’s still possible to minimize startup time, even when you’re following modern development practices such as functional reactive programming with RxJava, immutable value objects, Guava collections, Optionals, and dependency injection with Dagger. The key is to eliminate blocking code, reflection, and heavy resource loading. If we can eliminate these pain points, we should be able to make a normal-sized app start up almost as fast as a “hello world” example.

In my experience, it just takes a few calls to a library or dependency that isn’t optimized specifically for Android to significantly slow down your app. While this sounds dreary, this is actually great news! It means you don’t need to be a 10x rockstar ninja to make your apps as fast as Google’s. By following a few best practices, especially when choosing external dependencies, you can turn a sluggish app into one that’s well-tuned and highly performant.

In this post, we’ll walk through the development of an app that displays 200kb of data in a RecyclerView, works offline, and approaches the dex method limit while still starting up in a paltry 0.8 seconds on a Nexus 5.

We’ll start with a fork of the excellent Android Boilerplate project created by Ribot. Next, we’ll add a few more dependencies (Guava, Immutables, SqlDelight and a few support libs) to beef up the method count.

Here’s what our app is going to do on first launch after install:

- Download 200kb of JSON from reddit.com
- Cache the data at the network layer
- Cache the data in memory
- Save the data to SQLite
- Load the data from SQLite
- Display the data in a RecyclerView using the MVP pattern

On subsequent launches it will:

- Load the data from SQLite
- Display the data in a RecyclerView using the MVP pattern

The caching above is a bit redundant, since it caches at multiple layers, but it serves to demonstrate the various interactions you need to perform when your app first starts.

First, we need to optimize JSON marshaling and unmarshaling. Our parser of choice is Gson, due to its small library size and fast instantiation. By default, Gson relies on reflection when marshaling or unmarshaling JSON. We can remove that reflection by setting type adapters on Gson. Because we are using Immutables, we’ll use the auto-generated type adapters, which are created at compile time.

```java
new GsonBuilder()
    .serializeNulls()
    .registerTypeAdapterFactory(new GsonAdaptersEntities())
    .create();
```
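For context, here is a minimal sketch of what such an Immutables model might look like. The type and its fields are hypothetical, but the pattern is real: annotating a type with `@Gson.TypeAdapters` makes the annotation processor emit a `GsonAdapters*` factory (named after the source file, e.g. `GsonAdaptersEntities` for `Entities.java`) at compile time:

```java
import org.immutables.gson.Gson;
import org.immutables.value.Value;

// Hypothetical model. The Immutables annotation processor generates
// an immutable implementation plus a reflection-free type adapter
// factory for every @Gson.TypeAdapters-annotated type in this file.
@Gson.TypeAdapters
@Value.Immutable
public interface Entities {
    String id();
    String title();
}
```

Because the adapters are generated as plain Java at compile time, parsing skips the field-by-field reflection Gson would otherwise perform at runtime.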

Once our type adapters are set, we simply register the Gson instance with Retrofit.

```java
new Retrofit.Builder()
    .baseUrl(changeableBaseUrl)
    .client(okHttpClient)
    .addConverterFactory(GsonConverterFactory.create(gson))
    .addCallAdapterFactory(RxJavaCallAdapterFactory.create())
    .build()
    .create(Api.class);
```

We are also using RxJavaCallAdapterFactory so that network responses come back as Observables, which makes async processing straightforward. With our custom Gson instance registered, Retrofit can hit our data endpoint and inflate our Java models with zero reflection. Here’s an example call to https://www.reddit.com/r/cheetahPics.json, which returns a ~200kb JSON file. As you can see in the trace of the network call, the total time for the network call is 91ms, with only 10ms of overhead for the parser (follow the RxCachedThreadScheduler-1 flame).
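The `Api` interface handed to `create(Api.class)` isn’t shown in the post; a hedged sketch of what such a Retrofit interface could look like, with a hypothetical endpoint and model name:

```java
import retrofit2.http.GET;
import rx.Observable;

// Hypothetical Retrofit interface. Because RxJavaCallAdapterFactory
// is registered on the Retrofit instance, endpoints can return
// Observables directly instead of Call objects.
public interface Api {
    @GET("r/cheetahPics.json")
    Observable<Entities> fetchPosts();
}
```

Retrofit generates the implementation at `create()` time, so the response bytes flow straight through the Gson converter into our models.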

The last thing we are using above is an instance of OkHttp with a custom interceptor. This lets us cache at the network layer at close to zero cost. Thanks to the brilliant minds at Square and a design pattern called a tee, Retrofit is able to cache and unmarshal byte segments concurrently as they come over the network. We use a custom interceptor whenever we’re working with an API that lacks proper Cache-Control headers, which lets us spoof the headers client side.
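The tee pattern itself is easy to illustrate in plain Java: a stream wrapper that copies every byte into a cache sink as a consumer reads it, so caching and parsing share a single pass over the network bytes. This is a simplified stdlib sketch of the idea, not Okio’s actual implementation:

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Copies bytes into `cache` as they are read, so a parser consuming
// this stream populates the cache in the same pass over the data.
class TeeInputStream extends FilterInputStream {
    private final OutputStream cache;

    TeeInputStream(InputStream in, OutputStream cache) {
        super(in);
        this.cache = cache;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) cache.write(b);
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) cache.write(buf, off, n);
        return n;
    }
}
```

The consumer never waits for a separate “write to cache” step, which is why the cache comes at close to zero cost.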

Finally, we’ll also use a middle tier we love at Friendly Robot, individualized data stores that will cache the data once more in a Guava cache (memory cache).
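A sketch of what such a store might look like. The store class and its method names are hypothetical, but the memory layer uses Guava’s real `CacheBuilder` fluent API (`maximumSize`, `expireAfterWrite`):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import rx.Observable;

import java.util.concurrent.TimeUnit;

// Hypothetical store: serves from the in-memory Guava cache when it
// can, falling back to a slower source (disk or network) on a miss
// and warming the cache with whatever that source emits.
public class PostStore {
    private final Cache<String, String> memory = CacheBuilder.newBuilder()
            .maximumSize(100)
            .expireAfterWrite(10, TimeUnit.MINUTES)
            .build();

    public Observable<String> get(String key, Observable<String> fallback) {
        String cached = memory.getIfPresent(key);
        if (cached != null) {
            return Observable.just(cached);
        }
        return fallback.doOnNext(value -> memory.put(key, value));
    }
}
```

Keeping one small store per data type (rather than one giant cache) is what makes the tier “individualized.”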

While caching the entire response in memory and on disk is great for app restarts, there are times when we need to load only a very small subset of our data on first load. For example, if we have a blogging app, we might want to load a post count before inflating all of the post data. The next step in our data flow will be to save to a database (SQLite) to make it easier to query our data prior to hitting the network.

We’re going to use another Square library, SqlDelight, which generates SQL mappers and marshalers to keep us one abstraction above cursors. I’d recommend SqlDelight over an ORM for two reasons. First, most ORMs don’t play nicely with Immutables or AutoValue objects. Second, ORMs usually rely on some sort of runtime reflection. Avoiding these pitfalls makes the cost of saving to a database with SqlDelight as small as if we were using no library at all.

After our data is saved, all future restarts of the app can work with database data (exposed reactively with SQLBrite). This makes app restarts even faster and more delightful to the end user.
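Exposing the database reactively with SqlBrite looks roughly like this. The table name, query, and `Post` mapper are hypothetical, but `wrapDatabaseHelper` and `createQuery` are the library’s actual entry points (1.x-era API):

```java
// Hypothetical usage sketch of SqlBrite.
SqlBrite sqlBrite = SqlBrite.create();
BriteDatabase db = sqlBrite.wrapDatabaseHelper(openHelper, Schedulers.io());

// Emits the current rows immediately, then re-emits whenever the
// "posts" table changes, keeping the UI in sync with the database.
Observable<List<Post>> posts = db
        .createQuery("posts", "SELECT * FROM posts")
        .mapToList(cursor -> Post.fromCursor(cursor));
```

Because the query is an Observable, the screen simply resubscribes after a restart and receives the cached rows without touching the network.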

Finally, we will use an MVP structure to display the downloaded data within a RecyclerView. Since the screen loads its data from the database only, we have no trouble repopulating a stateless UI on rotation.
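A minimal, plain-Java sketch of that MVP shape (the names are hypothetical): the presenter owns the data flow and pushes results into a passive view interface, which is what makes the UI stateless and trivial to repopulate on rotation:

```java
import java.util.List;
import java.util.function.Supplier;

// Passive view: the Activity/Fragment implements this and simply
// renders whatever the presenter hands it.
interface PostsView {
    void showPosts(List<String> posts);
}

// Presenter: pulls from a data source (here a Supplier standing in
// for the database layer) and pushes the result into the view.
class PostsPresenter {
    private final Supplier<List<String>> dataSource;
    private PostsView view;

    PostsPresenter(Supplier<List<String>> dataSource) {
        this.dataSource = dataSource;
    }

    void attach(PostsView view) {
        this.view = view;
        // Repopulate on every attach, e.g. after a rotation.
        view.showPosts(dataSource.get());
    }

    void detach() {
        this.view = null;
    }
}
```

Since the view holds no state of its own, rotation just means detaching the old view and attaching the new one.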

After hooking everything up, we have achieved our goal: an app architecture that goes from home screen to fully loaded data in 0.8 seconds. See the full call stack.

Here are stats from one more version that forces the data call on every startup. This is closer to what you would see on a fresh install; on subsequent cold starts, the app normally only needs to go to the database. View the full call stack.

It seems we lose about 150ms of startup time by firing up a network client.

We hope that the included sample project will help the community when deciding which libraries are Nimble Approved. View the sample repo.