It’s easy to begin engineering a single-page app in a manner similar to more common web apps built with Django or a PHP framework: you plan it out one page at a time. And, in large part, this works. But there comes a time when pages share common data that needs to be loaded from the server. With a Django-driven app, the cost of re-retrieving that data is trivial compared to making an HTTP round-trip from the single-page app to the server. It makes sense to hold this information in JavaScript for later use. So how exactly does one go about doing this?

It’s tempting to just hold the data somewhere that’s easy to remember. Perhaps the root scope? It’s available everywhere, but that’s part of the problem – it can be treated like a global data store, and nothing in particular owns it. As the AngularJS docs put it, “$rootScope exists, but it can be used for evil.” A better solution would be to construct a service that handles the request to the server, and then inject that service wherever it is needed. Let’s take a look at how that might work.
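The demo code itself isn’t reproduced here, but a rough sketch of the pattern looks like the following. The names (`DataService`, `getData`, `/api/data`) are my own, and a plain stand-in replaces `$http` so the sketch runs outside Angular; in the real app, the factory would be registered with `angular.module(...).factory(...)` and injected into each controller.

```javascript
// Stand-in for $http.get so the sketch runs standalone; it counts how
// many requests actually reach the "server".
let serverRequests = 0;
const $http = {
  get(url) {
    serverRequests += 1;
    return Promise.resolve({ data: { greeting: 'hello from ' + url } });
  },
};

// The service: caches the *data* once a request completes. In Angular
// this factory would be a singleton shared by every controller.
function DataService() {
  let cache = null;
  return {
    getData() {
      if (cache !== null) {
        return Promise.resolve(cache); // cache hit: no server trip
      }
      return $http.get('/api/data').then((response) => {
        cache = response.data; // cached only after the request completes
        return cache;
      });
    },
  };
}

// Two "controllers" asking for the data at instantiation time.
const service = DataService();
Promise.all([service.getData(), service.getData()]).then(() => {
  console.log('server requests:', serverRequests);
});
```

Repeated calls to `getData()` after the first response arrives are served from the cache, so the request count stops growing.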

The example above is composed of a service that stores data from the server and two controllers that use that data. On the demo page, you’ll see one button that clears the data from the controller and another that requests the cached data from the service again. A counter in the demo tracks how many times the service is called versus how many requests it makes to the server. After the demo loads, you can click the button to load the data from the service as many times as you want, and the number of requests to the server does not increase.

There’s just one problem with this solution: the number of requests to the server starts out at 2. As I’ve already mentioned, there are two controllers that require the same information from the server. Shouldn’t the cache we’ve set up handle this? No. The data is only cached when the request to the server completes. Since both controllers are instantiated at the same time, the service has not yet cached anything, and each controller concludes that the data must be requested. Even though later calls to the caching service avoid further calls to the server, redundant work has already been done. So how do we solve this new problem? Let’s cache the promise itself, instead of the data.
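Sketching the promise-caching version with the same hypothetical names and the same stand-in for `$http`, the only change is what the service holds on to:

```javascript
// Stand-in for $http.get, counting requests that reach the "server".
let serverRequests = 0;
const $http = {
  get(url) {
    serverRequests += 1;
    return Promise.resolve({ data: { greeting: 'hello from ' + url } });
  },
};

// The service: caches the *promise*, not the data.
function DataService() {
  let promise = null;
  return {
    getData() {
      if (promise === null) {
        // The first caller creates the request; everyone else shares it,
        // even callers that arrive before the server has responded.
        promise = $http.get('/api/data').then((response) => response.data);
      }
      return promise;
    },
  };
}

// Two "controllers" instantiated at the same time now share one request.
const service = DataService();
Promise.all([service.getData(), service.getData()]).then(() => {
  console.log('server requests:', serverRequests);
});
```

Because the promise is stored synchronously, before the response arrives, there is no window in which a second caller can trigger a duplicate request.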

This change is small but significant, and it highlights how useful promises can be. Promises are employed across AngularJS services, and can be leveraged by engineers as well through the $q service. You can read more about promises at this earlier blog post. By caching the promise returned when making a request to the server, we store the task itself, not just its result. Promises can be passed around even after the requested task has completed. This is what happens when the button is clicked after the initial request to the server: the promise is returned, AngularJS sees that it has already been resolved, and the function defined in the “then” callback is triggered immediately. This highlights another benefit of using promises: the data from the server is cached implicitly.
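This is standard promise behavior, easy to see with plain JavaScript promises ($q promises behave the same way, with callbacks delivered during Angular’s digest cycle):

```javascript
// A promise that has already settled still delivers its value to any
// .then callback attached later; the callback just runs on a later tick.
const cached = Promise.resolve({ greeting: 'hello' });

// ...later, e.g. when the demo's button is clicked again:
cached.then((data) => {
  console.log(data.greeting); // the cached value, with no new request
});
```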

So in the second example, you’ll see that the server is called only once, even with both controllers instantiated at the same time, because they share the same promise. This pattern requires engineers to think about storing and handling data a little differently than they might be used to, but it saves a lot of headache in design, refactoring, and troubleshooting. Promises are designed to handle the overhead of this sort of common task. Take advantage of them!