July 29, 2016 Maxime Sraïki 5 min read

Web Cache?

A few days ago, Nicolas Trinquier wrote a cool article about how to improve your webapp's performance using browser cache with Nginx.

Reading it will teach you that browsers can store content received from the server in order to avoid downloading it again.

You can actually use cache at different levels of your application’s infrastructure.

A web cache stands between your users' browsers and your server. It caches server responses in its memory so that if another client asks for the same content, the request will not go to the server but will be served by the web cache immediately. This can save the server from having to execute its most intensive tasks.

Nginx?

Usually, Nginx is used as a reverse proxy/load balancer for apps.

As it stands between client and server and sees all of your users' requests, it's perfectly shaped to serve as a web cache!

Looks great! How?

To use Nginx as a web cache, we need to use some of its directives:

[proxy_cache_path](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_path) :

Here we define the path where Nginx will store cached content and the amount of memory you want to allow for it.

There are two mandatory parameters, path and keys_zone:

path defines where Nginx is going to store cached content on disk; typically you would use something like /data/nginx/cache. Cached content is first written to a temporary directory (by default the one set by proxy_temp_path) unless you set use_temp_path=off, in which case files are written directly to the cache path.

keys_zone defines the shared memory zone where the keys leading to this content will be stored; name it as you wish and size it for your needs (this concerns only the keys: according to the documentation, 1 MB holds about 8,000 keys).

A word about the inactive parameter: it lets Nginx remove files from the cache that have not been requested during the specified duration. If not explicitly set, this duration defaults to 10 minutes. This means Nginx's cache gets rid of unused content to leave room for the most requested items, which is exactly what we are looking for, isn't it?
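Putting these parameters together, a declaration might look like the following sketch (the path, zone name, and sizes are illustrative values, not requirements):

```nginx
# Store cached files under /data/nginx/cache, keep keys in a 10 MB
# shared memory zone named "my_zone" (roughly 80,000 keys), evict
# entries not requested for 60 minutes, cap the cache at 1 GB on
# disk, and write files directly to the cache path instead of going
# through a temporary directory first.
proxy_cache_path /data/nginx/cache keys_zone=my_zone:10m
                 inactive=60m max_size=1g use_temp_path=off;
```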

[proxy_cache](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache) :

This directive stands in a location block and links it with the cache zone of your choice (it references the keys_zone parameter of the proxy_cache_path directive).

A simple working example, where we also add a header called 'Web-Cache-Status' to all of our responses; Nginx will set it to HIT if the content comes from the cache or MISS if it doesn't:

```nginx
proxy_cache_path /data/nginx/cache keys_zone=my_zone:10m inactive=60m;

server {
    listen 80 default_server;
    root /var/www/;
    index index.html index.htm;
    server_name example.com www.example.com;
    charset utf-8;

    location / {
        proxy_cache my_zone;
        add_header Web-Cache-Status $upstream_cache_status;
        include proxy_params;
        proxy_pass urlOfProxiedServer;
    }
}
```

Going a little bit further

Stale can still be good!

You can allow Nginx to keep serving cached content that has expired, as long as it hasn't been modified on the server side. When it finds expired content, Nginx will ask the server whether it has changed since it was cached (a conditional GET) instead of downloading it again; if the server answers that it hasn't, the cached copy is revalidated without transferring the body.

To do so, add [proxy_cache_revalidate](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_revalidate) on; to the 'server' section of your conf.
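In context, the directive sits alongside the rest of the caching configuration; here is a sketch based on the earlier example (http://backend stands in for your own upstream address):

```nginx
server {
    listen 80 default_server;
    server_name example.com;

    location / {
        proxy_cache my_zone;
        # Refresh expired entries with a conditional GET instead of
        # a full download; a "not modified" answer from the server
        # revalidates the cached copy without resending the body.
        proxy_cache_revalidate on;
        proxy_pass http://backend;
    }
}
```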

Once for all

If multiple users request the same content at the same time, Nginx will download it from the server once per request until a copy lands in the cache.

You can actually tell Nginx to download the content only once and to wait for it to be downloaded before serving it to all the users.

Just add [proxy_cache_lock](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_lock) on; to the correct 'location' section of your conf!
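A sketch of what that looks like, together with the companion proxy_cache_lock_timeout directive so waiting requests don't block forever (zone name and upstream are illustrative):

```nginx
location / {
    proxy_cache my_zone;
    # Only one request at a time is allowed to populate a given
    # cache entry; the others wait for it to finish...
    proxy_cache_lock on;
    # ...but not forever: after 5 seconds, waiting requests are
    # passed to the server directly (their responses are not cached).
    proxy_cache_lock_timeout 5s;
    proxy_pass http://backend;
}
```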

Bypassing web cache

Most of the time, you need to cache stuff on your client’s browser AND use a web cache.

If your browser asks for content that has just expired from its own cache, your web cache might simply send its copy back without asking the server for the freshest version.

Don’t worry, Nginx lets you set conditions in which your requests will bypass the web cache :)

[proxy_cache_bypass](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_bypass) :

Lets you set the conditions under which content will not be TAKEN FROM the web cache.

[proxy_no_cache](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_no_cache) :

Lets you set the conditions under which content will not be STORED TO the web cache.

For example:

```nginx
location / {
    proxy_cache_bypass $cookie_nocache $arg_nocache;
}
```

Here we're asking Nginx to go straight to the fresh content for requests carrying a nocache cookie or query argument.

Replace proxy_cache_bypass with proxy_no_cache in order to prevent Nginx from even storing the response in the cache.
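The two directives are often combined, so that a "nocache" request both skips the cache lookup and keeps its response out of the cache; a sketch (the cookie and argument names are illustrative, as is the upstream):

```nginx
location / {
    proxy_cache my_zone;
    # Skip the cache lookup when a "nocache" cookie or query
    # argument is present on the request...
    proxy_cache_bypass $cookie_nocache $arg_nocache;
    # ...and don't store the resulting response in the cache either.
    proxy_no_cache $cookie_nocache $arg_nocache;
    proxy_pass http://backend;
}
```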

Stale is better than nothing

You can tell Nginx to deliver stale content when it is not able to get a fresh version from the server.

That will prevent your website from showing server errors to users.

Below is a small working example using the [proxy_cache_use_stale](http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_use_stale) directive, allowing Nginx to show stale content when it gets a timeout or a 5XX error from your server!

```nginx
location / {
    proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
}
```

Here we also add an optional parameter called updating!

Doing so will let Nginx deliver stale content while it is downloading a fresh version of the file from your server!
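Combined with proxy_cache_lock from earlier, this gives a location block that keeps serving users during backend refreshes and outages while letting only one request at a time hit the server; a sketch reusing the my_zone zone and a placeholder upstream:

```nginx
location / {
    proxy_cache my_zone;
    # Let a single request populate each cache entry.
    proxy_cache_lock on;
    # Serve stale content on connection errors, timeouts, and 5XX
    # responses from the server, and also while a fresh copy of the
    # content is being downloaded ("updating").
    proxy_cache_use_stale error timeout updating
                          http_500 http_502 http_503 http_504;
    proxy_pass http://backend;
}
```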

Conclusion

These few steps help you speed up your web app really easily and are, in my humble opinion, not yet second nature among developers!

Don’t forget it next time you’re working on a project that’s likely to get big!

You can think of various solutions for your web caching, like using Varnish for instance, but it seems to me that Nginx really is one of the best candidates for you!

It is indeed very easy to configure, pretty efficient and can still handle all its other functions, like being a web server, while managing the web cache in a great way!