How to cache API requests with a service worker
If you want to learn everything you need to know about service workers, head over to serviceworkerbook.com and order my book, “Let’s Take This Offline”! Be sure to use cache-api-requests for 10% off!
See the source code for this post.
Lots of service worker posts (like this one I wrote for the David Walsh blog) show you enough of a service worker to get started. Usually, you’re caching files. This is a great start! It improves your app’s performance, and with 20% of Americans experiencing smartphone dependence, it’s a great way to make sure users can access your app – regardless of their connection or network speed.
But what about requests, and specifically GET requests? A service worker, along with the Cache Storage API, can also cache your GET requests to avoid unnecessary trips over the network. Let’s look at how we would do that.
Note: This post assumes a basic understanding of service workers. If you need a service worker explainer, check out this blog post.
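If you don’t have a service worker registered yet, registration from your page’s client code typically looks something like the sketch below. The file name sw.js is just an assumption for this example; use whatever your service worker file is called.

// Register the service worker from the page's client code.
// 'sw.js' is an assumed file name for this example.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(function(registration) {
      console.log('Service worker registered with scope:', registration.scope);
    })
    .catch(function(error) {
      console.error('Service worker registration failed:', error);
    });
}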
Let’s take a look at the onfetch function in our service worker. It currently looks like this:
self.onfetch = function(event) {
  event.respondWith(
    (async function() {
      // Check the cache for a response matching this request.
      var cache = await caches.open(cacheName);
      var cachedFiles = await cache.match(event.request);
      if (cachedFiles) {
        return cachedFiles;
      } else {
        // Nothing cached, so fall through to the network.
        return fetch(event.request);
      }
    }())
  );
};
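One thing to note: cacheName isn’t defined in the snippet above. It’s assumed to be a string defined elsewhere in the service worker, often next to an install handler that pre-caches your static files. A minimal sketch of that assumed setup might look like this (the cache name and file list are placeholders):

// Assumed setup elsewhere in the service worker file.
// The cache name and file list are placeholders for this example.
var cacheName = 'my-app-cache-v1';

self.oninstall = function(event) {
  event.waitUntil(
    caches.open(cacheName).then(function(cache) {
      // Pre-cache the app shell; swap in your own files.
      return cache.addAll(['/', '/index.html', '/app.js']);
    })
  );
};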
Briefly, our onfetch function intercepts each fetch request and looks for a match in the cache. If a match isn’t found, the request proceeds to the network as usual. The problem is that nothing ever caches the new response, so the next identical request goes over the network again. Let’s change that.
Our updated else branch starts like this:
/* ... */
else {
  var response = await fetch(event.request);
}
If this looks similar to what you’re doing in your client code, that’s because it is! Essentially, we recreate the original fetch, but this time within the service worker. Now we’ll go ahead and cache the response.
/* ... */
else {
  var response = await fetch(event.request);
  await cache.put(event.request, response.clone());
}
There are a couple of new things here. First, we use cache.put instead of cache.add. cache.put lets us pass in a key-value pair, matching the request to the appropriate response. You might also notice the response.clone(). The body of a response object can only be used once, which means that if you cache the response object itself, you’ll be able to return it, but your client won’t be able to access the body of the response. To keep the data accessible, we make a clone of the response and cache that instead.
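As a rough illustration of both points (assuming this runs inside an async function against an already-opened cache, and using a placeholder URL):

// cache.add would make its own network request and store the result:
// await cache.add('/api/items');

// cache.put stores a request/response pair we already have in hand:
var response = await fetch('/api/items');        // '/api/items' is a placeholder URL
await cache.put('/api/items', response.clone()); // the cache consumes the clone's body
var data = await response.json();                // the original body is still readable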
Lastly, we return the response. The full onfetch function now looks like this:
self.onfetch = function(event) {
  event.respondWith(
    (async function() {
      var cache = await caches.open(cacheName);
      var cachedFiles = await cache.match(event.request);
      if (cachedFiles) {
        // Serve the cached response if we have one.
        return cachedFiles;
      } else {
        try {
          // Otherwise fetch from the network, cache a copy, and return the response.
          var response = await fetch(event.request);
          await cache.put(event.request, response.clone());
          return response;
        } catch (e) { /* ... */ }
      }
    }())
  );
};
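One refinement you may want, which the example above doesn’t include, is to cache only successful GET responses; the Cache Storage API won’t store non-GET requests, and you generally don’t want to serve errors from cache. A sketch of that guard, inside the else branch and using the same names as above, might look like this:

// Only cache successful GET responses (assumption: that's the behavior you want).
var response = await fetch(event.request);
if (event.request.method === 'GET' && response.ok) {
  await cache.put(event.request, response.clone());
}
return response;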
There you have it! Now you’re ready to start dynamically caching API responses.
Thanks so much for reading! If you liked this post, you should head on over to serviceworkerbook.com and buy my book! Be sure to use cache-api-requests for 10% off!