If you want to learn everything you need to know about service workers, head over to serviceworkerbook.com and pre-order my book, “Let’s Take This Offline”!

Arguably, the biggest benefit of service workers is the ability to serve data from a user’s machine rather than making trips across the network. There are a variety of reasons reaching over the network isn’t ideal – lack of connection, intermittent connection, network speed. Fortunately, service workers help with all of these things.

In this post, we’ll look at caches.match – the function for retrieving cached responses in a service worker.

There are two matching methods to compare in the Cache API – caches.match, which lives on the global cache storage, and cache.matchAll, which lives on an individual cache.

While caches.match requires at least one parameter, cache.matchAll has no required parameters. First, let’s dig into cache.matchAll.

cache.matchAll can only be run on a single cache. It has two optional parameters – request and options.

(async function() {
	var cache = await caches.open(cacheName);
	var rez = await cache.matchAll(request);
}())

If the cache contains key-value pairs matching that request, the associated responses are returned in an array. Note that matchAll always resolves to an array – if there’s no match, rez will be an empty array, not undefined.

The steps look like this:

1. Check that the request exists as a key in the list

2. If it does, add it to a response list

3. Return list of responses

Source: Service worker spec
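These steps can be sketched with a toy model – this is not the real Cache API, just a plain Map standing in for the cache, with strings standing in for Request and Response objects:

```javascript
// Toy model of the matchAll lookup steps above. A plain Map stands in for
// the cache; string keys stand in for Requests, string values for Responses.
function matchAll(cacheMap, request) {
  var responses = [];
  cacheMap.forEach(function(response, key) {
    // Step 1: check whether the request exists as a key in the list
    // (no request at all means every entry matches)
    if (request === undefined || key === request) {
      // Step 2: if it does, add the associated response to a response list
      responses.push(response);
    }
  });
  // Step 3: return the list of responses (an empty array if nothing matched)
  return responses;
}

var cacheMap = new Map([
  ['/index.html', 'index response'],
  ['/app.js', 'app.js response']
]);

matchAll(cacheMap, '/index.html'); // → ['index response']
matchAll(cacheMap, '/missing');    // → []
```

The real matchAll also applies its options parameter (ignoreSearch and friends) when comparing requests, which this sketch skips.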

You can call cache.matchAll without a request, and it will return all the responses in the specified cache.

So what’s the difference between caches.match and cache.matchAll?

caches.match must be run with a request parameter. Additionally it’ll check over each and every cache in the app. It might look like this:

(async function() {
	var rez = await caches.match(request);
}())

Unfortunately, checking every cache can be a little slow, especially if an app has created multiple caches. It can be better to open a specific cache and search it directly, or even to keep only a single cache. That looks very similar to our matchAll code above.

(async function() {
	var cache = await caches.open(cacheName);
	var rez = await cache.match(request);
}())

If you want to learn more about service workers, I hope you’ll head over to serviceworkerbook.com and sign up for my mailing list, and follow me on Twitter! You’ll be the first to know when my book, ‘Let’s Take This Offline’ is out!



What is cache.add?

If you’ve ever looked at a service worker, you’re probably familiar with this little bit of code from an oninstall function:

cache.addAll([ /* ... */ ]);

Or perhaps you’ve seen:

cache.add(/* ... */);

Fortunately, the syntax makes it pretty self-explanatory what these two functions do. They add items to a cache – addAll adds multiple files, while add adds only a single file or request.

What is cache.put?

Another function for adding items to a cache is cache.put. This function has two parameters: the request and the response.

cache.put(request, response);

So besides the obvious difference – add and addAll each accept a single parameter (a request or an array of requests), while put takes two – what is the difference between add and put?

The difference between add and put

It’s a trick question! As it turns out, add and addAll use put under the hood.

It works like this:

    1. For each request in the array (or a single request in the case of `add`), do a fetch
    2. Add the response to a key-value pair list, with the request as the identifying key
    3. Cache the request _and_ response together (by using put)

Source: Service Worker Spec
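The delegation is easy to see in a toy sketch – again a Map stands in for the cache, and fakeFetch stands in for a real network fetch; none of this is the actual spec code:

```javascript
// Toy sketch of add/addAll delegating to put, mirroring the spec steps above.
// A Map stands in for the cache; fakeFetch stands in for the network.
var cacheMap = new Map();

function fakeFetch(request) {
  return 'response for ' + request; // pretend this came over the network
}

function put(request, response) {
  // put stores a request/response pair directly -- no fetching involved
  cacheMap.set(request, response);
}

function add(request) {
  // Step 1: fetch the request
  var response = fakeFetch(request);
  // Steps 2 and 3: pair the response with its request and cache it via put
  put(request, response);
}

function addAll(requests) {
  requests.forEach(add); // addAll is just add over an array
}

addAll(['/index.html', '/app.js']);
cacheMap.get('/app.js'); // → 'response for /app.js'
```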

When to use add and when to use put

Usually in a service worker install function, you’ll see this:

// code: https://github.com/carmalou/cache-response-with-sw/blob/master/serviceworker.js

self.oninstall = function(event) {
    event.waitUntil(
        caches.open(cacheName).then(function(cache) {
            return cache.addAll([ /* ... */ ])
            .then(/* ... */)
            .catch(/* ... */);
        })
    );
}

It makes a lot of sense to use addAll over put in this case, because the user has never visited our site before. addAll fetches those responses and adds them to the cache automatically, so once our install event has run we no longer need to go over the network for static assets like HTML files. Additionally, the syntax is clean, so it’s very clear to future-us what we’re doing.

So, if we should use addAll in an install event, where should we use put?

Before we get into the add vs put debate, let’s look at our fetch event.

self.onfetch = function(event) {
    event.respondWith(
        caches.match(event.request)
        .then(function(cachedFiles) {
            if(cachedFiles) {
                return cachedFiles;
            } else {
                // should I use cache.put or cache.add??
            }
        })
    );
}

In the above code, we have a fetch event sitting between the client making the request and the network which will process the request. Our service worker will intercept our request and check for a match within our cache. If a match is found, the service worker will return the response. Otherwise, we’ll fall into the else.

Let’s first look at using cache.add:

/* ... */

if(cachedFiles) {
    return cachedFiles;
} else {
    return caches.open(cacheName)
    .then(function(cache) {
        return cache.add(event.request)
        .then(function() {
            return fetch(event.request);
        });
    })
}

/* ... */

Briefly: should no match be found, we fall into the else. There we open the cache we want to store our data in, then use cache.add to store the response. At the end, we do a fetch over the network so the client can access the data.

But there’s a problem with this: we end up doing two fetches! Because cache.add doesn’t actually return the response to the request it runs, we still need an additional fetch to get the response back to the client. Doing two fetches for the same data is redundant, and fortunately put makes it unnecessary.

Let’s take a look at how we might rewrite this with put:

// code: https://github.com/carmalou/cache-response-with-sw/blob/master/serviceworker.js

/* ... */

if(cachedFiles) {
    return cachedFiles;
} else {
    return fetch(event.request)
    .then(function(response) {
        return caches.open(cacheName)
        .then(function(cache) {
            return cache.put(event.request, response.clone());
        })
        .then(function() {
            return response;
        })
    })
}

/* ... */

So we’ve sort of flipped the order of what we were doing before with cache.add. Instead of doing the fetch last, we go ahead and do it first. After that’s completed, we have a reference to the response. Next we can go ahead and get a reference to our cache with caches.open and use put to cache both the request and the response.

Note: Keep your eyes peeled for a blog post about why we are cloning the response!

Once we’ve finished caching the data, we can go ahead and return the response back to the client! And then next time the client makes this request, a trip over the network will no longer be necessary!

So, in conclusion, you should use cache.add or cache.addAll in situations where you don’t need that data to get back to the client – such as your install event. When you’re caching new requests, such as API requests, it’s better to use cache.put because it allows you to cache the data and send it back to the client with a single network request.




See the source code for this post.

Lots of service worker posts (like this one I wrote for the David Walsh blog) show you enough of a service worker to get started. Usually, you’re caching files. This is a great start! It improves your app’s performance, and with 20% of Americans experiencing smartphone dependence, it’s a great way to make sure users can access your app – regardless of their connection or network speed.

But what about requests, and specifically GET requests? A service worker, along with the Cache Storage API, can also cache your GET requests to avoid unnecessary trips over the network. Let’s look at how we would do that.

Note: This post assumes a basic understanding of service workers. If you need a service worker explainer, check out this blog post.

Let’s take a look at our onfetch function in our service worker. It currently looks like this:

self.onfetch = function(event) {
    event.respondWith(
        (async function() {
            var cache = await caches.open(cacheName);
            var cachedFiles = await cache.match(event.request);
            if(cachedFiles) {
                return cachedFiles;
            } else {
                return fetch(event.request);
            }
        }())
    );
}

Briefly, this function intercepts a fetch request and searches for a match in the cache. If a match isn’t found, the request proceeds as usual. The problem is that nothing here caches new data as it’s received. Let’s change that.

Our new function looks like this:

/* ... */
else {
    var response = await fetch(event.request);
}

If this looks similar to what you’re doing in your client code, that’s because it is! Essentially, we recreate the original fetch, but this time within the service worker. Now we’ll go ahead and cache the response.

/* ... */
else {
    var response = await fetch(event.request);
    await cache.put(event.request, response.clone());
}

There are a couple of new things in here. First, we’re using cache.put over cache.add. cache.put lets us pass in a key-value pair, matching the request to the appropriate response. You might also notice the response.clone(). The body of a response object can only be read once. That means if you cache the response object itself, you’ll be able to return it, but your client won’t be able to access the body of the response. To keep the data accessible, we make a clone of the response and cache that instead.
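You can see the single-use body in action outside a service worker – this sketch runs anywhere the Fetch API’s Response is available (modern browsers, or Node 18+):

```javascript
// Why we clone: a Response body can only be read once.
async function demo() {
  var response = new Response('hello');
  var copy = response.clone(); // clone BEFORE the body is consumed

  var first = await response.text(); // this consumes the original body
  // response.bodyUsed is now true; reading the original again would throw
  var second = await copy.text();    // the clone's body is still intact

  return { first: first, second: second, bodyUsed: response.bodyUsed };
}

demo().then(function(result) {
  // result.first and result.second are both 'hello';
  // result.bodyUsed is true for the consumed original
});
```

Note that clone() itself throws if the body has already been used, so the clone has to happen up front.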

Lastly, you return the response. So the full onfetch function looks like this:

self.onfetch = function(event) {
    event.respondWith(
         (async function() {
            var cache = await caches.open(cacheName);
            var cachedFiles = await cache.match(event.request);
            if(cachedFiles) {
                return cachedFiles;
            } else {
                try {
                    var response = await fetch(event.request);
                    await cache.put(event.request, response.clone());
                    return response;
                } catch(e) { /* ... */ }
            }
        }())
    )
}

There you have it! Now you’re ready to start dynamically caching API responses.




Note: this blog post assumes a working knowledge of service workers. If you need a refresher, I recommend looking here.

There might be times you want to send back a custom response from your service worker, rather than going out over the network. For example, you might not have a certain asset cached, while a user’s internet connection is simultaneously down. In this case, you can’t go over the network to fetch the asset, so you might want to send a custom response back to the client.

Let’s take a look at how we might implement a custom response.

This project demonstrates how you would return a custom response from a service worker. The basic idea is to make a fetch request to an API (in this case FayePI, which you should definitely check out). The service worker, however, sits between the client making requests and the API receiving them.

In this case, the service worker intercepts the fetch request and sends back a custom response – rather than going across the network.

Let’s look at the service worker to see how that works.

A service worker’s fetch event listener usually looks like this:

self.onfetch = function(event) {
    event.respondWith(
        caches.match(event.request)
        .then(function(cachedFiles) {
            if(cachedFiles) {
                return cachedFiles;
            } else {
                // go get the files/assets over the network
                // probably something like this: `fetch(event.request)`
            }
        })
    )
}

Let’s change up what’s happening in the else-block of this code to return a custom response.

/* ... */
else {
    if(!event.request.url.includes(location.origin)) {
        var init = { "status" : 200 , "statusText" : "I am a custom service worker response!" };
        return new Response(null, init);
    }
}

You might be wondering why the if-check is included there. We only want this to affect outgoing requests to other origins, not requests for the app itself, which are likely requests for HTML files and other assets. This quick check makes sure the requested URL doesn’t match the app’s own origin, so all of the static pages will still load properly.
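Here’s that check in isolation – 'https://my-app.example' is a made-up stand-in for location.origin, since outside a service worker there’s no real location object:

```javascript
// Toy version of the if-guard: is this request leaving our origin?
var origin = 'https://my-app.example'; // stand-in for location.origin

function isOutgoing(requestUrl) {
  return !requestUrl.includes(origin);
}

isOutgoing('https://my-app.example/index.html'); // → false: one of our own pages
isOutgoing('https://some-api.example/data');     // → true: an outgoing API request
```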

Within that if-check, we create a new Response object. We set the status code to 200 since the request “worked” (that is, it reached our service worker and our service worker returned our custom response). We also change the status text, so we know the response is actually coming from the service worker.
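The Response constructor works the same outside a service worker, so you can try this sketch directly in a browser console or Node 18+:

```javascript
// Building the same custom response as above, just outside a service worker
var init = { "status": 200, "statusText": "I am a custom service worker response!" };
var response = new Response(null, init);

response.status;     // → 200
response.statusText; // → "I am a custom service worker response!"
```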

And that’s about it! Now you know how to return a custom response from your service worker! Be sure to check back soon to learn about how to cache more than just static files!




A lot of service worker examples show an install example that looks something like this:


self.oninstall = function(event) {
    caches.open('hard-coded-value')
    .then(function(cache) {
        cache.addAll([ /* ... */ ])
        .catch( /* ... */ )
    })
    .catch( /* ... */ )
}

Let’s do a quick overview of the above code. When a browser detects a service worker, a few events are fired, the first of which is an install event. That’s where our code comes in.

Our function creates a cache called hard-coded-value and then stores some files in it. There’s one small problem with this … our cache is hard-coded!

There’s an issue with that. A browser will re-download the service worker file periodically (at least every 24 hours), but it only re-runs the install process if the file itself has changed. You might change your app’s CSS or JavaScript, but without a change to the service worker, the browser will never go and update it. And if the service worker never gets updated, the changed files will never make it to your user’s browser!

Fortunately, there’s a pretty simple fix – we’ll version our cache. We could hard code a version number in the service worker, but our app actually already has one. So handy!

We’ll use our app’s version number from the package.json file to help. This method also requires (pun intended) us to be using webpack.

In our service worker, we’ll require our package.json file. We’ll grab the version number from the package.json and concatenate it to our cache name.


self.oninstall = function(event) {
    var version = require('./package.json').version;
    caches.open('hard-coded-valuev' + version)
    .then(function(cache) {
        cache.addAll([ /* ... */ ])
        .catch( /* ... */ )
    })
    .catch( /* ... */ )
}
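To make the concatenation concrete, here’s a tiny runnable sketch – the version string is hard-coded as a stand-in for require('./package.json').version, which only resolves once webpack bundles the service worker:

```javascript
// Building a versioned cache name. '1.2.3' stands in for the version field
// of package.json; bumping that version changes the service worker file,
// which is exactly the change the browser looks for before updating.
var version = '1.2.3'; // stand-in for require('./package.json').version
var cacheName = 'hard-coded-valuev' + version;

cacheName; // → 'hard-coded-valuev1.2.3'
```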

Turns out, there’s actually an even better way than above using some of webpack’s built-in tools. A problem with the above code is that your package.json file will get bundled into your service worker. That’s pretty unnecessary and it’s going to increase the size of your bundle.

We’ll use DefinePlugin to make this even cleaner.

Let’s add a property to our DefinePlugin function in our webpack file. We’ll call it process.env.PACKAGEVERSION.

It might look like this:


var version = require('./package.json').version;
new webpack.DefinePlugin({
  'process.env.PACKAGEVERSION': JSON.stringify(version)
});

Source: webpack DefinePlugin

And then in our service worker instead of referencing version directly, we’ll use process.env.PACKAGEVERSION. It’ll look like this:


self.oninstall = function(event) {
    caches.open('hard-coded-valuev' + process.env.PACKAGEVERSION)
    .then(function(cache) {
        cache.addAll([ /* ... */ ])
        .catch( /* ... */ )
    })
    .catch( /* ... */ )
}

webpack will work behind the scenes for you and swap out process.env.PACKAGEVERSION for the proper version string. This solves the problem of needing to update our service worker, and it handles it in a clean, simple way. Plus, it will help us out when we need to clean up former caches. I’ll write about that next, so stay tuned!
