#btconf Düsseldorf, Germany 15 - 17 May 2017

Patty Toland

Patty is a founder and partner at Filament Group, a Boston-based web design and front-end development studio that helps companies design and build super-fast responsive sites and web apps that are simple to use and accessible to everyone.

Patty has more than twenty-five years of consulting experience with corporate and institutional clients, with a focus on communicating complex messages across a range of media; her expertise lies in developing robust communication strategies, information architecture and system design. Prior to founding Filament Group, Patty worked at ZEFER Corp., US Peace Corps, Kohn Cruikshank Inc. and Harvard Business School.

Want to watch this video on YouTube directly? This way, please.

What we talk about when we talk about web performance

As the number of people who access the internet continues to grow, and the devices they use diversify, one thing is clear: speed and performance are a top priority.

Despite this, the average web page size has more than tripled in the past 5 years, and many techniques, platforms, tools and practices don’t champion performance or consider its implications. One of the biggest challenges is simply the absence of good information to build awareness about performance impacts.

Luckily, our conversations about performance-related choices are getting better-informed: in the past few years, compelling tools and data are helping us measure, track and understand the real state of performance and how it affects our audiences; and techniques are emerging to make responsible design and coding choices.

Transcription

Audience: [Applause]

Patty Toland: Thank you, Marc. I’m a little nervous. It’s a very high bar. These talks have been really amazing and inspiring. The name of my talk today, What We Talk About When We Talk About Web Performance, is actually borrowed from the title of a short story by Raymond Carver. He’s an American writer. Is anybody familiar with Raymond Carver?

His story was called What We Talk About When We Talk About Love. It’s a story of two couples that sit down at a table one day with a giant bottle of gin and tell the story of their past loves. When they talk about love, they tell stories of pain and frustration and misunderstanding and insult and disappointment and heartbreak. And when I think about every conversation we have had about Web performance, I can’t think of better ways to describe them, so it felt very, very appropriate.

As Marc mentioned, I’m from Filament Group. I am one-sixth of Filament Group, and we are a design firm. We’re in Boston, but we’re distributed. We do visual design and front-end code, and we focus on progressive enhancement and responsive Web design. But the real goal we have set for ourselves is that we want to make a Web that works for everyone.

We have been around since 2001 and, in that time, we’ve done about 170 separate projects for about 80 clients. Since 2010, we’ve really focused on responsive and, in the past 6 years, we’ve done about 40 responsive projects with about 30 client teams. In that time, we’ve had lots of conversations about how to approach Web design and development.

Especially in the past four or five years, the most awkward and challenging parts of those conversations have been around Web performance: what it is and why we should care about it. I think the big challenge has been that, around 2012, about a year after responsive came out, there was a real backlash where people were talking about how responsive was too slow and too heavy, and it wasn’t the right way to go. But we really believed that it was the way the future needed to go.

When we first started having those conversations, we were just looking for data to figure out why it was so slow, and it was very hard to come by. But what we did see is that, around 2011, the average Web page size was about 800K. And, in the five or six years since, it has tripled.

When people were talking about responsive being slow, they were right. But it wasn’t that responsive was slow. It was that the Web was getting slower all the time. There were a lot of reasons for that. The data that we were able to find when we first started looking was just that there was an explosion of devices, and there was an explosion of people who were using those devices.

When we looked to the device manufacturers to figure out what was happening, most of the data that came from them was really about capabilities, about the most optimistic versions of them: these devices had different sizes; they had great speed. If you looked at the range of Samsung devices in 2014, it was kind of astonishing.

OpenSignal looked at just the Android mobile environment over four years, and it changed so rapidly; things were happening so quickly. Then when you look to the networks, to the capabilities of how you could get your data: even from 2014 to 2016, AT&T in the U.S. said that they pretty much doubled or tripled their coverage, and Verizon did the same thing from 2013 to 2016, really, really expanding the network. They were so much more powerful.

We looked at this information. Ericsson--I don’t know if anyone is familiar with the Ericsson Mobility Report--tracks what’s happening with networks and how they’re evolving. These are their projections of what was happening in 2016 and what will happen in 2022 in terms of network capabilities in different regions of the world. It really did look like the networks are expanding and the devices are expanding. You have so much data saying that there are bigger screens and better networks and more pixels, so of course we’re getting this bigger, better, more version of the Web.

Where’s the problem? Well, we were having these conversations with our clients, and most of them were saying, “It doesn’t really matter. Everybody has got fiber. We’re fine.” More often than not, the conversations we would have with parts of our client teams were more like: we’re not looking backward; let’s move forward.

There’s a place for that argument. The common misperception about network challenges was that it was really a third world problem. There were these tweets that more people were accessing Facebook on 2G than on 4G, and three years ago that was still true. But when companies talked about it, they really talked about it in that third world context, so Facebook was doing 2G Tuesdays, and they were doing it because they wanted people to have empathy for people in India. We were looking at hack-a-thons where people were focusing on what’s happening in Nigeria or Jakarta, and so the real focus was emerging markets. Those were all true.

Our anecdotal evidence was that when people went abroad, especially to places that didn’t have as good a network, they really felt the pain, and so that was all true. But for us, for our clients, mostly in the U.S., mostly larger companies, it was very easy for them to look at that data and say, “But that’s not us. We don’t have to care about that. We’re not in Jakarta. We’re not in Nigeria. So we don’t think that’s our problem.”

But the thing that we started to see and the thing that we really know is that 2G networks are not exclusive to the third world. There are places in the U.S., there are places here in Germany, where that’s the only network you have available, and that all network speed assessments are really only a description of the best case scenario. Even when you look at these coverage maps, they have disclaimers all over the place that that’s their best possible outcome, but it’s not reliable, and they don’t guarantee anything.

I just found this research last week showing that even a 4G network is only capable of delivering as much speed as the traffic load on it allows. Even the most powerful networks can be really compromised from one year to the next.

This chart, I think, was the thing that really inspired me for a long time, but I had a moment of revelation about two years ago when I was looking at it. One of the things that happens with this chart is that it reads like a trend graph. You can see the blue bars, which are 2G: they’re descending and, at the end of the bar, they’re gone. The purple, which is 3G, is again on a trend line down. And look at how big the green is: 4G is growing so much. But this is not a trend graph. This is a market share graph of different regions of the world.

About two years ago, I scaled it to actually reflect those regions of the world, and I think that this tells a really different story. This right now is 2016. 2G and 3G are by far the larger network capabilities in most regions. Even if we project forward to 2022, five or six years from now, 2G and 3G are still a part of our future, so these are things that we should be caring about deeply.

We know that speed issues are not exclusive to lower end networks. If we just look at everyone who experiences the Web, 73% of mobile Internet users say that they have encountered sites that are too slow. This is a very, very common opinion. People have had sites crash and freeze and files not load properly. And about half of people say that their biggest peeve with the Web is that it’s too slow.

This, I thought, was a wonderful, very specific research project. A journalist traveled on a high-speed train from New York to Washington, D.C., that said it had a high-speed network, and he actually tested the speed of the network everywhere along the way. This was in business class, and he found that the network had dropped out for 80% of the trip. Anyone who has been on Gogo wi-fi on an airplane knows what this experience is like.

When we were having these conversations, mostly with development teams, but also with marketing teams and with CIOs, what we came back with was this incredibly inspiring quote that I saw last year. Last year the Economist celebrated Tim Berners-Lee on the occasion of the 25th anniversary of the invention of the World Wide Web. During his speech, he said, “The Web’s true potential for democracy, economic growth, and human creativity is only just beginning to be glimpsed. And in 2016, all of us must protect and enhance this public space for the benefit of all humankind.”

When we think about performance, this is the goal that we’re working toward. We need data and tools to convince our “not really our problem” skeptics and bring them along to that vision. In the past two years especially, there are four resources that we have found really compelling in conversations with our clients.
• The first one is a researcher named Tammy Everts. Until recently, she was at SOASTA; she announced yesterday that she’s leaving and going to SpeedCurve. She’s a researcher who studies performance impacts and behaviors, drawing on neuroscience.
• Google has been doing some fantastic research on user behavior and opinion and publishing their data about how performance impacts the way people behave on the Web.
• The Pew Research Center is an American research organization that has done some fantastic, very well documented, longitudinal research about how people use mobile and how it’s impacting their lives.
• Then this tool, I’m sure everyone is familiar with it, but this tool has been a game changer for us in these conversations. This is Web Page Test. It is at webpagetest.org, and it is available for anyone to test any site. It actually has a network of real devices on real networks, so you can choose a mobile phone on a 2G network somewhere in Germany and find out exactly how fast it is, and it will spit out a timeline image of exactly how long it takes for a site to load. This has been a seeing-is-believing moment for us that has been incredibly, incredibly useful.

I will stop right now and just mention that I’m going to go pretty fast through this presentation. But it’s a Google doc, and I will tweet out the link right after I get off stage, so all the data will be available to you immediately. If you feel like you’re not getting everything you wanted and want to go back to it, you’ll have it as soon as I’m done.

I want to start with Tammy Everts because I think, in 2012, she posted an article that again was a super helpful, sort of orienting, mental model for us because there were a lot of articles that came out when people were complaining about performance, and companies were saying, “Oh, poor millennials. They want everything immediately. They’re so cranky.”

She posted this article about how neuroscience works and how we think. She was talking about how memory works. You have sensory memory, which is the instinct that helps you, you know, flee from tigers. Every 100 milliseconds your brain is looking around, cataloging the things that happen around you and bringing them together.

Then in about a 10- or 15-second window, your short-term memory tries to take all the inputs and put them into a universe so you can make decisions. Every 100 milliseconds you’re looking for information. The Internet actually really sort of gets in the way of this. These delays create problems for our cognitive processing.

This is a separate Ericsson study where people worked on the Web while delays were introduced, and their physiological responses were actually measured. When things took longer than four seconds, people had elevated stress equivalent to watching a horror movie. People were so agitated.

This is something that, as you think about your website taking four seconds on any device, on any network, this is what’s happening to people. They are having a visceral reaction, and that visceral reaction affects your brand. People don’t know that they’re having it, but they feel it. They feel that negativity. We want to make sure that we can avoid that.

Tammy also summed it up for us in a really nice way. She said that, the way our neuroscience has evolved over time, we’re really designed to do that 100 millisecond gathering and to make decisions in two-second increments; we can process and move on in two seconds. When we can do that, we’re in a place of flow. That’s where we feel harmony. That’s where we feel like we can move effectively. That’s where delight happens: we’re moving efficiently, we’re comfortable, and we’re productive. So two seconds is where we want to be.

Now in addition to doing that cognitive neuroscience research, Tammy worked at a company called SOASTA, and they did really detailed retail studies--500 million user sessions in a year over the Christmas holiday in the U.S.--and published some really interesting data. They saw that, from 2014 to 2015, traffic to all retail websites shifted dramatically from 60% desktop and 40% mobile to 25/75. That’s a huge shift in the way data is being delivered and people are interacting, in just one year.

These three slides, I think, were even more compelling to me. They tested not only what people were using the data for, but also effective conversion rates: how often people were buying. They broke them down by tenths of a second. What they saw was that, between 2014 and 2015, the optimum load time for peak conversion shifted from just under 4 seconds to under 2.5 seconds. The most successful sites, the ones seeing the most engagement and the most purchasing, were the ones that were able to deliver a page in less than 2.5 seconds.

Now the thing that was kind of interesting: in 2014, the average site sat right in that curve at about 3.8 seconds, so sites optimizing to somewhere around 4 seconds were really getting effective engagement. In 2015, user expectations shifted ahead of where the Web was, and sites were really losing opportunities to connect with their customers.

These are the kinds of data that I think are very, very compelling to our clients. A change of 500 milliseconds for Facebook is a 3% drop in traffic. That’s dramatic, right? A change of a second is 6%. If you look at Amazon.com, a page load increase of 100 milliseconds translates to a one percent loss in sales. These are the kinds of data and statistics that we started gathering because they made our conversations much more tangible and just much easier to have.

This is actually sort of the counterpoint. Those were ideas about technology, about how people are engaging and converting, things like that. There’s also a sense of what the universe of our audience is. This is a case where the Pew Research Center did a really great, comprehensive study of mobile device technology ownership in America. They covered all kinds of technology and then looked specifically at smart phones.

They looked at cell phones, desktops, smart phones, tablets, and readers. One of the things that they definitely saw, in the past two years especially, was such a sharp upward trend with mobile and either a flat or downward trend with pretty much everything else, which I think reinforces what we already know. But they also said that 77% of cell users experienced download speeds--again, this is something we know--that prevented them from loading something as quickly as they would like.

One of the fascinating data points that they shared, one that I was shocked by, was that, of the 5,400 people who provided data over multiple years, 15% were on devices that at some point did not have JavaScript enabled - 15%. Who knows why that happened, whether it was a conscious choice or not, or whether the devices were so old that they didn’t have the capability, but it was there.

Sixty-eight percent of adults in America, a year and a half ago, had a smart phone, so we’re moving along that shift from dumb phones to smart phones. Nineteen percent of Americans--and I think there are probably parallels here in Europe--are smart phone dependent, which means that they have limited or no other access to the Internet, so their smart phone is their only connection to the Internet. That’s a trend that’s growing, especially with younger people, with people who are poor, and with people who are in marginalized communities. When we think of that Tim Berners-Lee vision of the Web as a public space with public access, this is really important to consider.

Thirty-seven percent of smart phone owners in the U.S. reach the maximum amount of data in their plan every month, and 15% said that happens frequently. I heard lots of conversations in this room about data plans and how it’s challenging to even understand what’s happening, but this is a real hardship for some people. Of the people who are smart phone dependent, 48% said they had to cancel or suspend their service for financial reasons.

There were a couple of incidents maybe a year and a half ago. Apple launched an update that implemented something called wi-fi assist, and it shipped automatically turned on. What it did was, if you were not close enough to a network, it would automatically shift you over to cellular data without telling you. There were several class action lawsuits, but one of my favorite examples was a complaint about a kid who was sitting in his bedroom playing an online game, thinking he was on his wi-fi when he wasn’t. He had a $2,000 mobile bill for the month because he had been sitting and gaming with his friends three steps beyond where the wi-fi reached in his house.

The decisions we make are great if we have abundant, unlimited data and if we want the decision made for us, but those are big assumptions, right? I think Jeremy just talked about the assumptions we’re basing our decisions on. These are the kinds of things that we absolutely do need to think about.

It’s not only poor people. It’s not only kids. It’s everyone. If you’re not being thoughtful about it, one of your conference attendees could spend $7 to load a conference homepage on their iPhone because they’re outside of their own data area. Just for context--again, I know that this is very America centric--51% of Americans who earn a wage made less than $30,000 in 2014, so this is real, genuine hardship.

When we look at our clients and our audiences, we want to help them understand this context, so we started looking at the way people behave with the Internet and how performance holds up, especially on mobile, and especially on the 2G and 3G networks that are very common for many, many people. Sixty-two percent of smart phone owners in America use their smart phone to find out information about a health condition. I went and looked at my local Massachusetts Health Connector, which is a way to get insurance, on Web Page Test to figure out how well they were achieving the goal of meeting people who have health concerns where they are.

Now this site on desktop was a two megabyte page. It had 130 requests. It took 19 seconds to load, with 7 seconds of blank screen. We have this person with their two-second instinct, trying to solve a problem that’s urgent, waiting five more seconds and getting more and more anxious as they wait. The navigation also only worked with JavaScript enabled. Again, choices that people would actually suffer for.

On mobile, the same thing: a 1.6 megabyte page, 21 seconds for the full page to load, and 11 seconds before the site rendered. For an organization--this is a nonprofit organization that’s supposed to be serving the public--this kind of experience, I think, is a real failure for an audience that has an urgent concern, right?

A very valid point: At what point did we decide that having JavaScript enabled is a requirement for finding a doctor, right?

Sixty-eight percent of smart phone users follow the news at least occasionally, and many look for breaking news frequently. My local newspaper, The Boston Globe, is a subscription site, so you have to pay for it. But they also have a free site called Boston.com that is a mix of news and community information.

That site, the day that I looked at it, was 3.8 megabytes and 600 requests. It took 34 seconds for the page to load. They did a really good job of getting the first part of the page and the first byte out quickly, but they loaded 3.8 megabytes of data onto a page where most people would click one of the first three stories and go. They were handing them everything. It was the same, or actually slightly worse, on mobile: more requests, seven seconds of dead space on 3G, ten seconds to get the initial content. They were using custom fonts, and the custom fonts were blocking, so they weren’t actually giving you content until the fonts had arrived.

Around the time that ad blockers came out, The New York Times did a really exceptional study about the impact of mobile ads, because mobile ad networks were being blocked. That particular site, Boston.com, was by far the worst offender. The study looked at the amount of video, audio, and scripts with the ad blocker on and the ad blocker off, and they visualized it, which was fantastic.

This is sort of astonishing, but the analysis that they did afterwards was, I thought, even more interesting. They determined, based on that page, that if someone went to that free site every day to get their news, they would spend about $9.50 a month on their data plan just for the advertising. That free site was certainly not free for the many people who were choosing not to subscribe to the more expensive subscription-based site.

Now there are other newspapers. The Guardian did a phenomenal job of going just the opposite direction. There are techniques and tools to be more responsible, and we definitely want to make sure that we can share them.

Another scenario that we were looking at: 28% of Americans, and 53% of young Americans 18 to 29, have used their smart phone to look for a job. So we went to the largest job site in the U.S.: again, a 2.1 megabyte page, 16 seconds for the full page to load, 5.5 seconds of blank screen, and it’s a form to enter a city and a two-word description of what kind of job you want. They were moving toward advertising and a branded experience, and it was really inhibiting the function of the site. There are opportunities, clearly, to make it better. It was again equivalent or worse on mobile.

Then there were really urgent scenarios: 53% of people have found themselves in an emergency situation and tried to use their phone for help. We looked at the two largest insurance companies. Four or five seconds is a very long time to wait when you’ve just been hit by a car or something traumatic has happened. On mobile, seven seconds of waiting can feel like you, as an organization, don’t care about the people you serve. The same thing effectively happened on the second largest site. These opportunities, I think, were pretty clear to us.

This, I think, was an especially unique case. There was an organization that put together a wonderful resource for refugees arriving in Europe, to help them find resources. But we found that it was built with a framework, and that framework was JavaScript dependent. We know that for refugees a phone is a real, genuine lifeline, and they were cutting off that lifeline for people in grave need because a framework had made a decision and they didn’t have a way to work around it.

There’s also--and this is a bigger reality check--some really sobering data emerging about the environmental impacts of the decisions we’re making. Cell networks, we know right now, are energy hogs. As we project out that people are going to use more devices and there’s going to be more content, we’re looking at 40- to 50,000 gigawatts of energy needed just to power the servers. The energy consumption associated with this was the equivalent of adding 4.9 million cars to the road in 2015. That, to me, is a reality check that is really concerning. It’s not something you automatically think about but, once you have it in your head, it’s hard not to.

From 2015 to 2016, about $80 billion of power was wasted on Internet of Things devices that just sit there drawing a trickle of charge, and that is only expected to grow. These are big audiences and serious consequences. When we talk about performance, this is the type of context that we like to bring to bear on those discussions, because these large Web pages and big sites have an invisible but very real cost. Our design and code need to be mindful of how we address that, and we need to start building a future of the Internet that delivers on the promise it started with.

Jeremy was just talking about how technology builds on itself, and this is the language of the TCP/IP protocol. When they were describing how that network was going to work, the guideline they gave was to be conservative in what you do and liberal in what you accept, right? For anyone who hasn’t seen it, I was reminded of that by this wonderful book from A Book Apart, Design for Real Life, which is full of anecdotes and context about how to help people with your Web decisions.

But the biggest question that we have, and the one that we want to bring to all of the discussions we have with our clients, is: how do we make our decisions and code conservative and accepting? File size and page speed are design choices and technology choices. We want to take a critical eye to everything we do, pare down to the most essential pieces, and optimize.

One of the things that is really helpful for that, and that we try to get people to do, is to make speed part of the design process and build performance goals into the design stack. That’s one of the cases where the Web Page Test tool has been enormously helpful. We have actually had clients who build Web Page Test into their build process so that, if you exceed your performance goals, it breaks the build. There are ways to enforce goals like that.
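
As a very rough sketch of what such a build-time gate can look like (the file layout, the 300KB figure, and the script itself are illustrative assumptions, not anything from the talk; real setups often query the Web Page Test API instead and fail the build on its measurements):

```javascript
// budget-check.js - a minimal, hypothetical performance budget gate.
// It sums the built CSS/JS in dist/ and fails the build when over budget.
const fs = require('fs');
const path = require('path');

const BUDGET_KB = 300; // assumed budget; agree on this with your design team

const distDir = path.join(__dirname, 'dist');
const totalBytes = fs.readdirSync(distDir)
  .filter((file) => /\.(js|css)$/.test(file))
  .reduce((sum, file) => sum + fs.statSync(path.join(distDir, file)).size, 0);

const totalKb = Math.round(totalBytes / 1024);
if (totalKb > BUDGET_KB) {
  console.error('Budget exceeded: ' + totalKb + 'KB > ' + BUDGET_KB + 'KB');
  process.exit(1); // a non-zero exit code is what breaks the build
}
console.log('Within budget: ' + totalKb + 'KB of ' + BUDGET_KB + 'KB');
```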

But the other thing we want to think about is perceived performance. We want to give something useful to our audience as quickly as possible. Of our performance opportunities, by far the most egregious one is images; optimizing images is such an easy win. But video and audio are also clearly an opportunity. Custom fonts are a real optimization challenge. Then there are third party tools--advertising, social media, tracking, frameworks, all of those things--and data networks. These are all factors that we want to be mindful of as we’re looking at our design challenges.

We want to make performance a competitive advantage, and this is one of the things that I think is really compelling: when we talk to our clients, we bring them in through this area of self-interest. Google actually documented that when someone looked at two competing sites side by side, a speed advantage of 250 milliseconds would encourage many consumers to choose one site over the other, so a quarter of a second difference from your closest competitor can be a defining competitive advantage. That’s the carrot. Then, on the sly, we can slip in the idea that optimizing can be a path to democracy and growth and creativity and access to a bigger audience, that vision that Tim Berners-Lee was talking about.

This is where we get into technique; the rest of this is really about technique. Images: of that roughly 2.4 megabyte average page, images usually account for somewhere around 60% to 70% of the page weight. There are a couple of different ways that we can think about optimizing images. There are formats to consider, there are ways to think about compression, and there are ways to prepare assets so that they are most appropriate for the devices we have.

I know Sara is going to talk about SVG after this but, for anyone who doesn’t know it, SVG is just a fantastic opportunity if you’re thinking about illustrations, diagrams, and fairly simple things like icons (there’s a small example after this list).

  • Because they’re math, they scale.
  • They can be recolored.
  • They can be animated.
  • They will honor CSS, so they can be sort of built into the branding in a really seamless way.
  • They can be scaled appropriately to the size of the device, so you can sort of add detail and remove detail, and it really can sort of be appropriate to where it’s delivering in a really seamless way.
  • They can engage the user with those little bits of delight that Espen was talking about today, sort of unexpected things.
  • They can actually be animated in more thoughtful ways.
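
As a small illustration of a few of those points (the icon shapes and class names here are made up for the example): an inline SVG scales crisply at any size because it’s vector math, and CSS alone can restyle it with no extra asset.

```html
<style>
  .icon { width: 2em; height: 2em; color: #0066cc; transition: color 0.2s; }
  .icon:hover { color: #cc3300; } /* recolored by CSS alone */
</style>
<!-- A simple magnifying-glass icon; the strokes use currentColor, so the
     CSS color above restyles the whole drawing. -->
<svg class="icon" viewBox="0 0 24 24" role="img" aria-label="Search">
  <circle cx="10" cy="10" r="7" fill="none" stroke="currentColor" stroke-width="2"/>
  <line x1="15" y1="15" x2="21" y2="21" stroke="currentColor" stroke-width="2"/>
</svg>
```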

We actually did a little bit of an experiment where one of my colleagues built a tool that lets you export layers from Adobe Illustrator and generates SVG animations that can be played on the device and styled to be responsive. You can use media queries with them, which was kind of nice. For anyone who is interested, there are some demos on our website showing how to do that. Sara Soueidan also does a really great job of documenting how to work with SVGs and optimize them really well. My colleague, Todd Parker, did a presentation on when to think about pixels versus SVG when you’re evaluating any kind of image on your site.

There are cases where bitmaps make sense, right? Photographs and more complex imagery are where a JPEG, or a PNG if you need alpha transparency, is going to be the right choice. There are a lot of cases where that makes sense. In those cases you want to look at the bitmap quality. The retina screen kind of blew everything up and, again, from the marketing perspective, people want to take advantage of every pixel, but that’s a slippery slope into a real challenge for responsiveness.

We started looking into ways to think about how to use compression more effectively so that you can get better quality experiences. Now we know that the browser sort of rescales things on the fly so that it will do a little bit of the work. We started looking at a technique called compressive images, so that you take an image, make it much, much larger, but compress it with lower quality. The browser rendering will actually do a lot of work in the middle to make the image look better for you because that’s part of its native behavior.

If you’re in Photoshop, this is a way to optimize your settings to do compressive images. It takes a little bit of play but, we found, it can cut maybe 30% to 60% off the size of images and still give a really exceptional experience. For anyone who processes images, those are the real details.
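
In markup, the technique looks something like this (the file name and numbers are placeholders for illustration):

```html
<!-- Compressive image sketch: the JPEG is saved at roughly twice its
     display width but at very low quality (~25). The browser's own
     downscaling hides most of the compression artifacts, and the file
     is often far smaller than a display-sized image at normal quality. -->
<img src="hero-1600w-q25.jpg" width="800" height="450" alt="Product hero photo">
```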

The other thing that we wanted to look at, just generally, is what images you deliver to which devices. We’ve had clients in the past--retailers, clothing sites, art sites, things like that--who want to deliver the most optimal visual experience. But when you deliver an 1,800 pixel image to a small mobile device, that’s just kind of insulting. There’s no way those devices can really use all those pixels effectively, so this is where we want to look at responsive images.

There are techniques for this. Picturefill was a polyfill that is actually not quite as necessary lately, but think about the srcset attribute and the picture element. Look for opportunities to offer multiple images, starting with the smallest one that the scenario needs, and letting someone opt into larger images if they want to expand.
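
Here is what that can look like with srcset and sizes (the file names and breakpoints are placeholders; the same idea extends to the picture element for art direction):

```html
<!-- The browser picks the smallest candidate that satisfies the layout,
     so a small phone never downloads the 1800px original. -->
<img src="photo-400.jpg"
     srcset="photo-400.jpg 400w,
             photo-800.jpg 800w,
             photo-1800.jpg 1800w"
     sizes="(min-width: 60em) 50vw, 100vw"
     alt="Product photo">
```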

Then the second thing that we think about quite a lot is optimizing asset loading. This is more about optimizing for that two-second window, making sure you’re getting to something as quickly as possible. We know that JavaScript can be very slow to parse and execute, especially on some mobile devices. We did some tests with jQuery, which is kind of a size and bandwidth hog anyhow, to see how it changes from a MacBook to an Android phone. It was kind of dramatic: from 34 milliseconds to 652 milliseconds of download and parse time for the same file. That gives you a sense of how the same experience can be very different on different devices.

For anyone who follows Addy Osmani, who works on the Google Chrome project: he actually did a test last week on Web Page Test where he ran an identical JavaScript file on 25 devices, and these are not old devices. The top device is a MacBook Pro; of course that was super fast. But second from the bottom, at a six-second download, was a Samsung Galaxy Note. These are newer devices. They just have different capabilities.

If you don’t have one or all of these devices in your pocket or in your test lab, you may not be aware that this is happening. What we want to do is spread the knowledge to help people understand. One of the best ways is to build a test lab with real devices and bring them out on real networks. The fallback is tools like Web Page Test or the Google Chrome dev tools, which will let you throttle the network and see what an experience feels like as a page loads.

Tim Kadlec--I don’t know if anybody is familiar with his work--does wonderful work documenting how the experience works with JavaScript specifically, and file optimization. He’s got some great resources that explain how to think through that process as well.

On the script and the CSS side, there are so many ways to think about loading parts of the page in a sequence, removing, in any way possible, things that block rendering, and putting your customers in control a little bit more. Here’s an exercise: when Wired did their redesign, we looked at their pages. Wired has beautiful design, and they use lots of custom fonts. One of the things they clearly wanted was to make sure the beautiful design and the custom fonts were really highlighted. But by making that decision, they put themselves in a situation where pages were taking 12 to 15 seconds to load. So we took the page they had, slightly changed the way the asset loading worked, and documented it, just to show what we believe: that it’s friendlier and more respectful to give your user something usable as quickly as possible. We’ve been working on some techniques to help make that happen.
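
One common pattern in that spirit, sketched here rather than taken from the Wired exercise itself, is to inline a small amount of critical CSS and load the full stylesheet without blocking render (the href is a placeholder):

```html
<style>/* small inlined critical CSS for above-the-fold content */</style>
<!-- Fetch the full stylesheet without blocking render, then apply it once
     it arrives; the noscript fallback covers JavaScript-off visitors. -->
<link rel="preload" href="site.css" as="style"
      onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="site.css"></noscript>
```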

When you optimize for speed and for caching, you want to look at your JavaScript first of all and make sure everything in it is actually needed. Then look at whether it’s needed on all pages, and structure your files accordingly.

We want to make sure that every site works without JavaScript initially. Render first: deliver HTML and CSS first, then do your JavaScript second. For any third party tool--social, tracking, ads, any of those--make sure it loads after the page in a non-blocking fashion, maybe even delayed to prioritize rendering, so that it only loads when it’s within a certain number of pixels of the viewport as the page scrolls. As much as possible, you want to deliver to people only what they really need.
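
A minimal sketch of deferring a third party tag until after the page has rendered (the URL is a placeholder, not a real tag):

```html
<script>
  // Wait for the page's own content before requesting third-party code,
  // so the tag can never block rendering.
  window.addEventListener('load', function () {
    var script = document.createElement('script');
    script.src = 'https://example.com/analytics.js'; // hypothetical tag
    script.async = true;
    document.body.appendChild(script);
  });
</script>
```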

There are a couple of techniques for deciding whether files should be delivered to a device at all. Cutting the mustard is a term that, I believe, the BBC development team came up with. The idea is that the first thing you do is ask nicely whether a device can use an asset at all, and only deliver it afterwards. They have a couple of techniques for doing that device testing.

We have a small script called Enhance.js that does the same thing. It looks at some capabilities of the browser and, if the device satisfies those capabilities, delivers files after the fact.

If a browser passes the test, it requests the enhanced JavaScript experience. If it doesn’t, you don’t load that file, which you now know the device doesn’t need. You also don’t load all of the secondary files that would depend on it, so you don’t load the custom fonts if you don’t think they’re going to be rendered - things like that.
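
A sketch in the spirit of the BBC test (the exact feature checks vary by project, and the file name here is illustrative):

```html
<script>
  // Cut the mustard: only capable browsers request the enhanced layer.
  // Browsers that fail the test simply keep the core HTML/CSS experience.
  if ('querySelector' in document &&
      'addEventListener' in window &&
      'localStorage' in window) {
    var enhanced = document.createElement('script');
    enhanced.src = '/js/enhanced.js'; // widgets, fonts, extras live here
    enhanced.async = true;
    document.head.appendChild(enhanced);
  }
</script>
```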

Custom fonts, I think, are a very interesting challenge as well, especially for our marketing teams; that’s something they really want to take advantage of. But I don’t know that every marketing person knows that some browsers will wait up to 30 seconds for a font before they time out, and sometimes they’ll time out and not load anything. That’s something you want to be aware of.

One of the things that we generally recommend is, again, that something is better than nothing. If you let custom fonts block, you’re going to get a blank bar like that for as long as it takes the font to load. We recommend a flash of un-styled text: finding a native font on each device that’s a close enough approximation to your custom font is an infinitely more satisfying experience for your customers. Then, when the font loads and caches, it becomes available and swaps in.
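
One way to get that behavior, sketched with the CSS Font Loading API (the font name and URL are placeholders; older browsers simply keep the fallback font):

```html
<style>
  @font-face {
    font-family: "Custom Serif"; /* hypothetical face */
    src: url("/fonts/custom-serif.woff2") format("woff2");
  }
  body { font-family: Georgia, serif; } /* fallback renders immediately */
  .fonts-loaded body { font-family: "Custom Serif", Georgia, serif; }
</style>
<script>
  // Swap the custom face in only after it has actually loaded, so text
  // is never invisible while the font file downloads.
  if (document.fonts && document.fonts.load) {
    document.fonts.load('1em "Custom Serif"').then(function (faces) {
      if (faces.length) {
        document.documentElement.className += ' fonts-loaded';
      }
    });
  }
</script>
```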

Again, for any of the developers in the group who are interested in the techniques for doing this, this presentation will be available. We also say that, if you can, tuning and sub-setting fonts so that you’re only delivering the glyphs you need can be a nice optimization as well. Every face of a font that you load is a separate file, so there are some nice techniques to really optimize within a font face and get to the smallest required set.

For anyone who is interested in it in more detail, my colleague, Scott Jehl, wrote a book last year called Responsible Responsive Design that is a fairly deep dive into the techniques of optimizing pages. It’s a quick read, but it’s full of good techniques.

This is a little bit dated, but here is another thing we always want to look at: MVC frameworks, their size and scale. They have a lot of benefits for developers, but they also render very differently on different devices, and with some of them the framework itself can be a really cumbersome loading moment before you even get to any content at all. I know some of these are no longer the most recent frameworks out there but, if you’re choosing a framework, make sure you’re building an assessment of its performance effect into your discussion of how you’re going to deliver the site.

Then the other thing that we like to talk a little bit about is thinking more about things on demand. A lot of wasted bandwidth happens because of a native behavior of the browser: the browser wants to bring in everything just in case, so it’ll be ready. But you need to counteract “deliver everything just in case” with “don’t deliver things that will never be seen and never be used.”

What we try to do is figure out techniques that deliver content on demand and, as much as possible, put the user in control. We never want to put a video in the body of a page and have it autoplay when it’s not in the visible screen, and certainly not when it’s not in an active tab, right? There are ways to tune your content so that it works respectfully with the user’s experience.
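
For the background-tab case, a tiny sketch with the Page Visibility API (the element id is made up for the example):

```html
<script>
  // Pause media whenever the tab is hidden; resuming is left to the user.
  var video = document.getElementById('promo-video'); // hypothetical id
  document.addEventListener('visibilitychange', function () {
    if (video && document.hidden) {
      video.pause();
    }
  });
</script>
```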

Whenever possible, we want to put the user in control. If you have a carousel with 40 images, you might load the first 2 or 3, but you don’t load the other 37 until the user has interacted with the first 2. There are ways to layer in experience based on user interaction that give people what they need in a timely fashion, but won’t weigh down the page and their data plan if they don’t need it.

This is a technique that we used for a retailer we worked with: on a 2G connection, we delivered a very, very tiny, heavily compressed image and, as the page scrolled, higher quality images would slot in. You can actually see them updating. We did this on 3G and 4G, and it felt instantaneous. On 2G, you can actually see it loading, but the higher quality assets only appeared when someone actually moved through the page.
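
A sketch of that upgrade-on-scroll pattern using IntersectionObserver (the data-src attribute and file names are conventions made up for this example; browsers without support simply keep the small placeholder):

```html
<img src="product-tiny.jpg" data-src="product-full.jpg" alt="Product photo">
<script>
  // Swap each tiny placeholder for its full asset as it nears the viewport.
  if ('IntersectionObserver' in window) {
    var observer = new IntersectionObserver(function (entries) {
      entries.forEach(function (entry) {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.getAttribute('data-src');
          observer.unobserve(entry.target); // each image upgrades once
        }
      });
    }, { rootMargin: '200px' }); // begin the swap just before it's visible
    document.querySelectorAll('img[data-src]').forEach(function (img) {
      observer.observe(img);
    });
  }
</script>
```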

The final thing we want to make sure we’re considering--again, both being ready for the future and thinking back to the base--is: don’t compromise accessibility. Chris mentioned the whole idea of progressive enhancement yesterday. There’s no reason to limit access to your basic content. We want to reach the largest possible audience. It’s really common courtesy, the equivalent of making sure that everyone can get in the door. If you ran a grocery store, would you tell people they couldn’t come and shop because they arrived on a bike instead of in a car? No. So don’t block people who arrive on a different device than the one you’re optimizing for.

From the U.S. perspective, this is a very compelling argument for many of our audiences, because there are a lot more firms starting to focus on lawsuits against companies that do not consider accessibility. About five years ago, Target paid a $6 million settlement because their site was not fully accessible; Netflix, almost $800,000. In the last year alone, in the U.S., there have been more than 240 lawsuits filed for accessibility violations, mostly against universities, retailers, and banks. The numbers are growing, and I think it’s largely because people are becoming smart phone dependent, relying on smart phone only sites, and accessibility can be compromised in that shift.

In those cases, progressive enhancement, an approach built on standards-based HTML and CSS, really is your friend. It ensures broad access and fault tolerance: it’s semantic markup, with features layered on based on capabilities, again using Enhance.js or a cut-the-mustard approach.

It’s not all or nothing. We have a lot of discussions with developers who think progressive enhancement means dumbing down. We want to really communicate that that’s not the case. It’s just thinking differently about how content, source order, structure, and interaction are layered.

We did a little bit of work with Lego. They wanted to make sure that their site was really accessible everywhere, but they had some unique branding elements. They wanted their buttons to be nicely rounded, and they wanted things like their select menus to be rounded too. On the left you’ll see two kind of ugly standard select menus and a button; on the right, the same controls styled beautifully. But both pages--the left without JavaScript, the right with--would work, and the purchasing would work. It seemed to us like a small compromise to have a slightly uglier select that still worked.

The same thing was true with visual feedback. The experience on the right feels more energetic. It’s more fun. But the experience on the left will work everywhere, and both of them can coexist. These are the kinds of experiences that we want to be layering in. We want to make sure that we can take advantage of JavaScript. We want to make sure that we can see those real time feedback opportunities, but we want to build them on top of a semantic framework that will work everywhere.

These can get really complex. You can do things like multilayered sliders that connect to each other and are based on simple select menus. The thing we need to be aware of is that semantic HTML is going to be accessible everywhere, and the problem is that JavaScript and CSS can break it. Our job is simply not to break things when we improve them.

Then there’s the WAI-ARIA spec, a layer of enhancement, again, that Chris mentioned yesterday, where you put roles onto objects in the page to make sure they’re accessible to people with screen readers, and also to bots and things like that, so it enhances the semantic richness of your experiences. This is as much a discussion we want to have with the non-technicians in our audiences as with the technicians, to help them appreciate how big an audience they might be missing and that there are really great resources and tools to help you do this well.
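
As one tiny illustration of that layering (the ids and text are made up for the example), a disclosure button can announce its state to screen readers through aria-expanded, while JavaScript only toggles attributes on otherwise semantic HTML:

```html
<button aria-expanded="false" aria-controls="details">Show details</button>
<div id="details" hidden>Full details go here.</div>
<script>
  // Keep the ARIA state and the visible state in sync on every toggle.
  var button = document.querySelector('button[aria-controls]');
  var panel = document.getElementById('details');
  button.addEventListener('click', function () {
    var open = button.getAttribute('aria-expanded') === 'true';
    button.setAttribute('aria-expanded', String(!open));
    panel.hidden = open;
  });
</script>
```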

From our perspective, because we feel really strongly about this and because we appreciate the way the community develops things, we create lots of small tools and assets that we release open source. If anyone is interested in seeing some of those tools: the widgets are generally accessible, they all have the ARIA roles built in, and everyone is welcome to go to the site and use them.

We also wrote a book about six years ago about designing with progressive enhancement and how to build it into your workflow. It’s a little bit dated, but surprisingly not that much. For anyone who wants a really deep dive on rethinking a design process to take progressive enhancement into account, it’s out there.

Progressive enhancement and the other facets of performance optimization, we find, really do liberate us to focus on great design opportunities that will get us toward the vision that Sir Tim Berners-Lee set out for us.

I have a minute and 48 seconds left, so thank you.

Audience: [Applause]
