Robin Christopherson: Morning. Is this on?
Audience member: Yeah.
Robin: Yeah, cool. Great. Thanks for that, Marc. That was really, really brilliant.
I’m going to talk about accessibility or, rather, why accessibility is no longer what we should be thinking about. We should be thinking about something really quite different. I can’t see at all, and I would normally go with -- oh, sorry.
JAWS: Slide 3: This is how it started. Slide 2.
Robin: Oh, does that come up right? My dog.
Robin: [Laughter] Yeah, this is Archie, and he hasn’t had all the right jabs, so he isn’t with us today.
Marc Thiele: Sorry, there was no audio on your machine.
Robin: I don’t need it at the moment.
Robin: Yeah, I’ll do it in a sec.
Marc: You’re good?
Robin: Yeah. Cheers. Yep.
Marc: Oh, sorry.
Robin: Thanks. [Laughter]
That’s my guide dog. I can’t see, and that’s why you can probably hear this voice at the front here, JAWS, although it’s a nice lady. Yeah.
This is going to be a presentation, which will be based upon a number of demos and practical things, so not that many slides. Hopefully, it’ll be really interesting.
JAWS: Slide 3: This is how it started.
Robin: Very, very briefly, back in the ‘80s, this is how it all got started with the PC. This is a very, very standard looking setup here. In the U.K. now, at least 90% of jobs use a computer in some shape or form. I’m sure it’s the same elsewhere, certainly here in Germany. But then ten years ago--
JAWS: Slide 4: The age of a small screen….
Robin: --something happened, which was the advent of the smartphone, and everything changed. All of the power, all of the potential of technology is now with you all the time, and that’s hugely, hugely empowering. Hands up, people who love their smartphones.
Now this isn’t going to work because I can’t see, so somebody in the front row--
Robin: I need an official estimator. Somebody in the front row, how many? Who is going to be my estimator in chief?
Robin: Thank you. Thanks, John.
Robin: Cool. See, now you’re my official estimator in chief. Okay? I’ll ask you again. Cool. Yeah, 95% of people, we all love our smartphones. They’re with us all the time. But, with the advent of the smartphone, we are now plunged into the age of extreme computing, computing on the edge.
What we saw a moment ago, you’ve got your setup. If it’s a bright, sunny day, then you can pull the blinds down if it’s shining on your monitor. If you can’t see the text, you can change the text on your monitor, or you can get a larger monitor. If you can’t use the keyboard, you get a different one, et cetera.
But now with the age of extreme computing, as we can see here, this lady is squinting at her phone. She probably hasn’t got any kind of vision impairment, but maybe those are reading glasses or they’re long-distance glasses. But every one of you in this room, apart from me, knows about when you’ve got a bright, sunny day. You’ve got a small, shiny sheet of glass. If there has been a poor choice of default font size in an app, for example, a poor typeface choice, a poor color contrast or color palette choice, then you’re going to struggle. This is the age of extreme computing. Everybody is temporarily disabled or impaired on a daily basis.
JAWS: Slide 5: And the one-handed juggle.
Robin: Here we’ve got somebody holding a cup of coffee with their phone in the other hand. We’ve all done this, probably today already. You’re trying to juggle your phone one-handed. You’re maybe just using your thumb. This guy is temporarily motor impaired, juggling his phone one-handed. He’s got a cup of coffee in the other hand, so maybe he’s on a coffee break. Maybe he’s time constrained because, ideally, we’d have all day coffee breaks, but that’s not how it works. He’s only got a certain amount of break time, and he wants to complete a transaction. Maybe he’s only got a minute and a half left of his break, and he’s quickly doing something.
What does he need in that time? He needs extreme UX to be able to complete that transaction on his app or mobile website in the time that he’s got allocated. But for people with a cognitive impairment or learning difficulty, they might need that extreme UX to be able to complete that transaction full stop, regardless of how much time they have. To be able to do it successfully, unsupported, they need extreme UX as well. Whether you’re temporarily motor impaired because of juggling your phone one-handed while you have a cup of coffee in the other hand or, like me, with a guide dog in the other hand, or whether it’s time constrained, you’re all temporarily impaired on a daily basis.
If this wasn’t coffee, if this was alcohol, for example, you’re out on a night out, what do you do at the end of the day? You call a cab, an Uber, or something, and you might use an app for that. Just think of how extremely usable the Uber app has to be for people who have had a number of glasses of something to be able to use it successfully. Alcohol makes you temporarily, what, cognitively impaired, motor impaired, not hearing impaired, perhaps literacy impaired. There are so many parallels here.
The main message I’m kind of conveying if you don’t take anything else away, it’s that accessibility is no longer for disabled people. Accessibility is for every single one of your users, every single one of your customers on a daily basis. As soon as you start thinking about accessibility as being for every single customer, then it suddenly becomes much more important. It should become central to your thinking, to your planning, to the decisions right from the beginning of a project, right through to go live, and onwards.
When that is no longer a bolt-on activity, when accessibility becomes core, when it becomes inclusive design, then, because it’s no longer a bolt-on, it can’t be dropped when push comes to shove. You will be creating products and services that are going to be fit for purpose in this extreme computing age.
JAWS: Slide 6: We don’t all use oversized phones just yet.
Robin: I couldn’t find a picture of someone with great mutant sausage fingers. Just imagine this person has got huge, fat fingers. Hands up who has got huge, mutant fingers.
Robin: Any hands?
John: You have eight.
Robin: Okay. Yeah, just imagine this person has got huge, sausage fingers. Yes, again, we don’t all have massive tablets. I know phones are getting bigger, but we don’t all carry these great big phones around.
Often that person is going to find it difficult to tap on small areas that are close together. You guys probably know the deal: 44x44 px is the smallest tappable size you should have on mobile, and there should be good separation between targets, et cetera. This chap is permanently motor impaired now that he’s switched from a big keyboard and mouse to this tiny, little, small screen.
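That 44x44 px guideline is easy to turn into an automated check. Here is a minimal sketch; the interface and function names are mine, invented for illustration, not from any real testing library:

```typescript
// Illustrative check for the minimum tap-target guideline mentioned above.
// The 44x44 px figure is the commonly cited minimum (e.g. Apple's Human
// Interface Guidelines); the names TapTarget and meetsMinimumTapSize are
// made up for this sketch.

interface TapTarget {
  width: number;  // rendered width in CSS pixels
  height: number; // rendered height in CSS pixels
}

const MIN_TAP_SIZE_PX = 44;

function meetsMinimumTapSize(t: TapTarget, min: number = MIN_TAP_SIZE_PX): boolean {
  // Both dimensions must reach the minimum; a 30x60 strip still fails.
  return t.width >= min && t.height >= min;
}

console.log(meetsMinimumTapSize({ width: 44, height: 44 })); // true
console.log(meetsMinimumTapSize({ width: 30, height: 60 })); // false: too narrow
```

In a real audit you would feed this the rendered bounding boxes of interactive elements (for example from `getBoundingClientRect()` in a browser) and also check the spacing between adjacent targets.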
JAWS: Slide 7: Cars and mobiles don’t mix.
Robin: Why have I got a picture of a blank phone in a car? That’s because, basically, this is the UI. This is the interface when you’re driving in a car, or at least it should be. There should be zero interaction or very limited interaction.
If people have used Waze or any kind of satnav, you know that it has got a big, clear UI so that you can just glance at it and get the information really quickly. With Waze, you can wave at the camera on the phone to report a speed camera or roadkill, whatever it might be, heavy traffic, and you can do that by voice when the menus come up after you’ve waved at it. It’s a different approach to the UI.
It’s these choices, these considerations of extreme use cases, of edge use cases, that make products or services fit for purpose in this extremely diverse world. When you’re dealing with those people who need to use speech output, for example, because they can’t look at the screen, then you’re also going to be catering for someone like myself who uses speech output all the time.
JAWS: Slide 8: Many mobile users do have a disability.
Robin: We had a hands-up a minute ago about people who love their smartphones. I’m sure some of you, probably 95+% of you, really, really love your smartphones. We’ve been talking about temporary impairment here, but people who do have a disability, a permanent disability with a capital D, love their smartphones too, because people like me don’t have other options. I can’t use a pen and paper.
I used to carry a backpack of equipment around, a talking GPS, a talking notetaker, a talking barcode scanner, a talking MP3 player. Now all of those have been replaced by one device, the same device as everybody else is using, and that’s super cool. All the apps on there are free or a few pence or cents compared to these hundreds and hundreds of pounds in this huge backpack with all the varying individual chargers and that sort of thing.
You will have customers that have got disabilities, actual disabilities, permanent disabilities, and they will be hugely appreciative that your app, your website isn’t the broken link in the chain of their experience. The technology is here. The gadgets, the interface. Accessibility is built into so many devices these days. Please don’t be that kind of missing link in the chain, which makes their life incredibly difficult.
JAWS: Items: A multi-select list box to….
Robin: I’m just going to quickly plug the audio in. Because we’re in a different country here, or at least I am, I could show you a million different examples of how the smarts within a smartphone are being used by people. Let’s just focus on a couple. These are mainstream applications. I’ll just fire this up.
Robin: Nice, groovy music, but I might actually--
I’ll leave it on. You guys can still hear me. This is called Word Lens. It used to be a free, standalone app in the store. It was bought by Google a while ago, and now it’s in Google Translate. If you’ve got iOS or Android and you’ve got Google Translate, then you’ve got this. Just tap on the camera button, and it’s using AR, augmented reality, to magically translate the text.
I’m going to mute that because it’s getting on my nerves. [Laughter]
Yeah, it’s kind of magic, isn’t it? It sort of tries to retain the original look and feel, the fonts, colors, and the original background image, et cetera. We’ve used this all the time since we’ve been here over the last couple of days. We’ve pointed it at menus, at posters, all different things. It just magically translates it. Now that’s a mainstream thing. It’s cool because it’s an AR sort of thing.
For someone with an impairment, it’s even cooler because what’s the alternative? It’s like manually typing things in. For me, it would be taking a photograph or something, perhaps, and running it manually through a translation, et cetera.
What’s cool for you guys is super cool for people with disabilities. Increasingly, the APIs associated with this machine learning smartness are available to you free of charge for a low volume usage.
I’ll just close that.
Robin: By the way, I’ll tweet and email, to Marc, links to everything that I’m going to cover because I’ve got a really annoying habit of not showing any of the videos that we’re going to look at today.
Robin: Google announced the other day the Pixel Buds, which are like their answer to the AirPods that Apple does. One of the really cool features, sticking with translation for a second, is that they have this kind of Babel fish translation capability.
JAWS: The LC--
Sean Hollister: Google’s Pixel Buds, the company’s first wireless headphones, are going to take on Apple’s AirPods at the same $159 price. What they do that Apple’s AirPods don’t is real-time language translation. You can use the Google Translate app just by holding down on the right earpiece for a moment. It’ll summon the Google Translate app on your phone. You can hand it to another person and talk to them in another language, back and forth between Japanese and English, for instance.
Male: [Speaking Japanese]
Female: Where are you from today?
Male: Where are you coming from today?
Female: I am in San Francisco.
Translator: [Speaking Japanese]
Male: [Speaking Japanese]
Female: Shall I see a movie?
Male: I was hoping we’d see it together, but that’ll work.
Female: Yes, we should see a movie together.
Translator: [Speaking Japanese]
Sean: They can also read notifications from your phone, allow you to make other Google assistant queries, stop and start your music by tapping on the right earbud, or change the volume by swiping backwards and forwards.
Robin: I’ll pause it there. That’s really cool. Now, actually, it’s the Google Translate app that’s kind of doing all the smarts, but you can trigger it through the Pixel Buds, et cetera. If anyone, again, has got that Google Translate app on either their iPhone or Android device, then you’ve got that. For the Word Lens a moment ago, just tap on the camera.
For this one, just tap on the microphone, obviously after you’ve chosen your translation languages, German, English, or whatever it might be. But if you tap and hold on the microphone, then it will stay locked, and you guys can just have a conversation. It will automatically detect which language is which. Tap and hold on that microphone, put the phone on the table between you, talk in German; it will auto-detect and speak it back in English, and vice versa. Super cool.
We really have moved into sort of the future today.
JAWS: For Seeing AI-Microsoft.
Robin: That’s really well demonstrated with this app. It’s not available everywhere yet. It might be available in Germany or other EU countries; I’m not sure. It’s in the States at the moment, but it’s not available in the U.K. It’s seeking approval because, for some reason, I don’t know, it’s been classed as a medical application.
It’s called Seeing AI and, as I mentioned before, the APIs associated with the machine learning of the different platforms--Google and, in this case, Microsoft--are freely available. These cognitive services APIs are what give you object recognition, text recognition, natural language recognition, and language translation out of the box. As a developer, you don’t have to recreate any of these and, as I say, for low-volume use it’s free of charge from both Microsoft and Google.
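To give a feel for how little code such a cognitive-services call takes, here is a rough sketch. The endpoint path, header name, and JSON shape follow Microsoft’s Computer Vision "describe" REST API as commonly documented, but treat them as assumptions and check the current docs; the region, key, and sample payload below are placeholders, not real values or real API output:

```typescript
// Sketch of calling an image-description cognitive service.
// ASSUMPTIONS: endpoint path, header name, and response shape are modelled
// on Azure Computer Vision's "describe" operation; verify against current
// documentation before use. Key and region are placeholders.

interface Caption { text: string; confidence: number; }
interface DescribeResponse { description: { captions: Caption[] }; }

// Pure helper: pick the highest-confidence caption from a response.
function bestCaption(r: DescribeResponse): string | null {
  const sorted = [...r.description.captions].sort((a, b) => b.confidence - a.confidence);
  return sorted.length > 0 ? sorted[0].text : null;
}

// Network call (not executed here); free tiers typically cover low-volume use.
async function describeImage(imageUrl: string, key: string): Promise<string | null> {
  const res = await fetch(
    "https://westeurope.api.cognitive.microsoft.com/vision/v3.2/describe",
    {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": key, // placeholder subscription key
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ url: imageUrl }),
    }
  );
  return bestCaption(await res.json() as DescribeResponse);
}

// Illustrative sample payload (invented for this sketch, not real output):
const sample: DescribeResponse = {
  description: {
    captions: [
      { text: "a man doing a trick on a skateboard", confidence: 0.87 },
      { text: "a person outdoors", confidence: 0.61 },
    ],
  },
};
console.log(bestCaption(sample)); // highest-confidence caption
```

The point is not the specific vendor: both Microsoft and Google expose this kind of capability behind one HTTP call, which is exactly what apps like Seeing AI build on.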
Let’s have a quick look at this.
JAWS: The LC--
[Snap. Bag rustling. Door handle rattle. Street traffic.]
Saquib Shaikh: I’m Saquib Shaikh. I lost my sight when I was seven and, shortly after that, I went to a school for the blind. That’s where I was introduced to talking computers, and that really opened up a whole new world of opportunities.
I joined Microsoft ten years ago as a software engineer. I love making things, which improve people’s lives. One of the things I’ve always dreamt of since I was at university was this idea of something that could tell you at any moment what’s going on around you.
[Skateboard wheels clanking. Camera snap.]
A.I.: I think it’s a man jumping in the air doing a trick on a skateboard.
Saquib: I teamed up with like-minded engineers to make an app, which lets you know who and what is around you. It’s based on top of the Microsoft intelligence APIs, which makes it so much easier to make this kind of thing. The app runs on smartphones, but also on the Pivothead SMART glasses. When you’re talking to a bigger group, sometimes you can talk and talk, and there’s no response. You think, is everyone listening really well, or are they half asleep?
Robin: I get that all the time.
Saquib: And you never know.
A.I.: I see two faces: a 40-year-old man with a beard looking surprised, a 20-year-old woman looking happy.
Saquib: The app can describe the general age and gender of the people around me and what their emotions are, which is incredible. One of the things that’s most useful about the app is the ability to read out a text.
Waitress: Hello. Good afternoon. Here’s your menu.
Saquib: Great. Thank you. I can use the app on my phone to take a picture of the menu, and it’s going to guide me on how to take that correct photo.
A.I.: Move camera to the bottom right and away from the document.
Saquib: And then it will recognize the text. Read me the headings.
A.I.: I see appetizers, salads, paninis, pizzas.
Robin: I’ll just pause it there. There’s nothing new there. Object recognition is built into all of these devices. It’s in your Facebook app, for example. If you had voiceover running, it would tell you all of these things. Character recognition, text recognition has been around for a very long time as well, but it’s bringing it all together and, hopefully, you can see that there’s a massive application, a massive benefit for people like myself who can’t see, for example.
Robin: Now, that isn’t quite here yet. It’s been held up with red tape, but there’s something that I use quite a lot, which is as good and very low tech. Well, actually, no. It’s high tech; but, comparatively speaking, it basically just connects someone who can’t see with a willing pair of eyes. You guys could do this. You could download this right now if you’re getting bored. [Laughter]
Robin: Get your phones out. It’s called Be My Eyes. It’s on iOS and Android. I’ll just give you a quick flavor of what it does, but you guys can help.
JAWS: The LC Media Player.
[Street noise. Elevator hum.]
Male: You might wonder how blind people deal with everyday challenges. Well, normally the answer is simple.
Male: They’re not that different from you. We play music. We go to school. We go to work. You get the picture.
Male: But sometimes the simplest things can be difficult, and we need a pair of eyes.
A.I.: …Connect to--
Karen: The first available helper.
Male: That’s where you come in.
A.I.: Establishing video connection.
Male: Through your smartphone, Be My Eyes connects the blind with sighted people through a live video connection. Simply choose if you need help or want to help by the click of a button.
Female helper: That’s a nice picture of you and your family, Karen. Is it for a present?
Karen: Yes, it’s a photo for my parents.
Male: You can help just by installing the Be My Eyes app.
A.I.: Print image.
Male: And we’ll notify you when someone needs your help. If you’re in the middle of something, don’t worry. Someone else will step in.
Male: That milk is way too old.
Robin: I’m going to stop it there. Sorry about that. Yeah, you guys could help out like that on a daily basis for people who have a permanent impairment. Object recognition, fantastic. Text recognition is really moving along in leaps and bounds, but sometimes you just need a human at the other end to tell you how many minutes are left on your dishwasher cycle, for example, or which Fn-plus-function-key combination changes the screen brightness, or whatever it might be.
JAWS: Six: dot-to-dot--
Robin: I’ve got a slide up here. Does that come up all right?
Robin: Okay. Hands up. Who has got an Echo or any kind of home assistant: Google Home, Microsoft INVOKE? Anyone, chief estimator?
Robin: A few?
John: Can you do it again?
Robin: [Laughter] All right, then who likes the idea of these home assistants? They’re available, I think, in Germany, aren’t they? Yeah.
John: About 10%.
Robin: Whoa! Okay, then. I guarantee that those 10% absolutely love them because, once you’ve tasted this idea of what is now being called ambient computing, well, I would argue it’s the third generation or age of computing: first the PC, then mobile computing, and now ambient computing. It’s this next edge or use case, which is going to become more and more mainstream.
Just to talk to the air. You’re not even aware of where the device is, perhaps. You’re just talking to the air, and you’re getting utility. You’re getting information. It’s performing tasks for you. It’s playing music, whatever it might be.
These devices are beloved, and they are increasingly becoming used by people with an impairment as well, because, just think about the simplicity. There are obvious considerations for people with a hearing impairment, for example. But then the app will display everything that, in this case, the Echo speaks out, and you can also type to talk to it using the app as well.
Now, I do a daily podcast; I’m that enamoured with the Echo. This is the small version, the Dot. It’s only 40 or 50 pounds, and it does everything the other ones can do. There are literally a dozen different models of Echo. There are several models of Google Home. Apple is bringing out the HomePod towards the end of the year.
Watch this space. Ambient computing and smart home assistants are going to be built into cars and washing machines. Amazon is bringing out a pair of glasses towards the end of this year that will have Alexa built in.
I do a daily podcast, if anyone is interested. Each day we look at a different skill; we demo a different capability of the Echo. There are about 2,000 skills a week coming out for the Echo.
JAWS: Six: dot-to-dot. Seven: Echo smart home demo.
Robin: That’s my lady. She sounds nice, doesn’t she? Maybe a bit fast for comfortable listening for you guys, but anyway--
Here’s a very quick clip of me demoing the Echo. This is a video episode that I did. You’ll be able to see readily after this what application it is for people with a significant physical impairment, for example.
Robin: Hi. It’s Robin here, and I’ve been looking at the Wasserstein smart plug and colorful lightbulb, and how we can operate them using the Echo, so let’s give it a go.
Alexa, turn on the lamp.
Robin: So that’s white, and it’s fully bright. Let’s make it half as bright. Alexa, turn the lamp 50%.
Robin: So now it’s half as bright. I can’t tell, but I’m hoping you guys can because I can’t see. Alexa, turn the lamp green.
Robin: Alexa, turn the lamp 100%.
Robin: Alexa, turn the lamp blue.
Robin: Alexa, turn the lamp red.
Robin: Now there’s a disco mode that you can do where it pulses different colors in time with the beat of the music, but unfortunately that can’t be operated through the Echo. You have to use the app, which unfortunately isn’t accessible to me using VoiceOver. But that’s the lamp. Alexa, turn off the lamp.
Robin: All of this commanding of that smart, colorful lightbulb has made me very hot, so I’m going to turn on the fan using the smart plug. This is a dumb fan that’s just plugged in through a smart plug. Alexa, turn on the fan.
Robin: Ah, oh, that’s better. So we labelled the smart plug “the fan” when we set up the smart plug, and that’s why I can say, “Turn on the fan.” If you want to connect it to some other device, then obviously you would call it that. You can just say “turn on” or “turn off” the device that you want to. So, ah, that’s nice. I’ll just turn it off now. Alexa, turn off the fan.
Robin: Ah, so obviously for people with a disability, this would be hugely empowering to be able to control their environment like that, but it’s also really cool. Cool. So this is Robin signing off. Thanks a lot. Bye.
AbilityNet: adapting technology; changing lives.
Robin: Ooh, I’m going to have to turn that down.
JAWS: Dot to Dot Daily podcast.
Robin: [Laughter] So you can see the application there. It can do so many things, including controlling your environment. That smart plug from Wasserstein is 16 pounds. The smart light bulb with 16 million colors is 16 pounds as well. We’re not talking about unaffordable here. This is mainstream and, with mainstream, comes an affordable price tag. It’s absolutely fantastic.
JAWS: My dot -- Dot to Dot daily podcast -- my dot, my dot--
JAWS: Beyond Tellerrand-Berlin … Echo-Saturday Night Live.
Robin: So, to show how much the Echo and other smart devices have entered the public psyche, this is from America’s Saturday Night Live. I don’t know if people are aware of it, a very famous, long-running comedy show in America on Saturday nights, surprisingly. They’re doing a parody of the Echo: a special edition of the Echo aimed specifically at the older generation.
Male: The new Amazon Echo has everyone asking Alexa for help.
Older male: Alyssa, what time is it? What the hell is wrong with this blasted thing? Amanda!
Male: But the latest technology isn’t always easy to use for people of a certain age.
Older male: These kids done bought me a busted machine again. Odessa!
Male: That’s why Amazon partnered with AARP to present the new Amazon Echo Silver, the only smart speaker device designed specifically to be used by the greatest generation. It’s super loud and responds to any name even remotely close to Alexa.
Male: So they can find out the weather.
Older female: Allegra, what is the weather outside?
Alexa: It is 74 degrees and sunny.
Older female: Huh?
Alexa: It is 74 degrees and sunny.
Older female: Where?
Older female: What about it?
Alexa: The temperature outside is 74 degrees and sunny.
Older female: I don’t know about that.
Male: The latest in sports.
Older male: Clarissa, how many did old Satchel strike out last night?
Alexa: Satchel Paige died in 1982.
Older male: Yeah, how many did he get?
Alexa: Satchel Paige is dead.
Older male: In what, now?
Older male: Who did?
Alexa: Satchel Paige.
Older male: Oh. I don’t know about that.
Male: Even local news and pop culture.
Older female: Juanita, what them boys up to across the street?
Alexa: They are just playing.
Older female: They what now?
Alexa: They are just playing.
Older female: You say they’re just playing now?
Alexa: Yes, they are just playing.
Older female: I don’t know about that.
Male: Pair it with smart devices like your thermostat.
Older female: Alexandra, turn the heat up.
Alexa: The room is already 100 degrees.
Older female: Are you trying to kill me, Alizae?
Male: The new Amazon Echo Silver plays all the music they loved when they were young.
Older male: Angela, play black jazz.
Alexa: Playing, uh, jazz.
Male: It also has a quick scan feature to help them find things.
Older female: Amelia, where did I put the phone?
Alexa: [Loud exhale] The phone is in your right hand.
Male: And it has an “uh-huh” feature for long, rambling stories.
Older male: But then I gave him five dollars, and he said I only gave him one dollar.
Older male: I said, I know I gave you a five.
Older male: Because I only had a five and a one on me.
Older male: And this is the one dollar right here.
Older male: So, I mean, you tell me who’s crazy.
Male: Amazon Echo Silver: get yours today.
Robin: I’ll just stop it there. Cool.
Robin: Okay, that’s a bit--
Robin: --you know, a bit flippant, maybe.
JAWS: Nine: Who done it?
Robin: The serious point there is that we’ve now moved into an age where you have the absolute broadest audience possible accessing your content, your functionality, the things that you’ve lovingly designed and coded. You really do need to think about the most diverse audience possible, so inclusive design really should broaden the default set of personas that you use in any new project.
Hands up who has a set of personas within their organization that they use. Chief, estimator in chief?
John: About 10%.
Robin: Okay, 10%. Cool. Hands up; of those, how many of them have a disability, for example, represented?
John: One person.
Robin: Yeah, okay. Cool.
Robin: We really do need to think out of the box. Inclusive design is where you’ve got a really broad audience in mind -- and we saw that here with the greatest generation accessing that content.
Robin: I should turn that down.
JAWS: Slide 1: Who done it? Firemen help firemen.
Robin: Okay. We’re going to take a left turn here into a whodunit. Okay? A murder mystery. We’ve got four people here: a fireman, a banker, a truck driver, and a doctor. Okay? Are you picturing them in your head? You’ve got four characters here. One of them is a murderer. Okay?
Four people, now they’re in this room. Okay. They could be playing cards around a table. I don’t know. But the police are after them and, well, after this murderer. The only thing they know is that he’s a man, a he, a male murderer, and they’ve had a reliable report that he’s one of these four people in the room. Okay?
By the way, if you know this or if you kind of have it already, don’t shout out straightaway. Let people have a chance to think.
The police, they storm the room. They throw open the door, and they instantly know which of the four people it is. How?
Female: Holding the bloody knife.
Robin: [Laughter] Looking really guilty. No, it’s much more obvious than that. A fireman, a banker, a doctor. The clue is in front of you. Should I tell you?
Male: There’s only one man in the room.
Robin: Got it. The fireman. Okay? You probably -- hands up, who pictured four guys? Chief estimator?
John: Yeah, about 50%.
Robin: Yeah, okay, diverse personas. We can’t really assume anything. We probably need to all accept the fact that we have both conscious and unconscious biases, prejudices, preconceptions.
JAWS: Slide 2: The fireman dunnit!
Robin: It was the fireman what dunnit. Let’s take a very quick look at these four characters. Let’s kind of put to one side the fact that this guy is a murderer for the time being. [Laughter] Okay. We’ll just gloss over that.
Very quickly, let’s look at these guys. Imagine that these were your personas, for example. Those four characters have got a very different demographic, different income levels, different levels of education, perhaps different backgrounds and different ages. But let’s put all of those to one side. Let’s very quickly look at their impairments, for example, and think about how those could weave into the personas that you have in your organization or that are in your mind when you’re thinking about projects that you’re undertaking to try and make your things as inclusive as possible.
JAWS: Slide 3: Shuttle--
Robin: This fireman has dyslexia, and people with dyslexia are actually really good problem solvers because the dyslexic brain is born with fewer direct left-right connections. More peripheral connections are forged over the years to compensate for that lack of direct left-right connections. As a result, they’re really good lateral thinkers. They’re really good at thinking out of the box. They’re really useful to have in a tight spot.
Now, why have I got a picture of a shuttle here? Well, when NASA was sending up crews on the shuttle, those crews of seven always--they made sure--had one person with dyslexia because they’re very useful to have in a tight spot. A fireman would be really useful to have in a tight spot. But with dyslexia, there are the obvious considerations: sensible choice of typefaces, sensible choice of color palette, avoiding full justification of text so you don’t get rivers of white running through the copy. That’s the fireman.
JAWS: Slide 4: Banker has high functioning autism.
Robin: Here’s the banker. She’s a she, and she’s got autism, high functioning autism, which makes her fantastic--
JAWS: Slide 5: Stock....
Robin: She’s a stock trader, a market banker, and she can spot patterns really well, which makes her really, really good at her job. But because of her autism, if she has to fill out an online form where it says, for example, “time at current address,” we all know what that means, don’t we? How many years have you been living at your current address? But what would she do? She’d look at the clock, and she’d write down 10:14 or something like that: time at current address. There are loads of implications--with an appreciation of this kind of diverse set of users that you absolutely, undoubtedly will have--that would inform your decisions as you go through.
JAWS: Slide 6: Doctor still uses DNS....
Robin: Here’s the doctor. She hasn’t got a permanent impairment, but she broke her wrist six months ago, and she had to start using Dragon, the dictation software. She found that she could now dictate her reports at 300 words a minute. I mean a lot of doctors do have support, and they do dictate into a little machine, don’t they? But she was able to crank out her reports really, really quickly. She bought an additional pack of medical terminology so that it would really get all of those complicated words right in her reports.
When her wrist mended, she still carried on using it because, why not, you know. For her, though, for example, on a web form, if you’ve got a graphical button that says, “Go,” but the label in the HTML is “Search”--say it’s a search form with a little “Go” button--she could be saying “click go” to her Dragon until she’s blue in the face, and it won’t submit that form because that’s not what the actual HTML is looking for.
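The mismatch Robin describes--a button that visibly says “Go” while its markup names it “Search”--is exactly the kind of thing an automated audit can flag. Here is a minimal sketch of such a check, not anything from the talk: it uses Python’s standard-library HTML parser, and the `data-visible-text` attribute is a made-up stand-in for the text painted on the button image (in a real audit a person, or OCR, would supply that).

```python
from html.parser import HTMLParser

class NameMismatchChecker(HTMLParser):
    """Flags image buttons whose visible text differs from the alt text,
    which is the name speech software like Dragon matches against."""
    def __init__(self):
        super().__init__()
        self.mismatches = []

    def handle_starttag(self, tag, attrs):
        if tag != "input":
            return
        a = dict(attrs)
        if a.get("type") != "image":
            return
        # Hypothetical attribute standing in for the text shown on the image.
        visible = a.get("data-visible-text", "")
        accessible_name = a.get("alt", "")
        if visible and visible.lower() != accessible_name.lower():
            self.mismatches.append((visible, accessible_name))

checker = NameMismatchChecker()
checker.feed('<form><input type="image" src="go.png" '
             'alt="Search" data-visible-text="Go"></form>')
print(checker.mismatches)  # [('Go', 'Search')]
```

Each reported pair is a control where saying the visible label (“click go”) would fail, because the speech software only knows the programmatic name.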
JAWS: Slide 7: Truck driver in a wheelchair.
Robin: The truck driver is actually the only person, of all these people, who has a visible impairment. Please don’t equate disability with something visible. Many times it is visible, but often it isn’t.
JAWS: Slide 8: Adapted period.
Robin: She uses an adapted truck, a lorry, to get around. She’s got no pedals; the controls are up on the steering column instead. What are her technical requirements with regard to using a smartphone or a computer? None. None at all, so a disability doesn’t necessarily equate to something that you guys need to consider.
If she is using a desktop, for example, then the arms of her wheelchair get in the way, so she can’t use a normal desk. The answer isn’t an adjustable desk--if you crank it up so she can get the wheelchair under, then she’s typing under her chin. That’s not the answer. It’s a wheelchair tray, a lap tray, with a compact keyboard--you know, the ones without the number pad--and a trackball. But that hasn’t really got an implication for what you guys would need to do or think about when you’re designing.
JAWS: Items ... really going mobile.
Robin: Let me just check what time we’ve got.
Robin: Ooh, okay. We haven’t got a lot of time. [Laughter] I’m going to just skip over.
Robin: I was going to talk about--
JAWS: Ready. Slide 1--
Robin: --who is old enough?
Robin: Okay. We’ve talked about A.I. earlier, quite extensively. Now, the lorry driver lady has nicely moved us into getting really mobile, so we’ve got The Hoff and KITT, the car, from Knight Rider, which combines autonomous vehicles with A.I. and that sort of thing. If anyone ever uses RSS to get your articles, just put in a search for “autonomous,” and every single day you’ll get a dozen articles from the Internet about how autonomous vehicles are just around the corner. Excuse the pun.
JAWS: Items in multi-select list box. Even ... Google....
Robin: I won’t show you this clip about what the computer sees. You can have a look at that when I text that around--
Robin: --when I tweet that around. What I do want to show you--
JAWS: The LC--
Robin: Just start up in a sec.
Robin: Does that come up all right?
Robin: Nice music, but I’m going to mute it.
Robin: So, you want a bit more?
Robin: Okay. So, why am I showing you this? This footage is 110, 111 years old, from San Francisco. Well, again, articles are coming left, right, and center about how this is actually what the future of inner-city traffic is going to look like again. Okay? It’s a relatively new phenomenon for pedestrians to be confined to the few meters of concrete in front of the faces of buildings on either side of the street and having to wait at the ampelmann to cross the road.
But, in the future, when a lot of cars are smart, it will be like this: a real free-for-all where people are much more at liberty to go where they like, to cross the road when they like. It could see the demise of the ampelmann. I don’t know. Maybe it could.
Do people know what I’m talking about? Who doesn’t know what the ampelmann is?
Robin: [Laughter] What? How many?
John: They all know.
Robin: Okay. Fine. Sorry. It just wasn’t funny. Okay. Fine. [Laughter]
Robin: [Laughter] Yeah, so this is what it’s going to be like. An article I read a couple of days ago--it must have been the 20th in this vein--says that in the future it will be 20-mile-an-hour speed limits and free-flowing traffic: no junctions, no lights, no intersections, just a complete free-for-all. Watch this space.
What impact has it got for you guys, and why am I showing it to you? Well, the car will be yet another forum, another arena where people consume your content, whether it’s on their smartphones, whether it’s an assistant built into the car, whether it’s in their Amazon Alexa glasses. Whatever it might be, people are going to be consuming your content on the go. The streets might be bumpy. They might be temporarily motor impaired, et cetera. They might not have a very long ride in which to do the thing they want to do. Obviously, again, for disability, huge implications. But if the Uber app isn’t accessible, I won’t be able to participate in this self-driving future or own a car of my own, for example.
Robin: I’m going to have to skip over that one as well, I’m afraid.
JAWS: 14: Capture....
Robin: What I would like to do now, just finish off with a very quick audience participation exercise. We all know what this is, right? They’re evil, right?
Robin: Luckily, you don’t see them that often, and there are some that replace it, you know, a tick box, which says, “I am not a robot,” but that doesn’t work on mobile for many people. They’re trying to get away from this, but I’ll explain in a minute why it’s still relevant for me to talk about CAPTCHA.
Now, CAPTCHA, don’t forget, doesn’t exist on -- they don’t use it on simple forms. They use it on important forms, forms that they don’t want people to be able to spuriously submit. It’s only on important forms. The form might have taken, particularly if you’ve got a disability, half an hour to fill out, or a long time. It’s on page five of five. Oh, blimey, there’s a CAPTCHA.
Now, you guys can probably see that quite well, but then you’ve probably got 20/20 vision. There’s an audio version, usually behind a wheelchair symbol--quite an odd choice of icon, but anyway--so I would click the wheelchair, and it would give me the audio one, which you guys are about to listen to. Now, what I’d like you to do is try your absolute hardest. If you’ve got pen and paper, brilliant. If you haven’t, please open your phone and find an app where you can quickly punch in letters and numbers, because it will be a combination. Or, if it’s something you can scribble on, you know, a drafting app, then maybe you could just write the code down with your finger. If you don’t mind, I’d really like to have you guys participating, trying to listen to what I have to listen to, and writing down the code. Is that all right?
Robin: Cool. It’s going to repeat it twice, so you get two bites of the cherry. But I want you to try your absolute hardest, like you’ve just spent half an hour filling out this form, and this is the last hurdle. Okay? Ready?
Robin: I just hit the spacebar. What’s going on? Have you turned the sound off, guys? You haven’t turned the sound--?
JAWS: Start list box. Toggle start navigation. User account for Robin Christopherson.
Robin: Oh, what happened?
JAWS: Beyond Tellerrand dashboard ... software goes....
Robin: Is that up on the screen again?
Audience member: Yes, it is.
Robin: Ooh, okay. Ready?
Audience member: Yeah.
CAPTCHA: 9 ... 2, 3, 7, 9, 7 ... 8, O, H, 0 ... 9 ... once again, 9 ... 2, 3....
Robin: Come on! Your life depends on this!!
CAPTCHA: --9, 7 ... 8, O, H, 0 ... 9....
Robin: Okay, so five seconds to just compare with your neighbors, please, and I’ll ask you afterwards if anyone has got the same.
JAWS: Beyond Tellerrand ... 15....
Robin: Okay. That was quick. Does that last slide come up right?
Audience member: Yes.
Robin: Cool. I’m having ... issues here. So, okay, hands up, anyone that got the same as their neighbors. Now, I don’t even need our estimator in chief.
Robin: No hands?
John: No hands.
Robin: Nobody got it right, right? That isn’t even--even if two people had, who’s to say that it actually would have been the correct code? Okay.
Yeah, just to finish off then, hopefully, what you’ll take away is the power of technology, the power of technology to help every single person. If you’re going to do it with the broadest approach possible, thinking about inclusive design in everything that you do, then you’re going to take everyone with you. Thank you very much.
Marc: Thank you, Robin.