Transcription
LAURA KALBAG: There we go.
Excellent.
Yeah, everyone enjoy their cold showers this morning.
If you’re in Holiday Inn, you know what I mean.
That woke us up, that’s for sure.
So I was browsing this LGBTQ news site, as I often do.
And when I scrolled down the page, I was presented with this ad.
And it’s the usual kind of clickbait.
I mean, sponsored posts, sorry.
Dublin, say goodbye to expensive solar panels.
And then I got 25 celebs you didn’t realise are gay.
Number eight will surprise you.
Women: drink this before going to bed to help burn belly fat.
And this clickbait is kind of quite revealing.
Like just through its chosen topics.
I mean, it knows that I live near Dublin.
It knows the broad subject matter of the site that I’m on.
It suspects I’m a woman.
And well, who doesn’t feel targeted by ads about belly fat?
And this clickbait is provided to Pink News by Taboola.
And I used to work on blocking trackers with a privacy tool called Better Blocker.
Like your ad blocker, but it was a tracker blocker.
And we looked into Taboola.
And in our crawls of the most popular sites on the web, we found Taboola on around 5% of sites.
Now, Taboola, as it claims, helps promote your brand at the moments your audience is most receptive to new messages, products, and services.
And you can do that with their data-rich recommendations.
Ensure your brand reaches interested people by leveraging the massive amounts of user data powering the Taboola engine.
And they provide this handy graphic here showing some of the information that might be useful about a site’s visitor.
Like their device and their operating system.
Whether they’re in the market for a car, or some fashion, or an electric bike.
And whether they have interests like being a pet lover, or the environment, or entertainment, or science and technology.
And as I scroll down to Taboola’s privacy policy to see how they know this information, because I am one of those people, and what they intend to do with it, they seem to have a specific policy for third-party online advertising.
So, in I go.
And we automatically collect user information when users interact with our services that appear on our customers’ websites and digital properties.
And Taboola collects only pseudonymised data, which means we do not know who you are because we do not know or process your name or your email address or other identifiable data.
So, let’s debunk this for a second.
So, pseudonymised or anonymised data does not mean you’re unidentifiable.
Even though it is a claim that privacy policies have been hanging off for years.
As Bruce Schneier said years and years ago in Wired, it takes only a small named database, as in a database that does contain names, for someone to pry the anonymity off a much larger anonymous database.
They just need to compare some data points in each database, and they can work out who you are.
And a recent study, and it’s not the only study, into methods to reidentify individuals from so-called anonymised data sets, showed that 99.98% of Americans, it’s always Americans, would be correctly reidentified in any data set using 15 demographic attributes.
Attributes such as your age, your gender, your ethnicity, your postcode, your number of children, number of cars you own, your location, status updates, results you’ve submitted on your personality quiz.
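To make that concrete, here’s a minimal sketch of the kind of linkage attack Schneier describes: join a small named dataset to an “anonymous” one on shared quasi-identifiers. All the field names and types here are invented for illustration.

```typescript
// Hypothetical linkage attack: join a small named dataset with an
// "anonymous" one on shared quasi-identifiers. All names and fields
// are invented for illustration.

type NamedRecord = { name: string; postcode: string; age: number; gender: string };
type AnonRecord = { id: string; postcode: string; age: number; gender: string };

// An "anonymous" record is re-identified when its quasi-identifiers
// match exactly one entry in the named dataset.
function reidentify(anon: AnonRecord[], named: NamedRecord[]): Map<string, string> {
  const matches = new Map<string, string>();
  for (const a of anon) {
    const hits = named.filter(
      (n) => n.postcode === a.postcode && n.age === a.age && n.gender === a.gender
    );
    if (hits.length === 1) {
      matches.set(a.id, hits[0].name); // the anonymous profile now has a name
    }
  }
  return matches;
}
```

With just three quasi-identifiers, many combinations are already unique; with 15 attributes, as in the study above, almost everyone’s is.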
So, returning to Taboola’s privacy policy, I want to know how the interests Taboola infers compare to these kinds of demographic attributes.
So, they’re described by Taboola as data segments.
If you’ve ever done anything with analytics, this will be familiar to you.
So, a data segment is a grouping of users who share one or more attributes.
For example, travel enthusiasts.
We offer a number of data segments, both proprietary and from our data partners.
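If you’ve never worked with segments, here’s a rough sketch of the idea: a segment is essentially a saved predicate over inferred user attributes, and advertisers buy the audience it selects. Everything named here is hypothetical.

```typescript
// Hypothetical sketch: a "data segment" is a saved predicate over
// inferred user attributes. Attribute and segment names are invented.

type UserProfile = {
  device: string;
  inferredInterests: string[]; // e.g. guessed from browsing behaviour
  inMarketFor: string[];       // e.g. "car", "electric bike"
};

type Segment = { name: string; matches: (u: UserProfile) => boolean };

const travelEnthusiasts: Segment = {
  name: "Travel Enthusiasts",
  matches: (u) => u.inferredInterests.includes("travel"),
};

// Advertisers buy the audience that the predicate selects.
function audience(users: UserProfile[], segment: Segment): UserProfile[] {
  return users.filter(segment.matches);
}
```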
Now, kindly, they’ve provided a link to their data partners.
So, I go ahead and click there.
And two of these data partners stand out to me in particular.
That would be Acxiom and Oracle.
And that’s because Cracked Labs have done multiple reports into the personal data that corporations collect and combine and analyse and trade and use, and the data brokers behind that data.
Including two of the biggest ones, Oracle and Acxiom.
And according to Cracked Labs, Acxiom provides up to 3,000 attributes and scores on 700 million people in the US, Europe, and other regions.
And Oracle sorts people into thousands of categories and provides more than 30,000 attributes on two billion consumer profiles.
So, what are these attributes and categories?
Now, the text on here is pretty small.
So, I’m going to pull out some of them for you.
One of nearly 200 ethnic codes, political views, relationship status, income, details about banking, and insurance policies.
Your type of home, including if your home is a prison.
Likelihood of whether a person is planning to have a baby or planning to adopt a child.
Number and age of children.
Purchases, including whether a person bought pain relief products.
Whether a person is likely to have an interest in the Air Force, the Army, the Navy, the lottery and sweepstakes, or gay and lesbian movies.
Search history, including whether a person has searched about abortion, legalizing drugs, or gay marriage, or protests, or strikes, or boycotts, or riots.
Or the likelihood that a person is a social influencer or is more likely to be socially influenced themselves.
And Taboola says, it does not knowingly create segments that are based upon what we consider to be sensitive information.
Though helpfully, Taboola also includes a very detailed list of their apparently non-sensitive standard health-related segments.
Check these out.
I’ve picked out some that really jumped out at me as being quite personal.
Active health management, far below average.
Health, I have no confidence in the healthcare system.
Family and parenting.
Motherhood.
Artificial insemination.
Pretty clinical, that.
First sign of pain, I take medicine.
That’s a category.
Health and fitness.
Addiction.
Health and fitness disorders.
Panic and anxiety.
Personality.
Dealing with stress, bottled up.
Or emotional.
Or like to have a quick fix.
This isn’t exactly the kind of information you want marketers to use to sell to you.
It’s personal.
And these personality-style attributes were also used by Cambridge Analytica.
They collected them through a personality test app on Facebook that also harvested the profiles of the participants’ friends and their friends’ friends.
And users were scored on big five personality traits – your openness, conscientiousness, extroversion, agreeableness, neuroticism – and in exchange, 40% of them consented to access to their Facebook profiles.
And Cambridge Analytica itself claimed to be able to analyse huge amounts of consumer data and combine that with behavioural science to identify people who organisations can target with marketing material.
It’s just profiling.
That’s what they’re doing to you.
And Cambridge Analytica was a venture of SCL Elections whose expertise was in psychological operations, or PSYOPs, changing people’s minds not through persuasion but through informational dominance.
And it’s a set of techniques that includes rumour, disinformation and fake news, which is just targeting.
And that same SCL worked with Steve Bannon on the first Trump election campaign.
And as this neat graphic from The Guardian shows, SCL’s ventures, Cambridge Analytica and AggregateIQ, worked on multiple Brexit Leave campaigns too.
Thanks a lot, Cambridge Analytica.
So we as citizens could be manipulated by this profiling and this targeting.
And this is all the topic of a documentary you can find very easily on Netflix called The Great Hack.
I’d recommend it if you don’t like reading privacy policies as much as I do.
And it’s very accessible to your non-techie pals too.
But all of this means it’s not an exaggeration to say that tracking actually affects democracy.
And if we use tracking in what we’re building, then we have to consider its ethical implications.
And I could talk about this whole stuff in a lot more depth for longer, but it is very early in the morning, especially after the late night last night, and I’ve not got the time.
But if you want a decent read on it, I’d recommend The Age of Surveillance Capitalism by Shoshana Zuboff.
And it contains both the history of how tracking has been used in this way, but also the predicted future of what these massive complex surveillance systems can result in.
So Shoshana Zuboff coined the term surveillance capitalism and describes it in this book as unilaterally claiming human experience as free raw material for translation into behavioural data.
And although some of these data are applied to product or service improvement, you always hear that, don’t you?
We’re collecting it because it’s going to improve the product for you.
The rest are fabricated into prediction products that will anticipate what you will do now, soon, and later.
And if you look at the size of that book and the kind of academic language and think it’s not for you, there’s a really good podcast episode with Shoshana Zuboff on Adam Buxton’s podcast.
If you don’t already listen to that podcast, it’s really good.
But it’s also convenient, is what we all say.
Like, targeting and profiling, sure, that’s okay, because it makes technology just more convenient for the rest of us.
But convenient exploitative technology is a bit like fluffy handcuffs.
They might look like they’re leading to some fun, but they are still handcuffs, and you really want to have access to the key to that.
So how can we protect ourselves as individuals?
Let’s look at some of the things we could do.
Well, we could avoid logging in, if we can. For example, when we’re watching videos on YouTube.
You could still be tracked by fingerprinting, though, even if you haven’t logged in.
This is a combination of identifiers that your browser can provide that act as a fingerprint and can identify your browser via things like just its height and its width, the device you’re using, and ironically, whether you have the do not track setting enabled.
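As a rough illustration, here’s a minimal fingerprinting sketch using standard browser APIs; real fingerprinters combine dozens more signals, like installed fonts and canvas rendering.

```typescript
// Minimal fingerprinting sketch using standard browser APIs.
// Real fingerprinters combine many more signals than these.

async function fingerprint(): Promise<string> {
  const signals = [
    screen.width,                                      // display width
    screen.height,                                     // display height
    navigator.userAgent,                               // device and OS hints
    navigator.language,                                // locale
    Intl.DateTimeFormat().resolvedOptions().timeZone,  // timezone
    navigator.doNotTrack,                              // ironically, a signal itself
  ].join("|");

  // Hash the combined signals into a stable identifier.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```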
In 2015, Facebook even filed a patent saying it could identify people who might know each other because they appear in photos taken by the same camera.
How do they know this is the same camera?
The same lens scratches and dust.
Avoid providing your phone number to these services.
People often recommend using two-factor authentication, and I would be one of those people.
However, giving your phone number as one of the authenticating factors might not be the wisest idea.
So a study found that Facebook used those phone numbers for targeted advertising.
We added and verified a phone number for two-factor authentication to one of the author’s accounts, and the phone number became targetable by advertising after 22 days.
Not long after that, Twitter admitted that they did exactly the same thing.
This was back when it was Twitter.
When an advertiser uploaded their marketing list, we may have matched people on Twitter to their list based on their email or phone number the Twitter account holder provided for safety and security purposes.
You could always disallow cookies in your browser preferences, but we all know that if you block cookies, you tend to have the whole experience fall to pieces.
And usually completely silently as well.
So even if we only blocked cookies from third parties, if a site relies on anything third-party for anything persistent, whether it is a login, or your preferences, or a shopping basket, that will probably break.
Like I say, don’t use Gmail.
Your email not only contains all of your communication, but the receipts for everything you’ve bought, the confirmations of every event you’ve signed up for, every platform, newsletter, and service you’ve joined, and that’s not even mentioning when you log into Chrome.
From our own crawls of the web that we did with Better Blocker, we discovered Google had its tentacles in around 80 per cent of the popular web.
So you can think about the amount of information that Google can gather just from those sites.
If your friends and your family use Gmail, you’re still kind of a bit stuck there.
Likewise, your choices affect your friends and family.
And these are all the choices we can make once we’re on the web.
But we need to be aware of the other places we’re tracked as well.
So Google Nest knows everything about your home.
Amazon Ring and Alexa can hear everything you say and spy on your neighbours.
Hello Barbie knows your kid’s secrets.
A smart pacifier means you can put a chip in your baby.
Of course, it was only a matter of time before they introduced a smart menstrual cup and a smart tampon.
And let’s not forget the smart dildo.
We-Vibe, the smart dildo makers, were sued for tracking their users’ habits.
Have you ever wondered how many calories you’re burning during intercourse?
How many thrusts?
The speed of your thrusts?
The duration of your sessions and their frequency?
How many different positions you use in a period of a week, month, or year?
Then you want the i.Con smart condom.
Have you ever wanted to share all of that information with advertisers, insurers, your government, and who knows else?
Avoiding it all is just too much work, right?
I advocate for privacy.
I know quite a lot of stuff about the various tools we could use.
And I don’t have the time and the resources to protect myself all the time.
And that’s why it’s unfair to blame the victim for having their privacy eroded.
Not to mention that our concept of privacy is getting completely twisted by the same people who have an agenda to erode it.
So, one of the biggest culprits in attempting to redefine privacy is Facebook.
So, this is a Facebook ad that was showing on TV a while ago.
It shows a person undressing behind towels held up by her friends on the beach, alongside the Facebook post visibility options, Public, Friends, Only me, and Close friends, explaining how we each have different privacy preferences in life.
And it ends saying, there’s lots of ways to control your privacy settings on Facebook.
But it doesn’t mention that Friends, Only me, and Close friends should really read Friends, and Facebook.
Only me, and Facebook.
And Close friends, and Facebook.
Because you’re never really sharing something with only me on Facebook.
Meta has access to everything you share.
Privacy is the ability to choose what you want to share with others and what you want to keep to yourself.
And Facebook shouldn’t be trying to tell us otherwise.
Google also has an interesting interpretation of privacy. 15 years ago, Eric Schmidt, then CEO of Google, famously said, if you have something you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.
A lot of people will actually regurgitate lines like this.
Would they feel comfortable sharing their recent uncleared browser history?
I don’t think so.
So do we need to be smart about what we share publicly?
Yeah, sure.
Don’t go posting photos of your credit card or your address online.
Maybe it’s unwise to share a photo of you blackout drunk the week before a job interview.
But perhaps we should take responsibility if we’re going to say something awful about someone else online.
But this isn’t about what we knowingly share publicly.
Right now, these corporations are more than happy to blame us for our loss of privacy.
They say, well, we agreed to the terms and conditions.
We should read the privacy policies.
It’s all our fault.
And this in itself is the subject of a documentary made way back in 2013 called Terms and Conditions May Apply.
And it covers the ridiculous length and the legalese in the terms and conditions and how we couldn’t possibly read them for every service we use.
And a few years after that was released, an editorial board op-ed in the New York Times pointed out the flaws in privacy policies for consent.
The clicks that pass for consent are uninformed, non-negotiated, and offered in exchange for services that are often necessary for civic life.
And there are studies that speak to how difficult it is to understand these policies too.
Like two law professors analyzed the sign-in terms and conditions of 500 popular U.S. websites, including Google and Facebook, and found that more than 99% of them were unreadable, far exceeding the level most American adults can read at, but are still enforced.
And it is not informed consent if you can’t understand the terms.
And how can we even truly consent if we don’t know how our information can be used against us?
It’s also not really true consent if it’s not a real choice.
We’re not asked who should be allowed access to our information, or how much of that information, or how often, or for how long, or when.
Like this absolute fresh hell that used to be on Huffington Post.
They’ve got this because the GDPR requires that they ask for consent before tracking you.
I think everyone in Europe is pretty familiar with seeing these kinds of things.
But in this one, I was trying to work out how to say no.
So I went to manage my options.
Apparently that’s what we mean by no nowadays.
And we select manage.
And then I’m like, well, okay.
See how partners use your data?
Okay.
Show me that.
Okay.
Small white text on a blue background.
Yeah, we learned yesterday how accessible that is.
Okay.
That’s not very helpful.
I’ll select hide again.
Go back.
I’ll see and customize which partners can use your data.
Show.
List of third parties.
Oh, their privacy policies.
Great.
Most of those don’t have any reference to third party use and have no options for controlling the data.
Wonderful.
At no point was there a choice for me.
Nothing that resembles consent.
Just a lot of text saying the same thing over and over again until I just give up and say, okay, fine.
I’ll just, okay.
And of course, these interfaces have mostly been ruled noncompliant with the General Data Protection Regulation.
We’re just asked to give up everything or get nothing.
And that is not a real choice.
And it’s certainly not a real choice when the cost of not consenting is losing access to social, civil, and labor infrastructure.
So there was a paper a few years ago by Jan Fernback and Gwen Shaffer, Cell Phone Security and Social Capital.
And this paper examined the privacy tradeoffs that disproportionately affect mobile-mostly internet users, which is a huge number of internet users.
And what they found shows the cost of not giving consent.
So speaking about this paper, Gwen Shaffer explained: all individuals are vulnerable to security breaches, identity fraud, system errors, and hacking, but economically disadvantaged individuals who rely exclusively on their mobile phones to access the internet are disproportionately exploited.
Some focus group participants reported that in an effort to maintain data privacy, they modify their online activities in ways that harm personal relationships and force them to forego job opportunities.
The thing is that the technology that we use is our new everyday things.
It forms vital social, civil, and labor infrastructure.
And as largely helpless consumers, there’s often not much we can do to protect ourselves without a huge amount of time and money.
And when the technology you use is a lifeline to access, you’re impacted more severely by its exploitative factors.
So seven years ago, Dr Frances Ryan covered this in an article, The Missing Link: Why Disabled People Can’t Afford to #DeleteFacebook.
Because after the Cambridge Analytica scandal was uncovered, loads of people started saying, well, you should #DeleteFacebook, obviously.
But as Dr Ryan pointed out, I can’t help but wonder if only privileged people can afford to take a position of social media puritanism.
For many, particularly people from marginalised groups, social media is a lifeline, a bridge to a new community, a route to employment, a way to tackle isolation.
So like so many issues that we have with technology, what we’re dealing with is underlying social and systemic issues.
And as technologists, we can’t help ourselves trying to smooth over these problems with technology.
But technology can’t fix the issues of domination or oppression or discrimination.
And technology can actually make those issues a lot worse.
And at scale, we can and we do amplify and speed up systemic issues with technology.
So Mike Ananny wrote a really great article about tech platforms, and how we still seem to operate with the notion that online life is somehow a different life from our everyday existence.
It’s detached from it.
And tech platforms really take advantage of that notion by suggesting, well, if we don’t like technology, then we can just log out or log off or go do some other shit instead.
But people with this mindset often show how shallow they are when they say, well, if you don’t like the technology, you don’t have to use it.
The amount of times I’ve heard that after giving a talk like this.
But we can’t escape technology.
Platforms are societies of intertwined people and machines.
There is no such thing as online life versus real life.
And we give massive ground if we pretend that these companies are simply having an effect or an impact on some separate society.
Which brings me to another issue rife in technology today, which is technology colonialism.
So Anjuan Simmons wrote about this over a decade ago in Model View Culture, and he started with a little bit of history.
So colonial powers, and I mean, I’m very familiar with these, I don’t know about you, always saw themselves as superior to the native people, whose culture was rarely recognised or respected.
The colonizers saw economic value in foreign relations, but it was always viewed as a transaction based on inequality.
And then he compared it to what we do in technology.
And technology companies continue this with the same philosophy in how they present their own products.
These products are almost always designed by white men for a global audience, with little understanding of the diverse interests of end users.
And we have to reckon with our colonial history, and it speaks to us politically too, but today I’m really just talking about the tech industry and the tech community.
We have to reckon with the colonial way in which we’re creating technology.
So we don’t speak to users; instead we use analytics and data to design interfaces for people we never even try to speak to. Whether they even wanted our tech in the first place, we don’t ask them that.
We’ll assume we know best, because we are the experts and they are just the users.
We don’t have diverse teams.
We barely even try to involve people who have backgrounds that are different from our own, and we fetishize our tools.
We value the designer and the developer experience over that of the people using what we build.
And we can say that we have the right intentions, but it kind of doesn’t really mean that much.
Designer Tatiana Mac invokes the phrase, intent does not erase impact, to describe how we haphazardly design stuff in tech.
We not only have a responsibility to design more rights-respecting technology, but to consider the impact that our design has outside of its immediate interface.
But as the people that are advocating for change, we can’t exactly go around telling people to stop using technology unless there are real rights-respecting alternatives.
And that’s where we come in.
Because, as people who work in technology, and who create technology, we have far more power.
We can encourage some rights-respecting practices, and we can build alternatives.
So how do we build more rights-respecting technology?
Well, as an antidote to big tech, we build small technology.
Everyday tools for everyday people designed to increase human welfare and not corporate profits.
That kind of sounds like a bit of a lofty goal, but there are some practical ways that we can approach it.
First of all, we want to make it easy to use.
Because plenty of privacy-respecting tools exist out there for nerds to protect themselves.
I’m a nerd.
I use some of them.
We mustn’t make protecting ourselves a privilege that’s only available to those who have the knowledge or the time or the money.
And that’s why we need easy-to-use technology that’s functional, convenient, and reliable.
And functional includes accessible: if it’s not accessible, it’s not functional.
We want to make it inclusive.
We must ensure that people have equal rights and access to the tools we build, and the communities that build them, with a particular focus on including people from traditionally marginalised groups.
Current privacy-respecting tools, most of our tools, are really bad at this, at building inclusive technology, and often surround themselves with pretty toxic communities as well.
And don’t be colonial.
Easy to say, right?
But our teams must reflect the intended audience of our technology.
That’s kind of hard to do.
We can’t always build in teams like that.
Some of us work in tiny teams, or a team of one.
We’ve got to ensure that the people with different needs can take what we build and specialise it for their own needs.
We can build upon the best practices and shared experiences of other people, but we shouldn’t be making assumptions about what’s suitable for an audience that we’re just not a part of.
We should make it personal.
We’ve got to stop our infatuation with growth and greed, and focus on building personal technology for everyday people, not wasting all of our focus and experience and money on tools that are just for start-ups and enterprises.
And make it private by default.
I’m going to say it again, I’m trying to undo the Facebook curse of privacy-washing.
Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself.
And you can make your technology functional without personal information.
It’s totally possible.
You don’t need to know a person’s gender to provide them with services.
You don’t need analytics that segment people into stereotypes based on guesswork.
We can allow people to share their information for relevant functionality only with their explicit consent.
And that is consent where we tell a person how we will use their information, and when we will use it, and who will have access to it, and how long it will be stored.
I mean, that’s the GDPR, essentially.
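As a sketch of what that could look like in practice, here’s a hypothetical consent record capturing exactly those points: what, why, who, and for how long. The field names are mine, not from any particular framework.

```typescript
// Hypothetical sketch of recording explicit, informed consent:
// what is collected, why, who sees it, and for how long.
// Field names are invented for illustration.

interface ConsentRecord {
  purpose: string;          // how the information will be used
  dataCollected: string[];  // exactly which pieces of information
  recipients: string[];     // who will have access to it
  retentionDays: number;    // how long it will be stored
  givenAt: Date;            // when consent was given
  withdrawable: true;       // consent must be revocable at any time
}

const newsletterConsent: ConsentRecord = {
  purpose: "Send a monthly newsletter",
  dataCollected: ["email address"],
  recipients: ["our own mail server only"],
  retentionDays: 365,
  givenAt: new Date(),
  withdrawable: true,
};
```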
And write easy-to-understand privacy policies.
Don’t just copy and paste them from another website, because they probably copied and pasted them from another website.
It’s completely meaningless.
And ensure that it’s up-to-date with every time you update your technology as well.
And don’t use third-party consent frameworks.
Most of them aren’t GDPR-compliant.
They’re awful experiences for your visitors, and they’ll probably get you into legal trouble eventually.
Don’t use third-party services at all if you can avoid them.
I’m aware of what a big ask that is.
But they do present a risk to you and your users.
And if you do use them, make it your responsibility to know their terms and policies.
You can’t just say, oh, well, I’m using this.
I’m sure everyone else is using it.
It must be fine.
Inexcusable.
You need to know what information they’re collecting and what they’re doing with it.
And if you use third-party scripts or services or content delivery networks, or videos and images and fonts, self-host them wherever you can.
And ask your providers if it’s unclear whether you can self-host them.
A lot of the time, you can.
And it’s probably worth mentioning just a little bit of social media etiquette here.
If you know how, strip the tracking junk out of the URL before you share it.
Friends don’t let friends invade their privacy.
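If you want to automate that, here’s a small sketch of stripping common tracking parameters before sharing a link; the parameter list is illustrative, not exhaustive.

```typescript
// Small sketch: strip common tracking parameters from a URL before
// sharing it. The parameter list is illustrative, not exhaustive.

const TRACKING_PARAMS = ["fbclid", "gclid", "mc_eid", "igshid"];

function stripTracking(raw: string): string {
  const url = new URL(raw);
  for (const key of [...url.searchParams.keys()]) {
    if (TRACKING_PARAMS.includes(key) || key.startsWith("utm_")) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}

// stripTracking("https://example.com/post?utm_source=social&fbclid=abc123")
// → "https://example.com/post"
```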
And if you feel the need to have a presence on a social media or blogging platform, don’t make it the only option.
Post it to your own site first.
Then mirror those posts on the third-party platforms for the exposure that you desire.
And your basic blog is so much better than Medium.
Make it zero knowledge.
Zero knowledge tech has no knowledge at all of your information.
It might store a person’s information, but the people who make or host the tech cannot access that information, even if they wanted to.
And keep a person’s information on their device wherever possible.
And ensure any information synced to another device is end-to-end encrypted, so only that person has access to decrypt it.
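For instance, here’s a minimal sketch of that pattern using the standard Web Crypto API: the key is generated on the device and marked non-extractable, so whoever hosts the sync server only ever sees ciphertext.

```typescript
// Minimal sketch of end-to-end encryption before syncing, using the
// standard Web Crypto API. The key is generated on the device and is
// non-extractable, so the sync server only ever sees ciphertext.

async function makeDeviceKey(): Promise<CryptoKey> {
  return crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false, // non-extractable: the key cannot leave this device
    ["encrypt", "decrypt"]
  );
}

async function encryptForSync(plaintext: string, key: CryptoKey) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  return { iv, ciphertext }; // only ever send these, never the key
}
```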
And the old adage is true: the cloud is just somebody else’s computer.
Make it interoperable.
Interoperable systems can talk to each other using well-established protocols, which means that if a person does want to go elsewhere, they can take their information, export it, and take it to another technology or platform.
This is actually also a GDPR requirement.
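A sketch of the minimum version of that: give people an export of everything you hold about them, in a common machine-readable format. The shape of the user data here is invented for illustration.

```typescript
// Hypothetical sketch of data portability: export everything you hold
// about a person in a common, machine-readable format.

type UserData = {
  profile: Record<string, string>;
  posts: { createdAt: string; body: string }[];
};

function exportUserData(data: UserData): Blob {
  const payload = JSON.stringify(data, null, 2);
  return new Blob([payload], { type: "application/json" });
}

// Offer the Blob as a download, or serve the same JSON from a
// "download my data" endpoint.
```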
We also have to take care with how we share our technology and all of the hard work that we do.
How we sustain its existence.
Make it share alike.
Try to cultivate a healthy commons by using licences that allow others to build upon and contribute back to your work.
Don’t let the big tech companies use your hard free and open-source work if they’re not going to contribute their changes back.
And make it non-commercial.
And support sustainable business models.
We don’t need more tech companies who are aiming to fail fast or be sold as quickly as possible.
That’s not useful for anyone.
We need long-term sustainable technology.
And support not-for-profit technology.
If we’re building sustainable technology for everyday people, we need a compatible funding model.
Not venture capital.
Not equity-based investment.
It might feel difficult.
And impossible maybe.
And it probably is.
It’s just really difficult to build small technology, particularly often in the work situations that we’re currently in.
But there are a few steps that we can take to give ourselves the opportunity to build more rights-respecting technology.
So just use small technology as a criterion when you’re looking for your next project or your next job.
You don’t have to be at your current job forever.
Seek alternatives to the software that you use every day.
So switching.software is a really cool site.
Great list of resources provided by people who really care about ethical technology as well as ease of use.
And you could just find alternatives to something from Adobe or Amazon.
And if you can’t do it at work, do it at home.
If you have time, make a personal website.
Practice small technology on your own projects.
So one of the things you can do at home: as part of Small Technology Foundation’s work on the Small Web, my partner at the Foundation has created Kitten, which is a free and open framework and server where everyone can own their own place on the web.
It’s been a long time in the research and development, but it’s not long until not just developers, but everyday people will be able to set up their own place as easily as creating a Facebook or Instagram account.
I’ve been speaking at these kinds of events about tracking and privacy for probably around 13 years at this point.
Luckily, it is becoming a lot more mainstream, but I’ve been heckled by a loyal Google employee.
I’ve been called a tinfoil hat wearing ranter by someone that worked for Facebook.
And I saw this yesterday in Chris’s talk, and it really resonated with me, because I gave a talk really similar to this at an event in Berlin, where I ended up just sobbing in my hotel room at the end of the night, because it was totally exhausting being confronted by so many people who just questioned my right to even speak at that event in the first place, because they just didn’t like my constructive criticism.
Next day, I woke up to posts telling me I was a social justice warrior.
Yeah, so what?
And I’ve had people tell me that there just isn’t any other way, that I’m just trying to impede the natural progress of technology.
But as Rose Eveleth wrote in a really great article on Vox, the assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry.
They’re not.
We’re not that special.
And we can’t keep making poor excuses for bad practices.
We’ve got to divest ourselves of these exploitative organisations.
Consider, who are we financially supporting or even implicitly endorsing when we recommend their work and their products?
I’m sorry, I really don’t give a shit about cool stuff that’s coming out of exploitative organisations.
I don’t care about it.
I’m over it.
Our whole approach matters.
It’s not just about our philosophy or how we build technology, but it’s our approach about being part of communities that create technology.
And you might just be thinking, well, I’m just one small person.
Have you dared take a photo of me in front of this slide?
Skip past it.
We’re communities made up of many small individual persons.
We’re groups made up of all of these people that can do things together.
If we work together on this, we can have a huge impact.
And we have to remember that we are more than just the organisations that we work for.
If you work for a big corporation that does exploitative things, you probably didn’t make the decision to do that thing.
You probably weren’t even aware that they were doing it.
But I think the time has come where we can no longer unquestioningly defend our employers when they do that kind of thing, or our clients when they do that kind of thing.
We need to use our social capital to be the change we want to exist.
And I speak at events like this because I know that you’re the people that can make change.
And while I’ve been doing it for so long, I have actually seen change happen in that time.
Just slowly.
A lot of people come to me after I give talks and say, well, I don’t really know how to do that where I am right now.
I have quite limited power in what I do.
So what follows is a list of characters that I’ve created.
Because I recognise these in people that I’ve seen around me.
We all have different strengths and different weaknesses, positions, and powers, and we have to work out how to use those to our advantage.
So you could just be different.
We’ve got to get comfortable with being different from what everybody else is doing.
Be the person who creates alternatives.
Be a better designer.
Seek out the better solutions to create and fund useful and fun experiences for people without them losing their power.
And without them losing agency over their own data.
Be the advisor.
So do the research on inclusive and ethical technology and understand the space that we’re in and what’s around, and then you can use that to make recommendations to other people.
Make it harder for them to make some excuses.
You could be the advocate.
So you could speak up.
Marginalised people shouldn’t have to risk themselves to make change.
You can advocate for others, and you can advocate for people who are underrepresented.
Those who are not there to speak for themselves.
You could be the questioner.
You could question the norms.
Interrogate every decision that is made on your team.
Don’t be afraid to question the intent or the impact behind something.
Behind the decisions of other people.
Your questions don’t need to be hostile.
You don’t need to be a dick about it.
You can go a long way with just an innocent little question.
And try asking how a business makes its money.
Ask me how I make money, I don’t mind.
Ask why are we building it in this way.
And be the guardian.
So when advocacy isn’t getting you far enough, you can stand in the way of those making bad decisions.
Use your expertise to prevent those decisions from being made on your watch.
You could just be difficult.
Be the person who is known for always bringing up the issue.
Embrace the awkwardness that comes with that kind of power.
Call out the questionable behaviour, and don’t let anybody tell you that standing up for yourself or standing up for other people is unprofessional.
It’s a bit like when people want you to smile more.
Don’t let people tell you to be quiet.
Or that you’ll get things done if you’re just a bit nicer, if you just pretend a little bit more that it is okay.
And be the supporter.
If you’re not comfortable speaking up for yourself, and I do understand that a lot of people are in that position, be there for those people who do.
Raise awareness in your network and in your community.
Support the people that are doing the work.
Be vocal and generous in your support.
And folks working against the mainstream don’t have an easy time of it.
And a little love goes a long way.
Nothing is inevitable.
Big tech will tell us that this is the only way to build technology.
It isn’t.
So go forth.
Disrupt the disruptors.
Thank you.