March 22, 2021

#105: Your Smart Fridge Just Overturned Democracy


In Episode 105, Quinn & Brian ask: How safe is your data (and can your wine fridge take down democracy)?

Our guest is: Dr. Carissa Véliz, an Associate Professor at the Faculty of Philosophy and the Institute for Ethics in AI, and a Tutorial Fellow in Philosophy at Hertford College at Oxford University. 

If she sounds kind of awesome, that’s because she is.

So, what role do your phone, computer, wine fridge, smart toothbrush, and fun color-changing ceiling lights have to play in taking down (or saving) democracy, and maybe the planet too? Well, Dr. Véliz recently published "Privacy Is Power," in which she argues that the data economy is too dangerous to sustain. And although we may not be paying money for some of the incredible services built on top of our data, like Google Maps or Facebook, we are going to pay in the long run if we keep opting into the surveillance state.

Have feedback or questions? Tweet us, or send a message to questions@importantnotimportant.com


Important, Not Important is produced by Crate Media

Transcript

Quinn:

Welcome to Important, Not Important. My name is Quinn Emmett.

Brian:

And my name is Brian Colbert Kennedy.

Quinn:

And this is science for people who give a shit.

Brian:

Yeah, it is. We give you the tools that you need to fight for a better future for everyone, the context straight from the smartest people on earth and the action steps that you can take to support them.

Quinn:

Correct. Our guests are professors, scientists, doctors, nurses, journalists, authors, engineers, occasionally a politician trying to do good in the world, activists, educators, CEOs, investors, astronauts, even a reverend.

Brian:

I was going to add author, but you said author. This is your friendly reminder that you can send questions, thoughts, and feedback to us on Twitter by using our favorite Twitter handle @importantnotimp.

Quinn:

Just looking for a few more characters there, Twitter, thanks.

Brian:

Or email us at questions@importantnotimportant.com.

Quinn:

Correct.

Brian:

You can also join tens of thousands of other folks and subscribe to our free weekly newsletter at importantnotimportant.com.

Quinn:

That's right. Brian, this week's episode, what a great conversation, all the way from across the pond, despite fucking daylight savings trying to ruin it.

Brian:

Daylight savings.

Quinn:

We are asking ... And Brian, how safe is your data? And can your wine fridge take down a democracy?

Brian:

Not safe and yes, I think.

Quinn:

For sure. And not in the way you think.

Brian:

No, no, no, no. Our guest is Dr. Carissa Veliz, author of Privacy Is Power and a professor at the University of Oxford.

Quinn:

That's right. And she's super great, and boy, did she put up with a lot of you going, "Really?"

Brian:

All right. I feel like that's a bit ... you're a bit much.

Quinn:

No, but I mean it was probably applicable in this one though, right?

Brian:

There was just a lot of, like, whoas and whats-

Quinn:

When I was like-

Brian:

But they were genuine.

Quinn:

Tell us how bad it is. And she goes, "Okay, so when you wake up ..." And you're just like, "Oh, shit." That's where we're starting? Great.

Brian:

Honestly, get ready for this episode, everybody.

Quinn:

It's good. But she's optimistic. That's always great. When someone is that deep and she's optimistic and she's like, "No, this is how to fix things." That's my people. Amen.

Brian:

That is truly inspirational.

Quinn:

When was the last time you were optimistic about anything?

Brian:

I don't know, let's just listen to this episode.

Quinn:

Was it when I made you get up at 7:30 in the morning to do this? It didn't sound like it.

Brian:

Happy to do it.

Quinn:

Let's go talk to the good doctor.

Brian:

Let's do it.

Quinn:

All right. Our guest today is Dr. Carissa Veliz. And together we're talking about data privacy and how taking control of everything from your phone and your computer to your smart toothbrush could help save democracy, and maybe the planet too if we are lucky. Dr. Veliz, welcome.

Dr. Carissa Veliz:

Thank you so much for having me.

Quinn:

Absolutely, we feel very lucky to have you on the mic today.

Brian:

Yes, we do. Doctor, if you could just give everybody a brief intro of who you are and what you do.

Dr. Carissa Veliz:

Sure. My name is Carissa Veliz. And I'm an Associate Professor at the Institute for Ethics in AI at the University of Oxford. And I work in digital ethics, more specifically the ethics of AI. And in particular lately I've been researching a lot about privacy. I wrote this book called Privacy Is Power, which was selected as one of the best books of 2020 by The Economist. And there I argue that we should basically end the data economy, that buying and selling personal data is too dangerous, for democracy but also for individuals.

Brian:

Wow.

Quinn:

It seems like it's time for someone to make that argument-

Brian:

Yeah. Somebody's got to.

Quinn:

Considering where everything has gone, it's time for someone to say it. So we're very thankful you have ... I've read the book, I loved it, and I'm excited to dig into it today. Brian, if you could just remind everybody what we're all about here, and then we'll do this.

Brian:

Well, of course. Our whole goal with this show is to provide some quick context for our topic at hand today. Then we'll dig into action-oriented questions and steps that we can all take to support the mission.

Quinn:

Awesome. All right. So Doctor, before we get going, we'd like to start with one important question to set the tone for this fiasco. Instead of saying, tell us your entire life story, as incredible as I'm sure that is, we like to ask: Dr. Veliz, why are you vital to the survival of the species?

Dr. Carissa Veliz:

I think that everyone is more or less aware that there are lots of problems with tech, and we hear all these concerning stories about tech making us addicted and about tech having this and that consequence. But I don't think we realize that the source of all of those problems, or at least the vast majority of them, is the business model that is about buying and selling attention through the collection of personal data and the exploitation of personal data. And that is really the source of our problems. And therefore it's the beginning of the solution to those problems.

Quinn:

I think that that makes-

Brian:

Wow.

Dr. Carissa Veliz:

My contribution.

Quinn:

No, it makes a lot of sense. Again I'm so glad. It seems like there's a lot of folks who are standing up and doing it, but it feels like this book and everything you're doing around it, is going to help open a lot of eyes certainly. So I'm just going to give everybody just a little quick context here. Just some backstory of where we are and why we're talking about it.

Brian:

Then you can correct him Doctor.

Quinn:

Then you please just tell me all-

Dr. Carissa Veliz:

Okay, all right.

Quinn:

The different places I'm wrong.

Dr. Carissa Veliz:

Just call me Carissa please, that's fine.

Brian:

Sure, sorry.

Quinn:

Sure. All right. So look, I was an early computer user. How old am I, Brian? 38-

Brian:

So old.

Quinn:

Something like that. I got on Prodigy when it first came out, and then AOL, the dial-up. And my parents told me I had half an hour of either TV or computer at night, and instant messenger, and you're on email. Then I get to college and 9/11 happens. And it's terrible. Then after 9/11, there are things like the Patriot Act, and in the name of national and global security, we're asked on one level to give up a little more freedom so as to be safer. Of course, they went a lot further than we thought.

Quinn:

So we really started surrendering a lot of data. And we've been voluntarily surrendering our data since ... Again, if you're about my age, since you got one of those elusive Gmail invites back in the day, back when it was invite only, right? Or when Facebook finally reached your college campus, your university campus, and you started adding friends. And since we started hitting the like button when it was all over the internet, and using Google Maps and Chrome and YouTube. And we don't really pay for any of that stuff consciously, not with our cash but with our privacy. And you probably don't pay anything to someone like Google or Facebook or some of these new companies, unless you're someone who's buying those ads yourself.

Quinn:

And yet, those are two of the most profitable companies in the history of the planet. So we've put your data and my data and all of our data together. And that's a key in the hands of companies like these, whose entire bottom lines are dependent on selling that data, anonymized however they say it might be, for advertising, and to governments who claim to use it to protect us.

Quinn:

But to some who are standing up, like Dr. Veliz, this is not advertising, it's a form of surveillance. And it's amazing how powerful words can be, and that we use them in the way that they can be most effective, how we frame things. Storytelling is more important than ever right now. And that involves, just like the beginning of Carissa's book, talking about how you're being tracked from the moment you wake up, in your password vault app and everything you use, in every program you sign up for, and almost every device that you buy.

Quinn:

And it used to be just clicks. And it's understandable if you think it's still just clicks, but it's not anymore. It's everywhere, from your GPS to your IP address, to listening, to viewing. It's the Instagram ads for sweatpants when you've never consciously searched for sweatpants. These companies are, full stop, not interested in anything but growth. Not the truth, not your safety, but growth. And they live and they die by it. And that's their form of capitalism and that's the end of it. But it can be different. So that's what I want to dig into today: how to think about this, how to keep your data private, but more importantly, why you should, for yourself and your family, and for this whole experiment we call democracy and life on earth.

Quinn:

So Carissa, just this week, Karen Hao published some incredible reporting on Facebook's algorithms, a couple of years of work I believe. And she asks and answers the sort of holy grail of journalistic questions, right? What did they know, and when? And to put it briefly, it's a wonderful piece, and we'll put it in the show notes. The answers were: a lot, and they knew a while ago.

Quinn:

And specifically in this case, we're talking about Facebook and political and societal polarization. And what's most damning throughout all of her reporting is that it became very clear that anything that stood in the way of growth would go unfunded and unsupported. I'm curious, how does Ms. Hao's reporting line up with what you have discovered along the way in all of your work?

Dr. Carissa Veliz:

I agree with you that it's a fantastic article. I really recommend it. And it gives you the sense of how Facebook is not really a platform that's concerned with getting us to connect with each other or having a good experience. They are concerned just about growth. And the article makes very clear how they knew that their algorithms were worsening polarization, that they were leading to extremism, that they were having all these bad consequences and yet they still prioritize growth. And that is a very common pattern these days. I think Facebook is one of the worst companies around, but it's not the only one-

Quinn:

Sure.

Dr. Carissa Veliz:

That does that. And the idea is that the incentive is for you to be engaged for as long as possible, because the more you're engaged, the more you click, the more you see, the more you like, the more you scroll, the more personal data they can get out of you. The theory is that the more data they have, the better they can use it to target ads, and that's very profitable for them. And at the beginning of the data economy, it didn't seem like having this business model was going to be so toxic. I can imagine how people were excited about a new way of funding a project that seemed really interesting, for instance Google search.

Dr. Carissa Veliz:

So I think at the beginning of it all, they weren't the evil guys. They wanted to make a project, they just wanted to fund themselves. But it just turns out that the most engaging content is the worst possible content you can imagine. The most vitriolic, it's fake news, it's hate speech. It's really the worst of the worst that keeps us engaged and angry and wanting to come back to see what happens to the debates and so on.

Dr. Carissa Veliz:

And the fault lies with them realizing that this was the case, and not stopping, and not prioritizing other things. And even to this day, not looking for alternative business models. And that's something that really worries me. For instance, in the case of Google, there is some evidence that as early as 1998, they were aware that a search engine based on ads was going to be biased, because the clients are not the users. The clients are the people who buy the ads and that immediately inserts a [inaudible 00:12:12].

Quinn:

You said 1998?

Brian:

Yeah. So it has been too long.

Quinn:

Please.

Dr. Carissa Veliz:

Exactly, they published a paper about it actually in which they mentioned-

Brian:

Wow.

Quinn:

And the key then is that in a sort of sliding doors moment, they proceeded to build it despite the warning. We're fairly generalist here and cover a fair number of great and not-so-great topics on this show, but it really reminds me of smoking and the tobacco companies and what they knew. And it reminds me of the vehicle and power companies, the fossil fuel companies, who knew about emissions and climate change, and what we've discovered from their papers. It's not so difficult to track this down, but what remains is the fact that they did it anyway.

Dr. Carissa Veliz:

Yes. In the case of Google, they didn't want to do it at first. They wanted Google search to be something like an academic project. But after three years or so, they were desperate because they couldn't get enough funding, and investors were becoming impatient and basically threatening to pull out. So it was very urgent for Google to find a source of funding, and they said, let's just go with it.

Brian:

Wow.

Quinn:

Got it.

Brian:

So they didn't start evil like you said, but they kind of became evil.

Dr. Carissa Veliz:

Yeah. And that's one of the lessons that we should learn from this experience with tech, that it doesn't take evil people to build an evil system. So I don't particularly think that Zuckerberg for instance is somebody virtuous. But at the same time, really the system thrives because there are so many people who work for these companies without bad intentions, just kind of being one more nail in the system. And also because we cooperate with it. Now we know what it's causing, and it's our responsibility to rebel and say enough is enough. This is unacceptable.

Brian:

Could you take a step back, I guess, and take us through a day of being tracked, some examples of what-

Quinn:

Maybe some examples. I know you did such a good job at the beginning of the book. And it's mildly terrifying, but it shouldn't be to anybody at this point. I'm not sure Brian got a chance to read it, and I know your book isn't out in the US for a few more weeks. But you did such a good job in that first chapter, I wonder if you can give us some examples of the ways we are being tracked upon waking up, and then being at work, and things like that. You don't have to read from it, but just to paint the picture for people a little bit about how comprehensive this system is now.

Dr. Carissa Veliz:

Sure. So while you were sleeping, if your phone is on, more likely than not it's sending all the information you've collected throughout the day, your location and messages and all kinds of things, to all the apps that you have installed on your phone. And as soon as you wake up ... Most people, the first thing that they do is check their phones. And that sends a signal to hundreds of corporations that you've just woken up. Depending on who you slept with and whether they keep their phone next to them, which probably they do, they can infer whether you're having an affair, or whether you slept with your spouse, or whether you're single, for instance.

Dr. Carissa Veliz:

Then things get tracked, like whether you slept okay. If you wake up in the middle of the night and check your Facebook, that might be a signal that you're not sleeping very well. And that might be interesting information for insurance companies, for instance.

Brian:

Jesus.

Dr. Carissa Veliz:

Then you wake up, and I know it's really creepy. So you wake up, and I imagine you just turn on your TV to watch the news while you make yourself some breakfast. And if you have a Samsung smart TV, for instance, and there are others, those conversations, the conversations that you have with your family while having breakfast, will be recorded and sent to third-party companies. We don't know who they are, but there have been some studies showing that in just 15 minutes, information gets sent to about 700 companies.

Quinn:

Crazy. And we're just at breakfast.

Dr. Carissa Veliz:

We're just at breakfast. So then you go to work, and you hop in your car, and if you have a smart car, it's tracking all kinds of things. It's tracking not only where you go and whether you drive fast or not, which again is information that is important for insurance companies, but it's also tracking things like your weight. The seats in your car are tracking your weight. Again, insurance companies might be interested in knowing whether you're gaining weight or losing weight. Things like the music you listen to, bankers are buying this information to try to ascertain what your mood is. And maybe if you're listening to music that is correlated with depression, then there might be doubts about whether you'll be able to pay a loan back if you're feeling depressed. And it just goes on and on.

Dr. Carissa Veliz:

It's nonstop. And you go into a store, and there might be cameras that are using facial recognition. They might be trying to assess your emotions, exactly where you stopped in the store. And there's a technology that I find particularly creepy called audio beacons. So imagine that you were listening to the TV in the morning, and there was an ad there for, I don't know, a product you want, some shoes or something.

Brian:

Some very soft slippers maybe.

Dr. Carissa Veliz:

Exactly. Then you search for that on your phone, and then you decide to buy it in your local shop. When you go into the shop, there might be music in the background, and that music is going to emit these audio beacons that you can't hear, but your phone picks up. And that way, the company can know that you saw that ad in the morning on TV, because your phone was there. Then you looked up the product on your phone, and then you bought it at the store. And that gives them information as to how accurate and how effective their ads are. So that's in a sense-

Quinn:

Jesus Christ.

Dr. Carissa Veliz:

Tracking all your devices.

Brian:

So that last part was a joke, right? That is insane.

Dr. Carissa Veliz:

I wish. It's interesting.

Quinn:

I don't want to spend too much ... I feel like we could spend hours digging into each of the particular things you just went over. And I'm pretty sure we're only at like 10:00 AM at this point in your timeline. But for instance, for audio beacons, how widespread is the use of that technology?

Dr. Carissa Veliz:

It's unclear, because it's not like companies are broadcasting this information, so it just comes out when something bad happens or when there is some investigative journalism going on. So it's really unclear, but my sense is that it happens often enough that you probably encounter it at least, say, once a day, if the pandemic wasn't going on, if you were just going around your normal day and going in and out of shops and supermarkets and so on.

Brian:

Like most people in most countries?

Quinn:

Youth?

Brian:

Sure.

Dr. Carissa Veliz:

That's my guess, but it's just a guess.

Quinn:

Wow. Then speaking of the pandemic, one of the things was, again, to kind of finish painting this picture. A lot of folks, everybody had to ... Everyone who was able to, let me put it that way. Most white-collar folks were sent to work from home. And a lot of folks confronted the fact that they didn't have a lot of work-from-home gear. So maybe companies, if you work for a company, stepped up and said, "Okay, we'll get you a monitor, we'll get you a computer for home, or this or this." Can you talk just for a moment about what it means to put really any data into a work device?

Dr. Carissa Veliz:

That's quite risky. Technically, your employer is the owner of that device, and they have full access to it, at least legally and many times practically. So one of the things we have been witnessing during this pandemic is that there's more and more spyware being used for the purpose of surveilling workers. And that goes for companies like Amazon, which is infamous for doing this kind of thing, but also for other companies in which people are working from home and bosses want to have a sense of what they're doing.

Dr. Carissa Veliz:

And the problem is, of course, that there is no separation between the public and the private sphere when we use our laptops. We're using our phones and our laptops in our bedrooms. We are using email for personal and professional reasons, and it's really hard to separate those two. So once your employer gets access to that data, they really get much more data on you than they should have.

Quinn:

And you had a very specific example, something that I feel like I've always tried to be conscious of, especially when I was working for bigger companies. Which is friends and family, and others, using your work email, contacting you through your work email, and what that exposes.

Dr. Carissa Veliz:

Yes, this is very risky. So for instance, if you work at a university and somebody files a freedom of information request, all your emails could become public. And likewise, if you're using your work email and you work for a company, the company could access all your personal emails, and potentially use them against you.

Quinn:

Got it. And obviously folks, the caveat being here that this applies to everyone in your household, not just yourself, and the way those devices and all that information interact also can go a long way. So I want to talk a little bit, because one of the things ... I'm not a scientist, I'm not a data engineer, I'm not a data scientist. I'm not a doctor.

Brian:

Mm-mm (negative). No way.

Quinn:

Brian, not the time. We make very clear that we're not here to go deep into these niches, but at the same time, I was a liberal arts major; I can ask a lot of questions. I've studied and love to think about anthropology and sociology, all the reasons people do what they do and how economies and societies work or don't work. So I was really interested when you talked in the book about autonomy. For instance, one of the main ethical principles of medicine is autonomy, the ability for people to make their own decisions unimpeded by others. You may actually have a very difficult time with that in the US, at least on the medical front, as far as end of life and things like that.

Quinn:

But I think about it in the scope of data harvesting, for whatever reason. So I wonder, with the pure volume of data that we input every day to so many different places, and this is a little bit like the whole is-it-fate-or-free-will thing, but are we autonomous anymore, if we aren't in control of what we see and thus the inputs that go into our decision-making? I know that's a little more philosophical, but I'm really kind of curious how you feel about that.

Dr. Carissa Veliz:

Of course, autonomy is a gradient.

Quinn:

Sure, of course.

Dr. Carissa Veliz:

So more often than not, it's not that we're autonomous or we're not, but rather a question of how autonomous we are. There's a lot of concern that we are much less autonomous than we used to be, and that this influence is enough to, for instance, jeopardize, or at least put into question, things like free elections. There's always been a very tight relationship between knowledge and power, and this is something that we have known for a long time. So for instance, Francis Bacon famously argued that the more knowledge you have on someone, the more power you have over them.

Dr. Carissa Veliz:

So the more knowledge companies have about us, the more they can try to influence our behavior, predict what we're going to do, and act in consequence. But the reverse is also true. The more power you have, the more you can establish what counts as knowledge. So for instance, Google knows a lot about you, and they get to decide what counts as knowledge about you. So imagine that you may have, I don't know, a BA, but you might be categorized by mistake as somebody who didn't finish high school.

Dr. Carissa Veliz:

And there's no way you can correct that mistake. There's no way you can know about that mistake. And yet you will be treated as if you didn't finish high school. So the ads that you will see will not be for high-paying jobs; the kind of opportunities that you get will be according to that mistake. Another way in which autonomy is being jeopardized is that when we look at our screens, when we go and look at Facebook and Twitter, and even the news, we have the feeling that we are seeing a representation of the world out there, and that we are becoming informed in order to act according to reality.

Dr. Carissa Veliz:

But in fact, many times, even most of the time online, you are not seeing a representation of a world outside. You are seeing a reflection of yourself, or at least you are seeing what companies think about you. So if you see a very scandalous news story, it might not be because it's true, but because companies think that it's going to keep you totally engaged. And that's a real problem, because for autonomy to be real, for us to really be able to make informed decisions, we have to be well-informed. And if we can't get well-informed, because we're only seeing these mirrors of ourselves when we think we're seeing reality, that's a real problem.

Quinn:

And it also comes into the fact that we are unable to have real discussions about things, because there is no common ground. And I don't mean that in a sort of moralistic sense, but certainly that comes into it. But in an actual factual sense about what happened. We can go back to how, in court cases, eyewitness accounts are always a little tough, because we're finding out more and more about how our memory works and how it doesn't work, and how we can change things depending on the inputs later.

Quinn:

But that also comes into play with data, and it can come into play with DNA, and it can come into play where we have this inability to come to a common place, because we don't even have a fundamental set of first principles to start on, because those aren't being presented to us, or aren't being presented to us in an equitable fashion.

Quinn:

So I think about ... you talked a little bit about how knowledge is power. And I mean, America has so many examples of this. I mean, slaves, female slaves, would be raped and there would be children. And those children would be given the white slave owners' names, and their past would be erased. And that's made it so difficult for Black Americans now to trace their genealogy and where they came from and how they got here. And that was purposeful.

Quinn:

But I'm also looking at what's happening today and going forward, and looking at sort of the people ... it's a flywheel. The people with power have more autonomy than ever, if we're literally defining it, because they have more power over those levers that are defining what you see, as detailed in Ms. Hao's work and in your book, how they're constantly tinkering with these algorithms. But they also have power over the politicians through lobbying, because of the money they make. And that helps negate the idea of one person, one vote. And you've got Mark Zuckerberg, who has special shares that give him more or less unchecked power.

Quinn:

And yet we've got all these volunteers in the streets that are trying to retain American votes, trying to make sure people aren't stripped from voter rolls. Meanwhile, the bottom of the iceberg is just being chipped away, over and over and over. So I want to try to think about how we help citizens understand that it's not just their credit score that's being threatened, that our ability to change democracy and to change the course of history is also at stake here.

Dr. Carissa Veliz:

I think you're right. One way to think about it is just to look at one very common practice in data science, and that is the creation of an artificial society. So the idea is that the data scientist has an avatar of every one of us. So it's literally like having a little voodoo doll-

Quinn:

Hey cool.

Dr. Carissa Veliz:

Of you on their screen. Then they tinker with it. So they show it an ad, or they show it different kinds of content. Then they look at how it responds according to their own models. Then if it responds the way that they want people to respond, they try it on us, on actual flesh-and-bone people. The idea is that you could influence society to an extent that you can basically design it. It's like playing God a little bit. That's the idea; whether it works, and to what extent it works, is a different matter. But the fact that somebody's trying it should already be very, very alarming to us.

Dr. Carissa Veliz:

One of the reasons this is happening is because there's a huge asymmetry of knowledge. They know a lot about us, and we know very little about data scientists and companies. So one of the first steps that we have to take is to correct that power asymmetry and that knowledge asymmetry. On the one hand, we have to make sure that they know a lot less about us, and that's why it's very important to protect our privacy. And on the other hand, we have to learn a lot more about them, which is why it's very important to read about these topics and get interested and understand better what's going on.

Quinn:

And it's that last part that feels like it might actually be the most difficult. I mean, Google just fired, in a row, their top two AI ethics researchers, both of whom identify as women, and the next week they launched a device that uses radar to track your movements in bed. And it's like, next to the Facebook Portal, or whatever they call it, with the camera in your kitchen, I mean, just, no thank you. They said the data won't be used for advertising, which is the same thing they said when they bought Nest, that it wouldn't be used for data harvesting. But the ethical questions around these devices ... These companies don't ask for permission, they ask for forgiveness, and then they don't care if they get it.

Quinn:

And the questions about the data that they input and are on the receiving end of ... There are so many questions, and they're so numerous and so powerful and necessary, that we need women like that. We need people like that, especially inclusive voices, the world's best, thinking about this and working on it. But I almost feel at this point ... I mean, it's just like when people say the market will fix everything. These companies are not going to regulate themselves, so how can we trust them with this sort of research? It's like when an agribusiness does research on pesticides; their business models are so very clearly diametrically opposed to the work that needs to be done.

Quinn:

So as much as I feel like every company, from a startup to a market leader, needs some sort of chief liberal arts major question asker, I wonder how effective those people can actually be, and what your thoughts are on the ways we can actually get inside these companies and their practices and their black boxes. Do you see a way into that?

Dr. Carissa Veliz:

Absolutely, I'm actually very optimistic.

Quinn:

Thank God.

Brian:

That we need some of that around.

Quinn:

Jesus.

Dr. Carissa Veliz:

We just have to look at history. I mean, we've been able to regulate every big industry that has ever come our way, including the first ones that took us by surprise. It wasn't particularly easy to regulate the people who created the railroads, but we did it, and we regulated airplanes and cars and food and drugs. Our ancestors took care of those industries, and now it's our turn, and there's no reason why we're not going to be able to do this.

Dr. Carissa Veliz:

And I think those measures are already being designed. Right now there's so much debate about how to do it and when. Six years ago, the conversations that we're having now would have been unthinkable, just like conversations about regulating Rockefeller were unthinkable at the beginning, in his time, and then eventually it just happens.

Quinn:

I mean, I hope so. It's just that since those times, at least in the American system, especially in the American system, there's so much money in politics. Decisions like Citizens United have made a company equal a person equal a vote, and the dark money that's involved is just immeasurable. I mean, it's incredible how much is in there. Look at the folks who voted against our new Secretary of the Interior, the first Indigenous person to hold a secretary position in a cabinet, and how much oil and gas money they're getting. It just seems like the hurdles are, I guess, much higher than they used to be, but maybe my context is wrong there.

Dr. Carissa Veliz:

I think you're right. We have more obstacles in at least two ways. Yes, there's more money in politics. And on the other hand, these companies are much more opaque than railroads. It's easy to see railroads and what they're doing; it's not easy to see data. However, we have some things that are less difficult than they used to be. For one thing, we've already regulated big companies before, so it's not our first time, and that matters. And another thing that I'm quite optimistic about is that many countries want to regulate these companies. It's not only the US; it's Europe, it's the UK, it's Japan, it's Australia, it's New Zealand. And we can use that. In the past, we were alone in regulating our companies, but now there is a global society that is very, very much aware that we need regulation. And we should use that in our favor.

Quinn:

That's an interesting point, that these companies, so many of which are US based are so widespread now and such a part of global life, that everyone is on board with regulating them in some way at this point, right? It's not just US railroads.

Dr. Carissa Veliz:

Exactly. And we will need that, because data kind of flows across boundaries and across borders. So we will need a lot of diplomacy and cooperation. It is the time to revive those old alliances that we used to have between Europe and the US.

Brian:

[inaudible 00:35:28] I'm on board. Carissa, one thing I wanted to, I don't know, try to wrap my head around is how some of the biggest companies of our time build products that are free to users, and how those companies represent an enormous part of our stock market and our economy. It's because they've become such otherwise indispensable pieces of our everyday lives, but also because being free means growth has been just so easy. And because our outdated antitrust laws don't account for that growth, they've just grown and purchased competitors and become unavoidable.

Brian:

And a lot of that growth depends on our networks, on the people we are connected to and with, because data isn't harvested in isolation. Every single time you share a picture of yourself, or your family, or your kids, or your friends, which for most people on Instagram is a thousand times a day, it's used to train facial recognition algorithms. How are we exposing friends and family when we share our data? Because I'm going to need to know how to apologize to everybody in my life.

Dr. Carissa Veliz:

One of the things I argue in the book is that it's a mistake to think about privacy as an individual thing. Yes, it is about personal data, but it's also about collective data. Every time you expose your location, you're exposing things about your neighbors, the people you live with, the people you work with. Every time you expose your genetic data, you're exposing your parents, your siblings, your kids, your very distant kin. So every time you expose yourself, you expose other people as well. And that can mean a whole host of things. It can mean, for instance... there have been cases in which genetic data donated by people has been used to deport someone, or... yeah, in very questionable ways.

Dr. Carissa Veliz:

It can mean that somebody gets denied life insurance because of a genetic test that they didn't even take themselves. It can have all kinds of consequences. And many times you don't know what vulnerabilities other people have. You might take a picture with somebody in the background and publish it online, and it turns out that that person is running away from an abuser, for instance. And you have no way of knowing that. So we should always err on the side of caution.

Brian:

And throw our phones in the trash can, got it.

Dr. Carissa Veliz:

That would be nice. At the very least, we shouldn't upload pictures without asking for consent.

Quinn:

So despite all of that, I can still understand how folks have a hard time. And sometimes they don't want to stop, if they enjoy putting their kids on Instagram to share with friends and family, right? Especially during a pandemic, when we're disconnected. But at the same time, I can understand, because it still seems a little futuristic somehow to go from "I bought one of those cool smart fridges that shows you, on the inside, what food you're out of" to Cambridge Analytica and Myanmar and "the state is using facial recognition against me as I protest an election, there goes democracy."

Quinn:

It's one thing to talk about how terrible it is when an algorithm doesn't show the same well-paying job, like you were saying, to Black Americans as to whites, or how parole apps get misused by their creators. But there are broader, more dangerous consequences of this facade of privacy. I think about... and again, they've done nothing to fix these things... Facebook feeds, YouTube recommendations, and the used-to-be-cool-seeming Google search suggestions that pop up as you're typing. They take information you've given historically, and from the people around you and where you are, and they use it to anticipate the next logical thing you might be looking for. And sometimes that can result in some pretty dark stuff.

Quinn:

I mean, we've seen that happen with YouTube and Facebook groups, from QAnon, and then all of a sudden it's misinformation or disinformation about elections, or the pandemic, your health, protections and treatments and vaccines. I guess I'm curious, and I know it's different everywhere, because everywhere has different laws around this stuff, but how have we not made it imperative that users' health and lives are protected? I guess, how did we let it get this far?

Dr. Carissa Veliz:

So partly it was just not realizing what was happening. I think Brian mentioned this, or maybe it was you, Quinn, a few moments ago: our litmus test for whether something is a monopoly failed us. Typically, our litmus test is, if a company can increase its prices and not lose customers, that's kind of dodgy and we should look into it. But of course these companies' products are supposedly free, so that doesn't apply. The general principle should be: when a company creates some kind of abusive conditions, whether it's higher prices or giving up too much data or something else, and they don't lose customers, that's what's suspicious and we should look into it.

Dr. Carissa Veliz:

So it was partly just naivety, not realizing what was going on and not wanting to realize. And partly it was that governments wanted to spy on people, and figured that they could collect all that data and it would help them keep citizens safe. Now it turns out that that kind of data doesn't actually help prevent things like terrorism, and they're realizing that, on the other hand, it's a huge risk for national security.

Dr. Carissa Veliz:

Here are a couple of examples just to illustrate this point. Maybe a year ago, the New York Times published an article in which two journalists who were not very tech savvy managed to find out the location of the President of the United States. And this was through data that is easily accessible by buying it from data brokers. They figured out who had the phone of a Secret Service agent, because they just correlated the president's schedule with this phone, and then they figured out where the president was. If the President of the United States is not safe from a couple of non-tech-savvy journalists, nobody is. And the whole country is unsafe.
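
[Editor's note: the correlation attack Dr. Véliz describes can be sketched in a few lines of Python. All of the data below is invented for illustration; real broker datasets hold millions of pings per day.]

```python
# Toy re-identification by correlation: given a public schedule of
# (hour, place) appearances and a set of "anonymous" device location
# pings bought from a broker, find the device that tracks the schedule.

schedule = [(9, "white_house"), (12, "capitol"), (15, "andrews_afb")]

pings = {
    "device_a": [(9, "white_house"), (12, "capitol"), (15, "andrews_afb")],
    "device_b": [(9, "dupont_circle"), (12, "georgetown"), (15, "capitol")],
    "device_c": [(9, "white_house"), (12, "navy_yard"), (15, "bethesda")],
}

def match_score(device_pings, schedule):
    """Count how many scheduled (hour, place) pairs the device's pings hit."""
    return len(set(device_pings) & set(schedule))

# The device whose movements line up with the public schedule is almost
# certainly carried by someone travelling with the scheduled person,
# e.g. a Secret Service agent.
best = max(pings, key=lambda d: match_score(pings[d], schedule))
print(best)  # device_a
```

The point of the sketch is that no hacking is required: a public schedule plus purchasable location data is enough to single out one device, and from there, one person.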

Quinn:

Sure.

Dr. Carissa Veliz:

Another example: hackers only need to hack about 10% of electrical appliances and turn them on at the same time to bring down the national grid. Just imagine, in the context of the pandemic, how harmful that could be. The internet is insecure in large part to allow for the collection of personal data, and now governments have a motivation to change that, because they're realizing it's actually very, very dangerous for national security.

Quinn:

Well, it almost seems like... And we try to be increasingly candid and transparent, and where we can, proactive about this. We cover a fair load of things, from climate change to health care to things like this. And when you see stats like Black moms in Illinois are six times more likely to die in or after childbirth than white moms, you go, if it were twice as bad, that would be a disaster. But when it's six times as bad, you go, "That's because the system is designed that way." And it feels like, and please correct me if I'm wrong, that's what you're saying about up to this point: the system was designed to allow for this harvesting, because the government wouldn't regulate these companies because it was doing it as well, and it saw benefit in that. Is that correct, or am I off base?

Dr. Carissa Veliz:

Yeah, that's correct. We built the internet to be really insecure so we could collect data, and the government had an interest in that, so they didn't do anything about it. And now we're realizing that our cybersecurity standards are so disastrous that we're in huge danger at the moment.

Quinn:

I mean, I feel like we haven't even... I mean, look, again, you guys had Brexit and a bunch of other issues over there. Our Capitol got stormed a couple months ago, but-

Dr. Carissa Veliz:

That's scary.

Quinn:

I don't remember if it was the same week or the week before, but we had the, is it the SolarWinds hack that apparently was absolutely catastrophic throughout the federal government to the point where they said, it's going to take years to even figure out what was lost. And I feel like that's just buried in the news.

Dr. Carissa Veliz:

It's true. And it's so complex and so unclear that we don't know what the consequences are. We don't know if the Russians still have access to many of those systems. We don't know whether they changed anything. It's really, really quite scary.

Brian:

How could we be that unprepared? That is so wild to me.

Dr. Carissa Veliz:

It is wild. I don't have an answer to that.

Brian:

No. Sorry I was just sort of asking the why?

Quinn:

No, I think it's a complicated one. I think it was a little bit by design. I think it's a lot of antiquated systems. I think it's a lot of budgets for that sort of thing being held back. Look, if you're an IT person, installing stuff for hundreds, if not thousands, of new people is a pain. So you always have to wait on the newest technology, because you have to make sure it doesn't break anything, so other things languish behind. And then we always see the laughable headlines on CNN or wherever that the most popular password is one-two-three-four-five, or whatever it is.

Brian:

Yes.

Quinn:

Everyone does that everywhere. So it seems like a very comprehensive problem. But at the same time, Carissa, like you were saying, we have no idea what we don't know yet about the implications of what happened there. And that's going to keep happening as we try to electrify our grid and bring things online, and make smart meters more available so that people can use their solar panels, and their batteries, and their electric cars to feed back into the grid and get paid for it. I believe it was in your book, or an article I read by you, where you talk about how insecure smart meters are. It seems like there are just enormous trade-offs as we go forward.

Dr. Carissa Veliz:

Another reason is that, at the moment, companies don't have enough of an incentive to have good cybersecurity, because it's very expensive. It's invisible, so we as customers don't appreciate it, because we don't know how to. We roughly know what a good door looks like; we have no idea what good cybersecurity looks like. And if something goes wrong, usually the company gets a free pass, and the ones who bear the brunt of it are citizens or the government. That's why it's something that has to be regulated by the government, because companies will never do this on their own. They have no reason to.

Brian:

Everything sounds great.

Quinn:

She says she's optimistic Brian.

Brian:

I know I'm so sorry.

Quinn:

Come on.

Brian:

I got to get on board with your optimism, sorry about that.

Quinn:

Let's do it through action here.

Brian:

Exactly. Carissa, our entire bread and butter on this show is helping people understand how to think for themselves about what is happening in science and what exactly they can do about it. We call those our action steps here, and we really work to make sure they're reputable and effective. For every topic like this one, we try to build a portfolio of action steps people can take to improve the world, but also for themselves and their families, and their investments in companies, to really create systemic change. So let's get into that, and let's start at the personal level: what are the most effective personal privacy action steps, moves that people can take right now?

Dr. Carissa Veliz:

There are many steps. One is just choosing privacy-friendly alternatives. So instead of using WhatsApp, use Signal; instead of using Gmail, use something like ProtonMail; instead of using Google Search, use DuckDuckGo. There are all kinds of privacy-friendly alternatives out there. Another thing people can do, and it's very important, is to choose the right devices. If you choose a device made by a company that earns its money mostly through personal data, you know that your data is going to be collected; there's a conflict of interest there. So choose a device from a company that earns its money through selling hardware, not through selling your data.

Dr. Carissa Veliz:

Another important thing to do is to contact your political representatives. Tell them that you are worried about privacy, ask them what they're doing to protect it, pressure them; that has a huge impact. Something else you can do is write to companies and ask them for your data, and ask them to delete your data. Even if you fail, creating that paper trail is super important, because once there's a problem, it shows regulators that you weren't consenting and that the company didn't act like it should have.
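
[Editor's note: the letter-writing step above can even be semi-automated. Here is a toy sketch; the template wording, company, and name are purely illustrative, not legal language, and real requests should follow your jurisdiction's access/erasure process.]

```python
# Toy helper for the "paper trail" advice: generate a dated data-access /
# deletion request you can send to a company and keep a dated copy of.
from datetime import date

TEMPLATE = """\
Date: {today}
To: {company} privacy team

I request a copy of all personal data you hold about me, and I request
that you delete that data once it has been provided. Please confirm in
writing.

Regards,
{name}
"""

def deletion_request(company: str, name: str) -> str:
    """Fill in the template with today's date, so each letter is self-dating."""
    return TEMPLATE.format(today=date.today().isoformat(),
                           company=company, name=name)

letter = deletion_request("ExampleCorp", "Jane Doe")
print(letter)
```

Saving each generated letter (and any reply) is the paper trail itself: a dated record that you asked and the company either complied or didn't.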

Quinn:

Interesting. That's super compelling. Vis-à-vis calling representatives, we try to help make it specific for folks. Our listeners are about 70% US, and the rest is split between Europe, Asia, and South Asia. Are there specific bills, or things like that, in play that we should be speaking about?

Dr. Carissa Veliz:

So at the moment there are a few being discussed and it's unclear what's going to happen in the near future. But in general, in the US for instance, now is the time to pressure your political representatives to try to come up with a federal privacy law. That is the challenge right now.

Quinn:

Maybe we'll talk offline if there are any other folks you'd recommend we talk to about that, who are in it and working on it.

Brian:

That's pretty.

Quinn:

Do you have any... and it's totally understandable if you don't want to go on the record about this... but do you have specific recommendations, like you were saying, for companies that are not in the business of selling your stuff versus... Basically, are there any specific good guys or bad guys as far as this goes? I hate to simplify it that much, but I'm trying to help people think through this. Because when they walk into a Target or a Walmart and things are half off and the TV looks great, it's hard to persuade them to do otherwise if they can save four or five hundred bucks.

Dr. Carissa Veliz:

Well, one way to think about it... I mean, just do a search. The bad guys are pretty obvious, and the good guys-

Brian:

Search for bad guys? On it.

Dr. Carissa Veliz:

Yeah. "Bad guys tech," I'm sure the list will come up. One way to think about it is that you might save $400 now, but you're going to pay for it later. If you get discriminated against, if you get denied a loan, if you get denied a job... so you're not actually saving money. It's more of an illusion.

Quinn:

Could you tell the story from your book about the smart TVs, and the language behind them as far as what information they're collecting, or how, when you think they're off, they're not off?

Dr. Carissa Veliz:

So the privacy policy of the Samsung TV was pretty chilling. It says something to the effect that you shouldn't have private conversations in front of your TV if you don't want them sent to third parties, which of course is ridiculous, because your TV is usually in the middle of your house. Was that the one you had in mind?

Quinn:

Yeah. I mean, we hear these stories about, "Oh, Alexa was listening," and it turns out people were manually listening so they could train it. And they were listening to private conversations, or they were listening to sex or fights or domestic violence, whatever it might be. Then they'll backtrack and say, "Oh, we're not going to let those people do it anymore, it'll be automated," or only certain people will have privileges. I believe it even happened with Apple, with their smart speakers.

Quinn:

But like you said, not everyone has those. Everyone has a TV, and the smart ones, Samsung or Vizio or whatever it might be, are in the middle of every home; the percentage of people who have a TV is pretty high. So it seems like, should we just not connect those to the internet?

Dr. Carissa Veliz:

I certainly don't. And one good idea is to buy stupid objects instead of smart ones. So I go into stores and I just ask: I want something that's not connected to the internet at all.

Brian:

What's your oldest worst thing, that's what I want.

Quinn:

Basically what's your stupidest object?

Dr. Carissa Veliz:

Exactly.

Quinn:

Like are you on a tin can with a string, calling us on that? [crosstalk 00:53:03].

Brian:

I love this, this is incredible.

Quinn:

I think it makes sense. Let me ask you this: are there places where these companies are doing good with our data? Are there examples out there of places that have used it in an effective way, either individually or for the public good, where the trade-offs are worth it? Because I can't imagine there are situations where there aren't trade-offs. But you seem so optimistic about being able to control them and regulate them and make this work, to give us back more of our data and look more at what these companies can do.

Quinn:

Is there anyone out there? Basically, we just had this conversation with two incredible guys who are working on the new global.health project. It's supported by Johns Hopkins and Oxford and Harvard and Northeastern, and google.org is paying for a bunch of it. They're trying to build as many COVID records as they can, to try to understand how this thing really works across demographics and DNA, and what we can do going forward for COVID, but also for future infectious disease work.

Quinn:

Because we've done kind of a poor job with that, and you want to cheer for a project like that. And I do, but I also asked them: how do you solve for trust from the beginning, given how people feel and given that so many of these companies and projects aren't trustworthy? Where are the good examples, and are there any?

Dr. Carissa Veliz:

Well, a few good examples are companies like DuckDuckGo, which manages to offer really good search, and they don't collect your data and they don't use it for marketing. Another really good example is Signal, but Signal is a nonprofit, so they don't collect your data. In terms of cases in which companies do collect your data, but it's justified and they use it in a good way, it's very hard to know, because at the moment the data economy is all over the place. So whenever you give your data, there's a risk that it's going to end up in the wrong hands.

Dr. Carissa Veliz:

So I think it would be much easier for companies to make good use of data if we banned selling data, and if we banned personalized content, and if we banned many of the practices that we have today. If it weren't profitable to collect your personal data, then companies would only collect it when they need it. And they would use it for good purposes, because otherwise it's a liability to them.

Quinn:

That seems to make sense. It's the selling mechanism that's really the crux of this thing. It seems like, I mean, Apple turning on the new disclaimers they made every app put out with iOS 14, and Facebook taking out New York Times ads and putting out their own disclaimer saying, "Well, this is going to hurt small businesses." They're not going to give this thing up willingly.

Dr. Carissa Veliz:

No, no, no. And as long as we allow personal data to be bought and sold, we create the wrong incentives, at least two of them. We create the incentive for companies to collect more data than they need, which is hugely dangerous for national security and for individuals. And we create the incentive for them to sell it to the highest bidder, who is often not the institution or person with the best interests at heart.

Quinn:

All right, that clarifies things.

Brian:

Well, Carissa, it's been... we've probably kept you for pretty close to as long as you can be here. So first of all, thank you so very much; I'm glad we were able to make this happen today. Sorry about the confusion.

Quinn:

It's Brian's fault.

Brian:

Please just blame everything on me, willingly-

Quinn:

Always Brian, he's a big supporter.

Dr. Carissa Veliz:

No, sorry about that.

Quinn:

He's a big fan of daylight savings and it's not pretty, it's not great.

Brian:

But we would love to ask you a few little last questions we ask everybody. Does that sound okay?

Dr. Carissa Veliz:

Sure.

Quinn:

Carissa when was the first time in your life when you realized you had the power of change or the power to do something meaningful?

Dr. Carissa Veliz:

Maybe it was when I was teaching kids, and especially adults, to read. That seemed incredibly powerful. To learn that an adult wasn't able to take the bus because they couldn't read the signs was just such an eye-opener for me. And to be able to teach a person to read seemed kind of magical.

Brian:

That's incredible.

Quinn:

That's awesome. I love that. Doctor, who is someone in your life that's positively impacted your work in the past six months?

Dr. Carissa Veliz:

In the past six months?

Quinn:

You can't say Brian. So just skip over that one.

Brian:

I know, exactly. Right at the top of your list there.

Quinn:

You got to somebody else.

Brian:

Yeah.

Dr. Carissa Veliz:

I really enjoyed Christopher Wylie's book, Mindfuck. I thought that was very interesting, and it explains in a lot of detail what happened with Cambridge Analytica. Another book I highly recommend about these topics, which I read more or less recently, is Edward Snowden's autobiography.

Quinn:

Boy, did he set off quite the chain of events.

Dr. Carissa Veliz:

Yeah.

Quinn:

There's so much we wouldn't know without that. I've had it on my list for a while. Do you feel like it's still pretty relevant to what we're going through now, understanding what happened then?

Dr. Carissa Veliz:

For sure. Even though Cambridge Analytica doesn't exist anymore, there are about 300 firms that are very similar, that are still doing what they do.

Quinn:

Wow. That's awesome. Thank you. Brian, bring it on home.

Brian:

Bring it on home. Hey, this is a really great one. I love this, especially when somebody is under such high pressure, with so much to think about all the time for everybody else. What do you do for yourself when you feel overwhelmed? What is your self-care?

Dr. Carissa Veliz:

I go out to walk and I read novels.

Quinn:

That's pretty great.

Brian:

Those are two great ones, holy cow. And now, I think you just answered this unless there's another one you want to throw on, but we love this question. We have a little collection of books online that everybody can access to purchase, books that our guests recommend, and you obviously just mentioned two. Anything else you've read recently that's, I don't know, changed your thinking in some way, or was about a topic you hadn't considered before?

Dr. Carissa Veliz:

I more or less recently read Maya Angelou's I Know Why the Caged Bird Sings. It's not about tech at all.

Brian:

That's okay.

Dr. Carissa Veliz:

But it's one of those books that really left a mark.

Quinn:

That's awesome, that's an incredible recommendation. Doctor, where can our listeners follow you online?

Dr. Carissa Veliz:

Mostly on Twitter. My Twitter is just my name Carissa Veliz and I rant.

Quinn:

Perfect. That's awesome. Well, doctor we can't thank you enough for your time today and for all the work you're putting in to make this something that is understood better and at the same time actionable so that we can start to turn the ship around before it gets even more out of hand. We really do appreciate it. So thank you so much for coming on and for all that you do.

Dr. Carissa Veliz:

My pleasure, thank you so much for the invitation.

Quinn:

Thanks to our incredible guest today. And thanks to all of you for tuning in. We hope this episode has made your commute or awesome workout, or dish washing, or fucking dog walking late at night, that much more pleasant. As a reminder, please subscribe to our free email newsletter at importantnotimportant.com. It is all the news most vital to our survival as a species.

Brian:

And you can follow us all over the internet, you can find us on Twitter @importantnotimp.

Quinn:

That's just so weird.

Brian:

Also on Facebook and Instagram at Important, Not Important, Pinterest and Tumblr, the same thing. So check us out, follow us, share us, like us, you know the deal. And please subscribe to our show wherever you listen to things like this. And if you're really fucking awesome, rate us on Apple podcast. Keep the lights on, thanks.

Quinn:

Please.

Brian:

And you can find the show notes from today, right in your little podcast player and at our website importantnotimportant.com.

Quinn:

Thanks to the very awesome Tim Blane for our genuine music, to all of you for listening. And finally, most importantly to our moms for making us. Have a great day.

Brian:

Thanks guys.