SCIENCE FOR PEOPLE WHO GIVE A SHIT
Nov. 27, 2023

The Creepy Reality of Consumer Tech (Privacy Not Included)

What are the best holiday gifts that aren't privacy nightmares?

That's today's big question, and my guest is Jen Caltrider.

Jen is the lead researcher for Mozilla's Privacy Not Included program. Since 2017, Mozilla has published 15 editions of Privacy Not Included, their Consumer Tech Buyer's Guide.

They've reviewed over 500 gadgets, apps, cars now, and more, assessing their security features, what data they collect from you and your loved ones, and who they share that data with or sell it to. They have even built the first annual Consumer Creep-o-Meter, distilling what's good, what's bad, and what's just plain creepy in the world of consumer tech.

While I love new tech and I own quite a bit of it, I have become pretty obsessed with at least understanding what I'm getting myself into and getting my kids into.

Part of that's for myself and my family, but obviously, so I can share it with you all as well.

-----------

Have feedback or questions? Tweet us, or send a message to questions@importantnotimportant.com

New here? Get started with our fan favorite episodes at podcast.importantnotimportant.com.

-----------


Advertise with us: importantnotimportant.com/c/sponsors

Mentioned in this episode:

Support Our Work

Transcript

Quinn: [00:00:00] What are the best holiday gifts that aren't privacy nightmares? That's today's big question, and my guest, my timely guest, is Jen Caltrider. Jen is the lead researcher for Mozilla's Privacy Not Included program, where since 2017, 10,000 years ago, Mozilla has published 15 editions of Privacy Not Included, their Consumer Tech Buyer's Guide.

They've reviewed over 500 gadgets, apps, cars now, and more, assessing their security features, what data they collect from you and your loved ones, and who they share that data with or sell it to. They've reported recently on the alarming lack of privacy in cars, we will talk about that, which basically track your entire life, including often storing your text messages, which is completely insane to me.

They have even built the first annual Consumer Creep-o-Meter, distilling what's good, what's bad, and what's just plain creepy in the world of consumer tech. And look, it is the holiday season, [00:01:00] sure, but these conversations are meant to be fairly evergreen, and obviously, it's never a bad time to ask: is this robot dog, or this kid's smartwatch, listening to everything I say, recording it, storing it, not securing it, selling it, along with my heartbeat and location data?

Whatever, the answer is usually yes. While I love new tech and I own quite a bit of it, I have become pretty obsessed with at least understanding what I'm getting myself into and getting my kids into. Part of that's for myself and my family, but obviously, so I can share it with you all as well.

Welcome to Important, Not Important.

My name is Quinn Emmett and this is science for people who give a shit. In these weekly conversations, I take a deep dive with an incredible human like Jen who's working on the front lines of the future to build a radically better today and tomorrow for everyone. Our mission is to understand and unfuck the future and our goal is to help you answer the question: What can I do?

[00:02:00] Let's go talk to Jen.

Jen welcome to the show. Thanks for coming.

Jen Caltrider: Oh, thanks for having me.

Quinn: And thanks to all your pups for joining us today. It's very exciting. They are obviously big supporters of data ethics.

Jen Caltrider: They are. They're great researchers and you might hear them in the background because they have opinions too.

Quinn: Yeah, sure. Oh yeah. My dogs have a lot of opinions. We agree on some of them depending on the time of day. Jen, really excited to dig into all the work you all are doing here. Obviously, it's really important, but we do like to start with one important question to set the tone, it's a little tongue in cheek, but it's usually kind of fun and we get something good out of it.

So, Jen, why would you say that you are vital to the survival of the species? And I encourage you to be bold and honest.

Jen Caltrider: Oh, well, I am absolutely not vital to the survival of the species. The species would probably be better off without me. But, you know, I mean, I don't know that the Caltrider DNA, I mean, it's had a good run, the Caltrider DNA.

[00:03:00] I come from a long line of hillbillies from West Virginia. Rumor has it the line came from Scotland after some brothers, Scottish brothers, got in a fight and kicked one out, and the brawling brother ended up in West Virginia, and those are my ancestors. I don't brawl. I don't drink. I'm gay, so I don't reproduce.

So I guess that the universe determined that I wasn't really vital to the survival of the species. But in lieu of, you know, making the species survive, I try to make my time on the planet as good as possible by living by my motto, don't cringe, leave it better than you found it.

Quinn: That's so wonderful.

There's a thousand ways, a million, million ways to contribute, and I think that's wonderful. Your family history sounds, in the best way, like one of those Showtime shows with the Scottish guys and all of that. My wife loves them. It's great. Have you delved into the ancestry stuff at all, or is that like a data ethics nightmare that I don't want to know about either?

Jen Caltrider: Oh, I would never give my DNA [00:04:00] to any of those DNA analysis places.

Quinn: What about just like the part where you search the public records and the census and all that? Would you do that part?

Jen Caltrider: Yeah, I mean, my family, you know, grew up, you know, my parents are actually cousins. So I, you know, yes, I'm from West Virginia.

And yes, my parents were cousins. They weren't first cousins. They were like third cousins.

Quinn: That doesn't even count.

Jen Caltrider: Yeah, it doesn't really count, especially in West Virginia. And I'm a very proud West Virginian. I do know some history of my family going back a bit, but I just like the story. Who knows if it's true, if the brawling brothers from Scotland and that's how they ended up in West Virginia, but I like it. I, yeah, I will own it.

Quinn: I think that's great. I have done the non-DNA version just 'cause my family's from all over the place. And I thought, you know, it was one of those as-my-grandparents-were-getting-old things. I was like, boy, I better ask these questions now and lock some of this in.

Because like you said, some of it's myths, and that's wonderful. Some of it is half-truths, and that's also wonderful, but I would love to get, you know, as many dates and names as I can sort out and then leave it at that. But [00:05:00] yeah, I'm definitely not sending them my spit. Forget it. No, thank you.

Yeah. On that note, well, thank you. Well, your work is very important and very timely and getting more timely every day, obviously, because again the potpourri of data that is collected and the volume of it is truly astounding every day and, not just collected, but stored and then often sold, as we know.

So I'm really appreciative of what you all are doing. So I want to set the stage a little bit for, I know you all have been doing this work since 2017. Is that right about this particular version?

Jen Caltrider: Yeah, we started Privacy Not Included back in 2017, almost on a dare, just to see, A, if we could do it, if we could review connected products for privacy and security, and B, if people cared. So, yeah.

Quinn: Well, I love that. 2017 was a hundred years ago. And so a lot has changed. But before we dive right into the sort of winners and losers and all that I want to, again, sort of set the stage, but let's talk [00:06:00] briefly about how you developed your standards for these things, how you rate and define the categories and the products, and who developed the ratings and how they might have changed since 2017?

Jen Caltrider: Yeah, well, as I mentioned, we kind of almost started on a dare. You know, Mozilla is mostly known for Firefox, right? We make a web browser and a bunch of other things, but above all that, we have a mission. And the mission boils down to, hey, let's make the internet suck less and work for everybody, which is a laudable mission.

And so one of those things that we wanted to do back in 2017 was be like, hey, there are a lot more connected devices coming on the market. We've got these smart speakers from Amazon and we've got these fitness trackers. A lot of people review products, but nobody's reviewing them for privacy and security. We should do that.

And the idea came up to do a buyer's guide and it was just very generic. Let's do a buyer's guide. And somebody had the idea of putting ten tips to shop for connected products on a PDF that people could download. And I was like, that's not a buyer's guide. We can't do that. [00:07:00] I said, a buyer's guide is where you review products.

And they're like, yeah, but we're Mozilla. We don't review products. And I was like, well, not yet. So yeah. So I stumbled across another researcher at Mozilla who was getting ready to write a white paper about connected products' privacy and security. And I said, you know what's more interesting than a white paper? A buyer's guide.

And so that's how Privacy Not Included was born. I thwarted yet another white paper and turned it into Privacy Not Included. And to develop our criteria, it was very much trial and error. We, well, you know, I sat down and was like, well, what do people care about? They care about whether products can snoop on them, what the products know about them, whether they can control their data, and, you know, I just kind of asked the questions that consumers would ask.

Cause that's what I was. I wasn't a privacy researcher at that point. I was mostly just a consumer. And then fortunately my researcher friend who I worked with had a lot more knowledge than me and bestowed it upon me. And so over the years, we've evolved criteria that look at things like does the [00:08:00] product meet our minimum security standards?

Meaning, does it use encryption, require a strong password, have a privacy policy, and have a way to manage security vulnerabilities? And then we looked at how companies use the data. Are they collecting data on you that they don't need? Are they collecting only the data they do need? Are they using it for the purpose of providing the service they're offering?

Are they using it for a lot of other purposes? Are they going out and collecting data from lots of third-party sources to combine on you to build a big profile? Can you control your data? Can you delete it? How long do they retain it? And then what I call the Facebook criteria, which is the track record: does the company have a track record of protecting and respecting our privacy? Facebook doesn't. They have a long history of bad actions. And so we consider that when we look at companies and review their privacy and security.

Quinn: Thank you for sharing all that. It sounds so comprehensive and necessary. You know, what are you collecting? Why are you collecting it?

How long are you keeping it? Where is it going from there? How are you using it? [00:09:00] All of those things. People come to us just like they come to Mozilla, willingly. No one's forcing them to do so. And as we call it, it's science for people who give a shit. So you're here for some pretty specific reasons, right?

And at the same time, it's interesting. It's always been interesting since the days of Gmail, right? Where everyone goes, it's free! And it's like, sort of, depends on how you put it, right? And the same thing with all of these social media companies that have exploded forever. But you still get that pushback sometimes from folks who go, understood, this company has collected this.

They said they weren't going to. They did anyways. They've got these conversations, these transcripts. They keep them. It's not secure. And, or they sold it. And who, what low-level person has access and all this different stuff. And they go, why should I care? Like, how is this going to hurt me?

And that's besides me just yelling at people, generally. That's what I really want to focus on for a moment is like making clear to folks why this matters. Beginning with the fact that it's not just my [00:10:00] data or your data. Because of geolocation stuff, it's usually combined and played off each other. But why should people really care?

What are real world examples of how this has affected folks?

Jen Caltrider: I, you know, I get the why-should-people-care question a lot. And I have a million different answers. I might just kind of vomit a bunch of them at you and hopefully some of them will make sense. I'm going to start back in, I think, 1998 with one of my favorite movies of all time. I don't know if you've seen Enemy of the State.

Quinn: Oh my god. You mean the never-spoken-about sequel to The Conversation?

Jen Caltrider: Oh, well, I don't know about The Conversation, but I do love Enemy of the State. And it's one of my favorite movies of all time. And it's one of those movies that you can watch now.

What is it? 25 years later? And it's a tech movie. It feels just as relevant now as it did then, and no tech movie does that, right? And so for those of you who haven't seen it, it's with Will Smith and Gene Hackman. It is the perfect story of why we should all care, because you can be a perfect [00:11:00] human being and do everything right, and you could end up in a situation where you have information or data that somebody wants and they're going to take it from you one way or the other.

It's maybe an extreme case, but I'm just putting that out there: if your listeners haven't watched Enemy of the State, go out, watch it now. It's streaming somewhere, I'm sure. But kind of putting all that aside, putting the Hollywood version aside, it's a great version, by the way, but you know, I grew up in rural West Virginia, as we've also established here. I'm trying not to start with "and then the Earth cooled," but I might. I grew up on a farm and it was private, and we ran around and did stuff, and you know, like we messed up, and lots of times nobody knew that we messed up. You know, I spent a few hours treed in a tree surrounded by angry cattle because I was somewhere I shouldn't have been.

And then I learned my lesson not to go there and make the cattle angry. And I feel like, you know, those lessons that I learned as a kid happened because I had some measure of privacy. And I think about [00:12:00] kids growing up today with no privacy. One of the biggest bees in my bonnet is smartwatches, smart trackers for kids. I'm like, let kids be kids. You don't have to know where they are every second. Let them mess up. I'm not a parent though, so I don't know the fear of the other side. But I do think that just in our souls as human beings, we change when we don't have privacy. And I'm not sure that it's a change for the good.

I think that we're constantly on edge, I think we're constantly overthinking things, I think we're doing things that maybe we wouldn't do because we always feel like people are watching and I don't think that's healthy for the human race. And so that's a really big picture of why I think privacy matters.

On a more like day-to-day level, you know, there's so much information out there about us now that's being collected by companies with algorithms now that are just insanely good at collecting data and mashing data points and making these inferences about us, about how smart we all are and what our abilities are and what we like or what we don't like.

And a lot [00:13:00] of that is used to get us to buy stuff, which isn't great. You know, imagine you're a kid and you grow up liking something because your parents wanted you to like it. Turns out you actually don't like it, but the algorithm thinks you like it, so you can't get away from it.

You know, it's, no, I don't like football, I like to paint, but football follows you everywhere, and your guilt is there, and you can't get away from it. That's one thing, but the thing that really creeps me out is the ways that the information that can be bought and sold about us can be used to manipulate us with political messages or disinformation and get us to take actions in our society that aren't healthy, that aren't good for democracy, that aren't good for humanity.

And it's where we're going where it's just, you know, we don't even realize how much the information that we put out through our headphones or our smart trackers or the things we buy online or our cars and where we drive them can be turned into weaponized information that can be used to manipulate us.

And I know that sounds a little tinfoil hatty. I try not [00:14:00] to be too tinfoil hatty, but I'm a privacy researcher. So it just comes with the territory, but that's happening. And just with the information that's out there, they're getting better and better at doing that. And that worries me maybe more than anything.

Quinn: I think that's a fantastic perspective. And again, you know, you get to watch and investigate how the sausage gets made and who gets to make the sausage and where it gets sent and the secret messages inside. So it's understandable to be a little tinfoil hatty. We need to be a little bit. I always want to pair my tinfoil hat stuff, whether that's you know, the exciting version of aliens are coming or not with some real data.

And we're seeing that out there, right? We're seeing, after Dobbs, how, you know, well, I'll put the link in the show notes, there were Facebook messages, which were not encrypted, between a mother and daughter, and how they were taken to court, no longer having bodily autonomy, to put it lightly. There was a report recently.

Was it in The Markup? I'm not sure. Big fan of The Markup, we'll put those guys out there as well, but about how much data cars are collecting all of [00:15:00] the time. It's astonishing. And it really does matter, and your location data really does matter, truly more than ever. And like you said, the stuff affecting kids. I am a parent, and I'm old school, the, I don't know, they've been in the woods for three hours, I'm sure they're fine type of thing. If two out of three come back, we're still doing pretty well. I understand the tracking side of it too, because it's possible, but it doesn't have to be black and white, you know. And it is helpful to really go down that sort of checklist like your colleague proposed, and we've got a sort of similar one here.

It's not a buyer's guide, it's just a checklist of questions you should ask, especially before you put your kid's data out there, because it's really helpful to know that you really can't put it back in the box, right? It's pretty hard to do. So, is it true? You all have been doing this since 2017, and in just the past year, with ChatGPT and things like this and all of the other wrappers that have gone around it, we are just shoveling data out the door. It seems like companies are collecting more [00:16:00] than they ever have. More companies are collecting data than ever. They're all collecting more than they ever have because these devices are so much more capable, from sensors to microphones, whatever it might be.

And often in products that couldn't collect anything before, right? That were dumb, does that hold true or is that me getting too tinfoily? Is it trending that way?

Jen Caltrider: Oh, absolutely. I mean, we're in the midst of launching our holiday buyer's guide, and I reviewed Bose headphones, and I saw a line in their privacy policy that talks about how the sensors can track your head movements now.

And then they go on to say that's data that they might sell, you know, and you're like, my headphones? I mean, what are they going to know, that I'm nodding along to a podcast? I don't know, but it was disappointing. You know, you're like, is nothing sacred anymore? But then you scale it up. I mean, obviously our fitness trackers can collect a ton of information, and you expect that, right?

But did you know that they can tell if you're drunk? They can tell if you're pregnant. And like you said, [00:17:00] once that data is collected, you just have to trust that whoever collects it is going to do a good job of protecting it, respecting it. And that's really hard. And then, you know, we launched our cars and privacy review back in, was it September? And holy hell, cars are a total privacy nightmare.

And I don't think people realize how, you know, it's like, oh, smart cars. It's no, every new car comes with sensors, microphones, cameras on the inside, cameras on the outside, and car companies were kind of getting away with all this insane data collection. Nobody was really paying attention that, you know, Nissan and Kia say they can collect information about your sex life in their privacy policies. It's escalating really fast. AIs are something that we're going to review next year; we haven't reviewed them yet. It is scary. And as a privacy researcher who's been talking about this for years now, it gets a little depressing sometimes.

Sometimes I go to bed and I'm like, does it matter what I do?

Quinn: It's easy to ask that question if you can really see the grand scope of things, right? And if you can see behind the curtain, it's easy to feel like you [00:18:00] are fighting a losing battle. But it does matter, because there are people that care, and the more people that care and voice these things, the more that, you know, hopefully our 80-plus-year-old Congress people will actually do something about it.

As complicated as it is to do something about it. We know that a bunch of states have a variety of kids' privacy laws. I think California's goes the furthest, but if we pass a national one, it would supersede California's, which might water it down. The UK tried to pass one. It was a nightmare for 40 different reasons.

It's not simple, right? This whole thing isn't simple. We saw it a couple of years ago when Apple, who's imperfect certainly, tried to, I don't remember the exact name of it, but basically to detect whether there were naked pictures of children being passed around by automatically scanning.

And everyone again freaked out on the one hand and freaked out on the other hand, and there is no easy answer. Not to say there isn't a right answer, but, and I hesitate to always bring it back to the Ian Malcolm, your scientists never, you know, asked if they should question, but it's complicated, right?

Yeah. It's complicated.

Jen Caltrider: Yeah. The tension between privacy and safety, I'm assuming it's been going on since the dawn of time. But it's really coming to a head now because so many of these products are built to keep us safer. You know, the, oh, the technology in your car is there to keep you safe. And that smartwatch that you can strap on your kid with a camera, a microphone, cellular data, GPS, and vital tracking is there to keep your kid safe.

And they make solid arguments about that. But the flip side is, we're giving up our privacy. Safety is also generally something easier to see than losing privacy. Privacy is kind of invisible where safety is more visible. And so that tension is one that, yeah, I think we all have to grapple with and kind of figure out where we all come down.

And I don't have any good answers. I wish I did.

Quinn: No, there aren't. I mean, there are better answers, which is why you have a buyer's guide, and worse answers, certainly. And I imagine there are companies that trend in both directions, and hopefully some stay [00:20:00] relatively on the good side. And I know one of your examples is, again, the sort of non-Apple Watch type trackers for kids that can make calls and track them and things like that.

What a nightmare that is. I also want to empathize with: well, mom and dad are working two jobs and they're not home till six o'clock, and we can't afford an Apple Watch, which is also probably imperfect. So we're going to do something so our kid can call us when they get home, because they're a latchkey kid.

It's imperfect. And a lot of that is human decisions. That doesn't mean folks like yourselves can't try to make a buyer's guide to make this just a little bit easier, to help people move in a slightly better direction. So let's start with the fun part, which is the losers here. We're going to work to get more positive from here, but what sort of categories, businesses, products, who and what should people just categorically look out for? And then we'll move to some options they might be interested in. And we're not endorsing any specific thing here. This is just data, right? [00:21:00]

Jen Caltrider: Well, I mean, the worst category we've ever reviewed is cars. Because literally every car brand, the 25 we reviewed, earned our Privacy Not Included warning label.

So we couldn't recommend anything to anybody, but you can't stay away from cars, which is what makes it particularly terrible. You know, that's why I've banged the drum of, like you mentioned, we need a federal privacy law, a strong consumer-focused federal privacy law, to protect us from that.

But it's the holidays, you know, and so people are looking at buying more gifts. You know, I still see the cars with the big bows on them in the commercials, so somebody's getting a car as a gift, not me.

Quinn: It's always been amazing to me. I'm like, who gets those? Who gets a car?

Jen Caltrider: I mean, I don't even, I don't even know if I can afford the bow.

It's a lovely bow. It's got to cost a lot. But if you're talking about things that most of us can afford for the holidays, you know, smart speakers are still a thing people get, though I think they're less popular. They're creepy, but they're nice. You know, Amazon.

It's not a good [00:22:00] company. They collect a lot of data. They've gotten fined a lot by the FTC this year. So, you know, Apple makes a smart speaker. They're slightly better. Sonos makes a smart speaker. They used to be good. They landed on our Privacy Not Included list this year.

Quinn: They did. So could you pause? And so I think, and I try to tell people this all the time, Amazon is a nightmare, and the Ring doorbell stuff, and we'll put that all in the show notes, I've talked about it a number of times, holy shit, is that a nightmare? I mean, they're happy to give that data to anybody. And it's watching everything.

And obviously there's all the Alexa stuff, but so we know Amazon, not great. Talk to me about what happened sort of with Sonos, who I'm kind of surprised still exists as an independent company at this point, but what happened there?

Jen Caltrider: Let me just pull up my review of Sonos.

Quinn Yeah, let's do it. Let's do this live.

Jen Caltrider: Yeah, just live. Well, Sonos used to be, you know, okay. I mean, actually a couple of years ago, we had one of their products on our best-of list, in part because it didn't collect a lot of data. Then Sonos [00:23:00] added their own voice assistant, which processed everything locally. Again, you know, that's a good thing.

They were doing pretty good, but this year they earn what we call dings, privacy dings. And this year we ding them for the fact that it's not clear if every user that Sonos collects data on has the same right to delete their data, which might not sound like a big deal, but we see this a lot of times in privacy policies, where it'll say, hey, if you live in California, where they have the strong privacy law, CCPA, you have the right to get your data deleted.

If you live in Vermont like I do, or anywhere else that doesn't have a strong privacy law? Well, good luck with that. We can't confirm that everybody has the same right to delete data. And that's a bummer, because everybody should have the same right to delete their data, right? If anybody wants to read privacy policies, I don't recommend it.

That's what I do for a living. It's exhausting, but a good place to start is always the California section, the California CCPA section, because it goes into more detail about what a company does. And it generally [00:24:00] covers everybody in terms of what data is collected, if not the rights that people have.

And so California defined the sale of data, the sale of personal information, in the CCPA. And it's not just money exchanging hands for data; it's anything of monetary value. And under California's CCPA, Sonos now technically sells personal information to advertisers or third parties for something of monetary value.

They get a ding for that. We ding companies that do that now. All you have to do is earn two dings out of our three and you get our Privacy Not Included warning label. And so I was a little bummed to see Sonos kind of slipping down. I mean, they're still going to be better than Amazon, for sure.

And we see this, we see companies slip. You know, Wyze makes little affordable security cams that didn't used to be terrible, but they've had some serious security issues the past couple of years that they haven't handled great. That resulted in the New York Times pulling their recommendation of them.

We gave them the Privacy Not Included warning label last year and [00:25:00] again this year. And so you're just seeing this kind of slip: companies that used to be okay at privacy are getting worse, and companies that were bad at privacy are getting worse. It's sad. It makes me sad.

Quinn: Yeah, it does make me sad.

And I think the other side, the thing that's helpful to understand, when folks, understandably, I guess, go, okay, and yet this hasn't affected me as far as I know, is that what has been built is a two-sided marketplace where, and again, I know Wired has covered this, and a lot of folks, it is fantastically affordable for a third party to buy, for example, your geolocation data and things like that, and to triangulate it with other data that is being sold by these companies, by your cell phone providers and things like this, to really paint a pretty vivid picture.

And they say it is anonymized, and often, [00:26:00] you know, that is a shell game. Again, not being too tinfoil about it, but collected together is where it really has, you know, a huge impact. So, any other categories that people should just watch out for in general? Can we talk about the smartwatches?

The other ones, I keep wanting to call them off-brand. That's not correct. It's just that they're incredibly popular, and that's why I feel like we should talk about them.

Jen Caltrider: Yeah, smartwatches for kids are kind of a personal thing for me because I would have hated that as a child. And again, not a parent.

So I don't want to look down on parents, but we reviewed two smartwatches that are specifically marketed on their websites to children and vulnerable adults. So, you know, older people with dementia, children with autism, marketed at them. Two of them, they have similar names, but as far as I can tell, they're different companies: Angel Watch and AngelSense.

Angel Watch simply does not have a privacy policy. They have a privacy policy for their [00:27:00] website, where you can buy the watch. But they don't have any privacy policy that we could find that covers the watch or the app itself. The watch and the app are collecting the GPS data, the microphone data, the, you know, location data, on and on.

And there's no privacy policy that we could find anywhere that says, okay, this is the data collected from this surveillance device and app, here's how we handle it, here's your rights to control it, here's how we secure it. Nothing. There was just literally a privacy policy that talks about the privacy for their website.

And it's literally one of the creepiest things I've ever read, just because of the people it's marketed to, children and vulnerable adults, and the amount of information it can collect. And it's marketed for security, but they don't have a privacy policy for the device or the app at all.

And that's really bad. AngelSense, similar, marketed at the same [00:28:00] vulnerable groups. They don't have a privacy policy exactly. They have a paragraph, section eight in their terms and conditions, that they label a privacy policy, and it's a short paragraph. So it's a little bit more than Angel Watch had, but it doesn't go into any great detail about the information that this, again, surveillance device and app can track, and how it's handled and used, and what rights you have.

And so that was really shocking for me to see, because a lot of parents are being told, you know, your child's too young for a phone, but you want to be able to keep in touch with them. These watches are an alternative, kind of for the five to 12 or 13-year-old range, I believe.

And so to have companies that don't even bother with a privacy policy is some next-level awful. I feel okay calling those companies out by name directly because it's bad. It's scary. And I hope that people recognize that. And it's confusing. I mean, you go on the Angel Watch website and they have a link to a privacy policy.

And if you're a [00:29:00] consumer, you might not know to actually read and see, does this just cover the website? Does it cover the device and app? You know, when you go to the app on the Google Play Store and you click on the link to the privacy policy, it takes you to that privacy policy that clearly states it only covers the website.

But as a consumer, you might not know to look for that. It makes me sad.

Quinn: Yeah. I'm glad you keep expressing that same emotion. Just sad. It is. I balance between, like, naked fury and sadness, because it just doesn't have to be that way. It's very frustrating.

Jen Caltrider: It's hard to know if these companies are doing it out of malice, out of oh, we can collect a bunch of data and get away with it, or just out of, they don't know any better.

And I, I honestly don't know which is worse, you know, I mean creating a device that can do that much tracking and not even thinking that you need a privacy policy to cover it, you know, with great power comes great responsibility and you're not showing great responsibility. Or if you're just doing it so you can collect as much data and make money off of it and kind of hide behind that, like they're both awful.

And I [00:30:00] don't know which one it is, but it shouldn't be either.

Quinn: Yeah. Fully agreed. Are there versions of smartwatches, for kids or people in general, that are at least marginally better here? Again, we're not promoting, or, besides those two, specifically shitting on anything in particular.

Yeah. Though, I mean, I'm happy to do it. But again, people are buying these things in droves. What should they look like?

Jen Caltrider: Yeah. You know, we've looked at TickTalk, we've looked at Verizon's Gizmo Watch, we've looked at the SpaceTalk Adventurer. We're actually still in the process of finalizing those three reviews.

They'll be up shortly. Fitbit has their Ace Junior, I think, or their Ace. And then Garmin has two smartwatches for kids, their Vivofit Junior, I think, and their new Bounce. And I actually have a Garmin smartwatch myself. I love my Garmin smartwatch. Garmin is actually one of the rare companies that's made it onto our best-of list, even though they do collect a ton of personal data.

I mean, that's what smartwatches and fitness [00:31:00] trackers and GPS locators do. But it's so nice to see a company be like, yeah, we collect this, but we recognize that this is sensitive personal information, and so we aren't going to try and make gobs of money off of it by selling it or sharing it with advertisers.

We are going to do our best to protect it, to, you know, secure it. They're not perfect, but they've done a fairly good job as far as we can tell. And you always get a little cautious when you want to call out a company for doing good, because that can change in an instant. But so far, Garmin has been good.

And so the Garmin Bounce, it's their new watch, I think it has cellular data, so you can do two-way texting, maybe calling, I can't remember, I'd have to double-check. So if you're a parent and you absolutely need one of those, lean that direction, and away from, especially, anything where you can't find a privacy policy that covers the device and the app. But also, you know, the other thing is some of [00:32:00] these smaller companies don't have the same resources to secure their data that the bigger companies do.

You know, Google has a fairly decent track record when it comes to security of all that data, because they have a gazillion dollars to hire security engineers, right? Some of the smaller companies might not have as many resources to put into securing data. So that's always a concern that we have when we look at some of the smaller companies that collect a lot of potentially sensitive personal information.

Quinn: It does make sense, right? And again, they're very imperfect, your Googles and your Apples, but it's easy to say, well, at least they have a gazillion dollars each, many times over, to build the infrastructure, to hire the people, to do the thing. And they've gotten in trouble themselves. And it just also makes you look at Amazon and go, this has got to be on purpose.

Like, it's crazy. You have just as much money. Like, what a nightmare.

Jen Caltrider: Or Meta.

Quinn: Or Meta. I mean, holy Jesus, it's crazy. What are any other categories to look out for? So smart speakers, smartwatches, cars, [00:33:00] obviously. Anything else? I mean, cameras feel like one that I've always got questions about. On the B2B side, we see all the time that facial recognition is used everywhere, and then some police force will buy something that has like a one percent success rate, and it's usually only booking Black people. But on the consumer side, any bad actors there, inadvertently or on purpose, that people should look out for? Because again, people are putting these in their nurseries, and they're putting them in their kids' bedrooms and outside and all that stuff.

Jen Caltrider: We review the big players, the Amazon Ring, the Wyze, the Eufy, Blink, Arlo. None of them are perfect, you know, and so what I recommend people do is what you should always do with your data: keep it as close to home as you possibly can. The less data that leaves your home on the internet and goes out into the world where you don't have control over it, the better.

And so, I actually have Eufy cams here at my house because they have local storage. I can just pop an [00:34:00] SD card in and it'll store things locally without, you know, sharing it out to a cloud somewhere. You know, video from inside or outside my home, sitting on a cloud, where I have to trust that the cloud is secure and who has access to it.

If it's stored locally, I feel like I have a little more control. Again, Eufy's not perfect. They've had some security issues where things were leaking out, thumbnails of images from inside homes that they promised wouldn't leak, but did. And so it's really hard to recommend any one of them. Wyze has had some security issues where people had views into other people's homes on cameras that they shouldn't have, due to glitches and bugs and mistakes. Amazon Ring has a whole history, as you mentioned, of issues. Arlo's not so bad as far as we can tell. But again, what it really comes down to is keep control of your data. So if you're looking to buy a camera for inside or outside your home, look for one that has local storage so you can keep control of that video.

Keep it in your house instead of in the cloud. You can still access it. It's just not sitting in the cloud where [00:35:00] it's more vulnerable.

Quinn: That makes sense. Is it a pain for people? Cause I've definitely run into this before, which is like, oh, great, it's local, and me going, how often am I going to have to switch out this SD card?

Like, what are we dealing with here? What are the mechanics of that for you and your purposes?

Jen Caltrider: Yeah, I never switch it out. You can set it to record clip lengths of whatever you like, and events, whatever you like. You put a big SD card in and it records to that.

And then once it's full, it'll delete the oldest one. And so after 30 days, if you haven't had a reason to go back and look at an event, it gets overwritten, and you're probably safe. We also have game cameras here on the land that I live on that have an SD card that, you know, you have to go out and check because it gets full, and that's a little bit more work, but, you know, I like to spy on the beavers and the deer.

They don't know about privacy policies, so it's okay. But the ones that you have in your house will just kind of overwrite as things get old. So if you haven't needed to see something, it just goes away. I've never had to [00:36:00] swap out an SD card, so it's handy.

Quinn: I might ask you about outdoor game cameras later offline, because I kind of want to spot some beavers that have been taking down my trees.

Okay. Let's talk about good stuff. Obviously you've got some best-of categories, companies, products. People are going to buy shit. What should they actually be looking for? What should they feel good about spending their money on? Because it's two sides, right? It's not just, am I putting this product, which tracks, again, an elderly person, a vulnerable person, a child, or a young athlete so they can track their running, into their life.

It's that you're also directly, in a very small way, supporting that company, right? You're supporting that category a little bit. Where should they feel good about doing that and feel like, yeah, this is an okay thing to do?

Jen Caltrider: And this is always hard for me, because we don't get paid any money by any company, and I will always keep it that way for the reviews we do at Privacy Not Included, as long as I have a say.

[00:37:00] So, it's always hard for me to say good things about companies, because things can change really quickly. You know, like iRobot, their robot vacuums, they've been very solid on privacy and security for five or six years, and we've put them on our best-of list. And they've been very receptive.

We email every company that we research, and they always get back to us. Nice guys there. Amazon's in the process of buying them, you know, that deal could go through as soon as early next year. And so what happens when you care about privacy and you buy a product that somebody like me says, oh, that's a better product, you can trust it, and then they get gobbled up by a biggie like Amazon? Because every privacy policy I read has some line in it that says, we can share your data in the event that our business is sold. Your personal information is a business asset, so it's gonna go to them. Anything I recommend, I'll add with the caveat that it could change tomorrow. You know, Amazon or Google come in and sweep it up, [00:38:00] and away you go.

You know, I've already mentioned Garmin as a good one. And some of the kids' toys that just don't collect a lot of personal information but are still fun, like the Tamagotchi Uni. My colleague Misha, who's a fellow privacy researcher who lives over in Europe, has asked Santa for the Tamagotchi Uni because it's fun, it's interactive, it doesn't collect data. You know, you can buy some of these interactive toys that aren't going to collect a lot of personal information on your kids.

As opposed to, say, Moxie, which we reviewed. It's a little AI robot. That's a thing now, you know? Robots are integrating these AI chatbots, and Moxie's maybe not the worst at privacy, but they do have AI, you know, like OpenAI's chatbot, in them, and they market themselves as a friend for your kid, to help them learn emotional intelligence and build healthy relationships.

Knowing that, you know, your kid's going to chat with it, and that video and audio is going to be shared back to the likes of Google or OpenAI, as they say in their privacy policy. [00:39:00] And you just have to hope that it's not going to be used in ways that you don't want. They say it won't. You have to hope that it won't.

But at the same time, they tell parents, hey, make sure to tell your child, and this is marketed for kids as young as five, I think, don't share any personal information with Moxie.

Quinn: Do you know what five year olds do? All they do is share personal information. My kids like walk around telling people my social security number.

Jen Caltrider: Yeah, exactly. And it's also marketed as a friend, you know, to help grow emotional intelligence and build healthy relationships. So I'm not sure how they navigate marketing it as that and then telling parents to tell their children never to share personal information. And these are just the issues that we see. You know, it's, oh, maybe this Moxie could be good for children that need that sort of interaction.

But at the same time, of course kids are going to share personal information with their robot friend. I mean, that's just what they do. So that kind of worries us. And what else might be good?

Quinn: Besides literally, like, puzzles and other wooden [00:40:00] toys, what about things with batteries in them?

Jen Caltrider: Yeah. I'm a big fan of books, they never get old as gifts, I don't think. I loved them when I was a kid. I love them now. That being said, I have a Kindle, you know. Amazon knows what I read.

Quinn: I frustratingly use Kindle as well. I mean, the amount of research I have to do for this work, as a liberal arts major that's constantly trying to level up and be able to have semi-intelligent discussions with incredible people like yourself, requires highlighting and notes and collecting all that stuff. And I love my physical books for fiction. A friend recently recommended Kobo, and says he has actually been pretty delighted with it, and it's a way to exit that Kindle ecosystem.

Any reports on Kobo whatsoever before I just jump into another privacy nightmare?

Jen Caltrider: We've reviewed Kobo in the past. Kobo and Pocketbook. Pocketbook is another eBook reader that's actually probably the best eBook reader we've reviewed for privacy. So if you're looking for a non-Amazon-based eBook reader, Kobo is okay.

Pocketbook is really good. We're still updating those [00:41:00] reviews, so hopefully that hasn't changed in the past year or so, but there are options that are good. Pocketbook, I would say, is probably a little better than Kobo, but Kobo is better than the Amazon Kindle. So there are options. But also, you know, it's an eBook reader, and if you're a grandparent and you're looking to buy your kid some device that they asked for, and your choices are like the Amazon Fire tablet or the Kindle, go with the Kindle, because the Kindle doesn't have, you know, the same microphones and cameras, and really you can just read. And if Amazon knows what your kid reads, I don't love it, but whatever, as opposed to knowing searches, apps, games, what they say, what they ask Alexa.

Even among the bad, you kind of just look at the places where less data is going to be collected, or you're like, okay, I'm going to do an analysis of risk and see: they know my kid likes to read books about horses. Not great, but not the end of the world.

Quinn: I think [00:42:00] that's fair. Literally asking, what is it collecting, from sort of the categories on down, really, from biometrics to video to voice to, like you said, she really loves unicorns. Okay, it's not great.

But obviously it could be substantially worse, right? That's how I've justified my Kindle so far, but I was excited to get a recommendation for something else. Because this problem, and this is I think what adults generally run into for themselves, but also in parenting, and probably teachers, is the same thing: there are often essential things.

Some sort of essential tool that you need to use, and obviously for probably most folks a smartphone does a lot of that, or your computer. But, you know, an e-reader, which right now happens to be a Kindle for me, is an essential tool of my work, because it syncs with Readwise and does all these different things and helps me do my job.

So that has been the default. But if there is a better version, I'm [00:43:00] very interested in that. And I think it's the same thing for parenting. Again, your kid's home alone, coming home, whether it's in a city or somewhere rural, wherever it might be. I understand the need to go, I would like to know where they are.

I would like them to be able to call me if they're dumb and turn on the stove and screw it up, right? That's why I really appreciate this work, to say there are marginally better options in a world where everything is relative, right? What would you buy if you were gifting young people under 15 this year?

What would you buy among the things you've sort of reviewed? Again, you're not endorsing anything, no one's getting any money for any of this.

Jen Caltrider: What would I buy if I had to buy for somebody under 15? I mean, again, the Tamagotchi Uni sounds fun to me. I love some of the coding tools, like the iRobot Root, RD3000.

There's some, we haven't reviewed them all, obviously, but there's some coding kits out there that seem like they'd be pretty fun and do a pretty good job of not collecting a ton of data. Again, an e-reader, [00:44:00] where you're like, well, if I give them the Kindle, I can also provide the books, you know, okay.

Or, like you said, there's Kobo, there's Pocketbook, there's some other options for e-readers. I don't know if e-readers are too boring for kids. You know, I don't spend a lot of time around teenagers, so I don't know if that would get an eye roll. If you're in the Xbox versus PlayStation 5 versus Nintendo versus Steam Deck conversation, Microsoft got hit with a big fine from the FTC this year for violating the children's privacy law.

That's something to consider if you're thinking about getting a gaming console. The Steam Deck is actually fairly okay at privacy as far as we can tell. We aren't 100 percent sure they use encryption everywhere, but other than that, their privacy policies don't scare us as much as some. Nintendo used to be on our best-of list; it's fallen off.

I still think that they're probably better than Xbox, but, you know, really, I would probably go with books. I'm with you, I'm with you. You know, I just, [00:45:00] I'm just at the point now where it's like, there are enough devices in the world, and, you know, just putting your hands on something and stopping and disconnecting from the screens and just looking and feeling. Or a subscription to a magazine, you know, those are still things.

And you look forward to it. You run to the mailbox and you check it, you know. Get a subscription to a magazine, or buy a book, or just something physical, Legos, you know.

Quinn: Yeah. No, a hundred percent. A hundred percent. That's really helpful. I appreciate it.

So, you talked a little bit about growing up on a farm, getting cornered by some cattle, making some mistakes, in private. It's the same for me. Very luckily, and it's a little ridiculous, I work, you know, in an office in Colonial Williamsburg above a candy shop, and I've convinced a few other friends who work from home to get rooms, and we always joke about, oh my god, what would high school or college be like now, where someone can live broadcast video anywhere, all the time, to the entire world? Like I [00:46:00] think you'd probably act differently, but for us, if we were transported, we'd probably be in prison pretty quickly, right?

For anything. But at the same time, this isn't how either of us really grew up, being monitored in some way all the time. Why do you do this work? Why did you decide, sort of in 2017, alright, I'm not a privacy researcher, but I do care about this from the consumer side? And why do you keep doing this work?

Why are you so steadfast that no one will ever pay money to be featured in some way, whatever it might be?

Jen Caltrider: I mean, it's my hokey little motto: leave it better than you found it. I really think that privacy is an issue of the day. You know, with so many social movements, there's the people on the fringes that kind of care about it and talk about it and try and make things better.

But until there's this moment in time where something sparks, like the Me Too movement or Black Lives Matter, to bring bigger consciousness to an issue, it's those people that toil off to the side forever that really lay the [00:47:00] groundwork and matter. And I feel like privacy is going to have its moment at some point, where there's a spark that really brings it out, where people get how bad things are getting. And I'll be happy to toil over here and lay the groundwork until that spark happens, hoping that when it does, we can capitalize on it, because it matters to me. You know, I'm a private person, not because I have something to hide, but because it makes me feel more like me.

And when you look around the world and you see the anxiety and the depression and the anger, you know, maybe if people could step back a little bit and find ways to feel more like themselves, because they have some privacy, the world might be a little better place. And so that's why I do what I do. I know it sounds hokey, but I truly believe that we're better off when we have privacy and companies don't know more about us than we know about ourselves and use that to make money.

And so I'll keep toiling over here, and hopefully people will pay attention. I keep having my existential crisis about whether this matters or not, but hopefully people will [00:48:00] keep reading our reviews and supporting our work, and, yeah, we'll make the world better than we found it.

Quinn: I love that. I mean, we can sync up our existential crises as, you know, I'm sure my children will look up someday.

My children, who are just, you know, wildly privileged and healthy, will look up and be like, what did you do when the tide was turning? And I'm like, I had a podcast. I'm sure that'll go really well in the eulogy. But at the same time, I do want to, you know, just briefly, to put a pin in it, come back to what you said, which is that it's not just companies monetizing this data.

It's that, again, and this isn't tinfoily, we're all very aware of this, and the anxiety and depression rates are through the roof and all of these different things, it directly manipulates our decision making, how we think of ourselves, how we think of other people, and how we think of ourselves in relation to, and in comparison to, other people.

Who are often showing, and this is just the Instagram cliché, the best version of themselves. Which is probably ruining their life as well. And we're [00:49:00] human, we have compared ourselves and our tribes and our cities to others for a thousand years, for a hundred thousand years. But we've never been so willingly manipulated in this way.

And I'm hoping that there starts to be some pullback on this sort of thing. But, you know, that does matter. It's not just that they're selling it to whomever, or the life-and-death stakes of some young person being taken to court because she drove to an abortion clinic and wasn't allowed to.

It's that they're manipulating how you make your decisions. It's not just, oh, I talked to a friend about, you know, leisure pants, and look, I got an Instagram ad about it. It often runs much, much, much deeper than that, much more detailed than that. And that really matters, you know, because the decisions we make for ourselves and for our families and as a society and as a voting public, they do matter in aggregate.

You know, they're going to add up. So whether that's injecting yourself with bleach after COVID because 40 people on Facebook recommended it, or whatever it is, [00:50:00] these are things we might not have been privy to before, and maybe we shouldn't be. So anyways, I really do appreciate the work and your perspective on it.

It is very important.

Jen Caltrider: Well thank you very much. I appreciate that you appreciate it.

Quinn: All right. Well I'm not going to keep you for too much longer here. Like I said, I got a few questions I ask everyone and then we'll send you back to saving the world from creepy companies. Does that sound good?

Jen Caltrider: Sounds good.

Quinn: First one, you can take a moment to think about it if you need to, but I'm always interested. Like I kind of mentioned before, there's so many folks that listen to this who might be a CEO or an investor or a student or a journalist or an artist, whatever it is, going, how can I apply myself to something that's maybe a little more world-changing, meaningful, something that moves the needle in some way?

And so one of the things I like to think about and ask my guests is, when was the first time in your life when you realized you had the power of change, or the power to do something meaningful, when it was you or [00:51:00] your crew, whatever it might have been, something you did, where you moved the needle and thought, well, that's interesting.

That's interesting. Look what I can do.

Jen Caltrider: That is a very good question. Well, I don't know if it's the first time, but it's the one that comes to mind, and it's a random story. Holy hell, this is a weird one. This is a weird story, but it's the one that popped into my head. I was in college, and this was back in the early nineties, cause I'm old, and the early nineties were a different time for gay people. It was just a different world. And I don't know if people realize that coming out as gay was, like, not just a thing people did. And I played lacrosse in college, and I remember I came out, and this is such a weird story, I can't believe I'm sharing it.

I remember I was at practice one day, and a person that I played with came to practice with a hickey. And everybody was giving her grief and laughing about it, ha ha. And I went back after practice and I told my first girlfriend, I was like, what [00:52:00] happened? And she's like, let me give you a hickey. And I was like, okay, you know, cause I'm like 20 and dumb.

And so I went to practice the next day with a hickey, you know, and this is my first relationship, I don't know any better. And for some reason, my coach freaked out about the fact that I had a hickey, but didn't freak out about the person that had the hickey the day before. And I was like, oh, okay, whatever.

And I practiced. I was a goalie, you know, so my job was to get pelted with the ball, and that was fun for me. And I didn't think much more of it, but my teammates were really put off by that, and it caused a big furor. I went to school at a small liberal arts college, and as small liberal arts colleges do, it doesn't take much to cause a big to-do on campus.

And for some reason, this story got out from the lacrosse team into the general population of the campus, and it caused a big to-do. And everybody was really mad at my lacrosse coach. I wasn't mad at my lacrosse coach, but everybody else was. And I thought, well, this is a problem.

And I remember going to practice the next day, and [00:53:00] I had to stand up and talk to the whole team about how it was okay. You know, this is a learning opportunity, and maybe the way the coach handled it wasn't great, but let's talk about this and not lose our minds over something that might not be that big of a deal, or it might be, I don't know, but let's talk about it.

And it was just a weird situation for a pretty young, dumb, naive kid to be in, to navigate this thing where, looking back, the coach had obviously screwed up, but I didn't want anybody to be mad. I didn't want anyone to be upset, because it was a dumb thing. And I just remember talking to the team and trying to be like, let's not be mad.

Let's not blow this out of proportion. Let's talk about this. And that's what happened. It got resolved, and then everybody was happy. And I don't know if that's the moment that I realized that you can make a difference in the world, but that was the moment where I was forced to talk about something that made me really uncomfortable, that I didn't want to talk about, in the interest of trying to make things better, because things were bad for our team because of this [00:54:00] incident. It was just a really dumb, small thing looking back on it, but I very clearly remember that feeling of standing there and having the whole team look at me while I'm talking to them about something I really didn't want to be talking about.

It was very personal, and how uncomfortable it made me, but how it resolved the issues. And so maybe that was the first time that I realized it, and the biggest lesson that I took away is that it's that discomfort, those moments where we're not comfortable but we push through, that's where change happens.

And anytime we try and avoid that discomfort, too often that means we're not really making change. And so maybe that was the lesson I learned from that.

Quinn: That's beautiful. Thank you. Thank you for sharing that. It's a wonderful, yeah, pretty telling story of the nineties, certainly. So much has changed now, and obviously a lot has not.

I appreciate you sharing that. Did someone make you stand up and talk to the team or did you say, I got to do something about this?

Jen Caltrider: I don't recall, but I don't recall anybody making me [00:55:00] do it. I just think it was like, I knew that people were mad about something that there was like some misunderstanding about, there was like...

Quinn: On your behalf, right?

Jen Caltrider: On my behalf, yeah. And it was like, it's not what I wanted. And I hate to think how that incident would have played out in the digital age. You know, this happened word of mouth. And it was a combination of, you know, a little bit of ignorance, a little bit of misunderstanding, a little bit of young people ready to jump on something.

None of it felt right. And, you know, you step back and go, what's my goal here? Well, my goal is I want people to not be mad, and I want our team to win the next game. And I don't feel like that's going to happen unless I address this. And so, you know, it sounds weird. It's not like I'm making excuses for the coach, because the coach was young and made a mistake. But at the same time, just getting mad and yelling, the way that was being handled on either [00:56:00] side, wasn't productive. And so, trying to find a productive solution to bring people together to have the conversation.

I think that was maybe like that was the, maybe the first time that I realized that's how you had to handle it, which, you know, when you're 19 and 20, it's like, you know, you're just used to reacting with emotions, and so, yeah, it was, it had to be very measured. I feel like there are some members of Congress that haven't learned this lesson yet.

And so, yeah, it was interesting. It was a very uncomfortable situation, but it worked out.

Quinn: Responding with measure, with temperance is not, certainly not what I did at 19 or 20 to literally anything.

Jen Caltrider: It's also not what the internet does, right? The internet doesn't do that. That's not what gets clicks.

People like to be outraged. And so, yeah it's interesting to think how that could have played out differently now as opposed to then.

Quinn: You're so right, though. I went to Colgate, a small liberal arts school in the middle of nowhere, zero degrees most of the time. And you're so [00:57:00] right that the smallest thing could scale across almost 3,000 kids so quickly, and you'd be horrified that everyone knew.

And that's as far as it went, usually, versus today, where scale means something completely different, whether from the technology or whatever. Well, that's super impressive and speaks a lot to where you are today, which is trying to help people understand, you know, the right thing about privacy, which matters.

Thank you for sharing that. Who is someone in your life who has positively impacted your work in the past six months?

Jen Caltrider: Past six months? My work? I know, it's a very specific one. I mean, this might sound lame, but it's not. My wife, probably. My wife is a violin maker. Which is, like I said, the coolest job in the world.

She takes a stick of firewood and turns it into a beautiful instrument. That's what she does for a living. I read [00:58:00] privacy policies for a living, so she wins. But she's also really smart and loves to do research. She'd be a way better researcher than me if that were her job. And so when I'm sitting here reading the 8 million privacy policies of cars and trying to decipher them, or trying to pull a holiday guide together, and I go up there and tell her what I'm seeing, she's so quick to pick it up and put perspective on it and give me, you know, "Oh, I heard this."

She listens to podcasts. I don't, and she'll tell me what she heard on a podcast. But in the midst of all of that, too, she kind of volunteers to help out with Privacy Not Included, like she enters our content into the CMS, because we get really busy and we're a small team of three and we don't always have time to manually enter stuff in the CMS. And she's like, oh, I need a break from, you know, carving this violin scroll.

I like to do data entry in the computer. So she'll just do it and doesn't expect anything. And it's just the kindest thing ever. And I wish everybody could have a partner like my [00:59:00] wife, who is kind of perfect. So, I will give a shout out to her. She is she's pretty wonderful.

Quinn: That's a fantastic answer. I too would be, I don't know, warming my hands over a trash can somewhere if it weren't for my wife, just the greatest human alive next to your wife. Same thing. There's truly nothing better. I wish it for everyone. Last one, easiest answer, maybe: what is a book you've read in the past year or so that has either opened your mind to a topic or an idea you hadn't considered before, or actually changed your thinking in some way?

We threw all these up on Bookshop.

Jen Caltrider: I read privacy policies all day long. So at the end of the day, I want to go...

Quinn: At the end of the day, you want to do what?

Jen Caltrider: So yeah, at the end of the day, I want to go and read dumb stuff. I'm a fiction reader. It doesn't have to be highbrow. It doesn't have to be anything, you know, if it's formulaic and there's 10 books in the series, the better.

I love those. So I currently I'm reading a book by, I [01:00:00] can never, I, you know, I have a Kindle, so I never know the book titles or the authors. He wrote, it's the third book in a series. He wrote a Blitz, it's called Blitz. And I think he's an Australian guy who works for like the department of transportation or something and writes in his own, he wrote the first two books in the series were called Rook and Stiletto, and this third book is called Blitz. And the idea is that there's people have these like supernatural abilities and in Britain they exist and they get found and they become part of a super secret government organization that helps fight back against supernatural, it's, and it's very funny and It goes very much to privacy.

These supernatural abilities have to be kept very private for the safety of the world. But I love, I just love those books because they're just easy to read. And he makes me laugh because he's talking about government bureaucracy in the context of the supernatural kind of government agency.

And so maybe this isn't any kind of book that you really, your readers are going to be like, Oh, I got to go read that book to [01:01:00] make me smarter or anything, but it's just, I really liked his books. I really liked his book. So I can't think of his name, but the first book is called Rook. The second book is called Stiletto and the third book is called Blitz.

And whatever your name is, author, thank you for making me fall asleep easier every night.

Quinn: Daniel O'Malley.

Jen Caltrider: That's who it is. Good work, Google.

Quinn: Perfect. Yeah, I know. Great. Now they're going to save that forever. Can't wait to get a Daniel O'Malley ad later. That's fantastic. No, the turn-off-your-brain books are, and I'm fully a believer in this, probably more important than the work books I read, because if I didn't have the one, the other doesn't function.

I mean, I couldn't go further from turning my brain off at night. My wife is like, oh, someone recommended this book, it's about climate change, and it's a story. I'm like, no. No, that doesn't come home. Absolutely not. I can't do it. It's got to have wizards and magic and like some sort of spaceship, truly.

Jen Caltrider: Absolutely. Yeah. Yeah. Wizards and magic. Big fan.

Quinn: Yeah. Great. Just like you said, printed out of the machine. I'll [01:02:00] take 20 of them. I can't thank you enough for all of your time today. It's been most of your day so far. I imagine so many dogs have to go outside. This is a funny question to ask in the context of this conversation.

Where can our listeners follow you and your work online? If at all, where can we find the buyer's guide and all that jazz?

Jen Caltrider: Yeah. Well, please don't follow me. I'm a privacy researcher. But follow privacynotincluded.org, the website where we post all the reviews of all our products. Our holiday guide is going up now as we speak.

I think it launched today. So we'll be adding products to it throughout the holidays. So go read about your fitness trackers and health exercise equipment and the like before you buy them. Sauna apps are interesting. PrivacyNotIncluded.org. I don't really put myself out there on social media because...

But you can always email me if you like. My email address is on privacynotincluded.org, if you do a little [01:03:00] searching. So that's about as much as I put myself out there. I do like to get emails from people. I hear a lot of fun things from people. So, yeah, old school.

Quinn: I love that.

Well, thank you so much for all this. Thank you for this work. It is vitally important, I think, and it'll compound and become even more so the more we try to make productive, personal use of the data that we produce every day, and have been producing for hundreds of thousands of years, we just never knew how to process it or collect it or any of that jazz, but do it in a way where we can wrangle it and people can be protected, especially kids and vulnerable folks, of which the list is myriad, obviously.

So thank you so much for all this and thank you for your time today. I really appreciate it.

Jen Caltrider: Well, thank you. I really enjoyed talking. I hope you have a good day with your dogs.