Episode 68

Doorbells Ring Hollow

What on earth were Ring and Amazon thinking when they aired a Super Bowl advert that previewed a “there’s nowhere you can hide” dystopian future masked as a way to find your lost dog? With cameras everywhere, are we safer or just more exposed? When camera data is deleted, is it really gone (spoiler alert: not necessarily)? Are we approaching a new location for the “creepy line,” or as a society are we content to trade privacy for security? And what happens when glasses with cameras become more pervasive? Are we all on cam all the time, whether we like it or not?

Show notes:

  1. Ring Super Bowl Advert - https://www.nytimes.com/2026/02/19/business/ring-super-bowl-ad-privacy.html
  2. Decoder Podcast - Let’s talk about Ring, lost dogs, and the surveillance state - https://youtu.be/QQjW68B7s8g
  3. Ring and Flock cancel partnership - https://techcrunch.com/2026/02/13/amazons-ring-cancels-partnership-with-flock-a-network-of-ai-cameras-used-by-ice-feds-and-police/
  4. Savannah Guthrie Nest Video Retrieval - https://www.theverge.com/tech/877235/nancy-guthrie-google-nest-cam-video-storage
  5. Apple San Bernardino Matter - https://epic.org/documents/apple-v-fbi-2/
  6. DJI robot vacuum cameras accessible via Internet - https://www.theverge.com/tech/879088/dji-romo-hack-vulnerability-remote-control-camera-access-mqtt
  7. UniFi Protect cameras - https://geni.us/isNyY2
  8. Zuck in court to testify on social media addiction - https://apnews.com/article/mark-zuckerberg-trial-testimony-instagram-c8cbaa32ccbf4933ec3a7beebd6cf34b
  9. Glassholes are back - and forbidden in court - https://www.cbsnews.com/news/meta-trial-mark-zuckerberg-ai-glasses/
  10. Movie recommendation - Happy Gilmore - https://geni.us/v96XEgb
  11. Meta/Facebook studies on addictiveness of social media - https://www.cnn.com/2026/02/23/tech/facebook-researchers-study-addictive-features
  12. LinkedIn/Microsoft verification data being shared with many others, including Persona - https://thelocalstack.eu/posts/linkedin-identity-verification-privacy/

Some of the links in the show notes contain affiliate links that may earn a commission should you choose to make a purchase using these links. Using these links supports The Great Security Debate and Distilling Security, so we appreciate it when you use them. We do not make our recommendations based on the availability or benefits of these affiliate links.

Transcript
Speaker A:

Welcome to The Great Security Debate.

Speaker A:

This show has experts taking sides to help broaden understanding of a topic.

Speaker A:

Therefore, it's safe to say that the views expressed are not necessarily those of the people we work with or for.

Speaker A:

Heck, they may not even represent our own views as we take a position for the sake of the debate.

Speaker A:

Our website is greatsecuritydebate.net and you can contact us via email at feedback@greatsecuritydebate.net or on Twitter at @securitydebate.

Speaker A:

Now let's join the debate already in progress.

Speaker B:

So part of me, I mean, granted, this is kind of a beat-up conversation, but you probably remember, like four years ago, when we talked about, was it Amazon or whoever, coming up with Sidewalk, right?

Speaker B:

Oh yeah.

Speaker B:

Like oh yeah.

Speaker C:

Sidewalk Toronto.

Speaker B:

Yeah, yeah.

Speaker B:

We were like no good.

Speaker B:

No bueno.

Speaker B:

Like the ring commercial shown at the Super Bowl.

Speaker B:

Like I'm watching that and I'm thinking,

Speaker C:

what were they creating a ring police state?

Speaker B:

Like, oh but Eric, it was the ring dog state.

Speaker B:

This is for the dogs.

Speaker B:

Yeah, all dogs go to heaven.

Speaker B:

We're going, this is for the dogs, Eric.

Speaker B:

We did it for the dogs.

Speaker B:

This has nothing to do with gathering information on everything you do.

Speaker B:

Now granted, the cameras I currently have can't see all the way down the street, so say somebody parked at the beginning of the subdivision and walked here.

Speaker B:

Right.

Speaker B:

And I wasn't able to get their vehicle on camera. Now I know that Ring most likely has their vehicle on camera, because every Ring is gathering that information.

Speaker A:

So let's do a quick synopsis of what this is.

Speaker A:

The Super Bowl advertisement from Ring, owned by Amazon, said: if you lost your dog, activate the dog search on Ring, and all of our Rings will keep their eyes open for your dog as it moves through the neighborhood.

Speaker A:

Neighbors are good neighbors, and they'll help you, and their Rings will help you find your dog.

Speaker A:

It aired and I was in a room of normal people.

Speaker A:

Like not technologists.

Speaker B:

Not normal.

Speaker C:

Yeah, no.

Speaker C:

Wow.

Speaker C:

Ouch.

Speaker A:

I'm hurt. Yet you still know exactly what I mean.

Speaker A:

And in this room of normal people, the look of incredulity was instant.

Speaker A:

And now I'll couch this with: I'm in Ann Arbor.

Speaker A:

So there's a little bit more of a heightened sensitivity. It's a university town.

Speaker A:

It's a bubble.

Speaker A:

We know, we all know it, we all love it.

Speaker C:

That's the best college town in the country.

Speaker A:

The look was absolutely priceless, because it was not just Daniel shaking his fist at the cloud. It was a whole room of average humans going, this is not good.

Speaker C:

Okay.

Speaker C:

And so they're not normal anymore.

Speaker C:

They're just average.

Speaker C:

Gotcha.

Speaker C:

That makes us above average.

Speaker A:

So you create a distinction between normal Americans and average Americans for Dan's friends

Speaker C:

that were with him watching the Super Bowl.

Speaker A:

Yes.

Speaker A:

And so, the blowback.

Speaker A:

And I'll put links in the show notes.

Speaker A:

The blowback was monumental and almost universal, because the comment out of one of the people in the room was, yeah, but how long until they start doing this for people?

Speaker B:

Right.

Speaker A:

And to some extent, they already are.

Speaker A:

There are features in the Ring doorbells called, I think it's like, knowns, or friendly people, or something.

Speaker C:

Yeah, I can do the same thing with Nest that I can.

Speaker C:

You know, if somebody's new, I can give them a name, and then it'll tell me anytime they show up again.

Speaker A:

Right, and you can.

Speaker A:

And in a subsequent interview on The Verge, I listened to the founder of Ring talk a little bit about this, and apparently it can also be used to quash automations or enable automations.

Speaker A:

If I see Daniel come home, open the garage door. There's all sorts of things you can do with facial recognition.

Speaker C:

Totally.

Speaker C:

It's the same guy who walks his dog by and lets them go to the bathroom on my lawn so I can turn on the sprinklers every time they walk by.

Speaker A:

There you go.

Speaker A:

Exactly.

Speaker A:

Exactly.

Speaker B:

Awesome automation.

Speaker B:

Cool workflow scenario there, guys.
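
As an aside, the kind of automation being joked about here is trivially simple to express once a camera hands you a recognized-face label. A minimal hypothetical sketch in Python; the labels and action names below are invented for illustration and are not any vendor's actual API:

```python
from typing import Optional

# Hypothetical mapping from a recognized face label to a home-automation
# action. Labels and action names are invented for illustration only.
AUTOMATIONS = {
    "daniel": "open_garage_door",
    "dog_walker": "run_sprinklers",  # the sprinkler joke above
}

def on_face_recognized(label: str) -> Optional[str]:
    """Return the action to fire for a recognized face, or None for strangers."""
    return AUTOMATIONS.get(label.lower())

# A known face triggers its automation; an unknown face triggers nothing.
assert on_face_recognized("Daniel") == "open_garage_door"
assert on_face_recognized("stranger") is None
```

The privacy question in the conversation is not this lookup, of course, but the facial-recognition database that has to exist to produce the label in the first place.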

Speaker A:

So let's not kid ourselves.

Speaker A:

This is already being done.

Speaker A:

Yeah, but then there's one more piece.

Speaker C:

But wait, there's more.

Speaker A:

Turns out Ring, for the last six months or so, has been talking about a new partnership with Flock. Flock Security, Flock Safety, Flock Camera, Flock something. The other camera company that cities and companies buy and put in their parking lots, with some of the most amazing LPRs, license plate readers, that ever existed.

Speaker A:

And since you're in public, you really have no right to privacy.

Speaker A:

Or, again, in the US, there's no expectation of privacy.

Speaker A:

And so they collect it.

Speaker A:

They collect it, they mine it, they make it available for search.

Speaker A:

And it is.

Speaker A:

It's Flocking nuts.

Speaker A:

And so, you know, these kinds of things are all part of it.

Speaker A:

So since then, Amazon has said, and this is how big the blowback is, we've canceled our relationship with Flock.

Speaker A:

No ring data was sent there.

Speaker A:

I don't know that that makes me feel any better because Amazon is also a data hoarder in these kinds of things.

Speaker A:

But the interesting thing that the founder of Ring said, and I don't have his name off the top of my head, was that this was all in the name of creating security in your neighborhood through monitoring that quashes crime.

Speaker A:

Basically, if we put cameras everywhere, everyone will know they're being watched, and therefore there will be no crime.

Speaker A:

Wouldn't you want that for your neighborhood?

Speaker A:

And so that's the background.

Speaker C:

Yeah, it's really no different than the conversation we had around what Apple was talking about doing at one point, being able to pick up on pictures of children that were being sent around, which they backed off of.

Speaker B:

Right.

Speaker A:

Yeah.

Speaker C:

I think we mutually agreed that, hey, the intent there is correct.

Speaker C:

But are we now going too far, if we take children out of the equation, with what we're doing and moving it into the phone?

Speaker C:

And it's another one of those that under the guise of good intentions, we continue to create this monolithic ability to monitor everything we do now.

Speaker C:

Dangerous.

Speaker B:

I'm gonna throw this out there.

Speaker B:

Okay.

Speaker B:

The idea, a little bit of conspiracy theory here.

Speaker B:

Now, take in the whole Guthrie abduction, right?

Speaker B:

In the timing of.

Speaker A:

Why don't you.

Speaker B:

Commercial.

Speaker A:

Go through some of the background on that, would you please, Brian, on how the cameras play in and some of that, because I think there's some ties there.

Speaker B:

Yeah.

Speaker B:

So, going back to the original abduction, I don't know all the details, because I kind of don't have a lot of news feeds in terms of some of the news that comes out.

Speaker B:

And as I was approached by people, like, hey, did you hear about this?

Speaker B:

Did you hear that there was camera data and they.

Speaker B:

They have the guy.

Speaker B:

Right.

Speaker B:

And I'm like, what are you.

Speaker B:

Who is this guy, and what is this Guthrie that you speak of?

Speaker B:

Right.

Speaker B:

And so then I'm looking it up, and I'm like, wow, I really have been living under a rock.

Speaker B:

Like, so national news reporter, anchor, or TV show host.

Speaker B:

Don't know the name.

Speaker B:

Dan.

Speaker B:

You could probably fill me in.

Speaker B:

Her.

Speaker A:

I actually don't know.

Speaker A:

I know of her, but I don't know anything about her or what role she plays other than she's on TV news.

Speaker B:

Yeah.

Speaker B:

And, you know, was supposed to travel over for the Olympics.

Speaker B:

Had to be replaced, because her mother was abducted just before the Olympics were going to start.

Speaker B:

And she was, you know, an on-camera anchor, reporter, somebody.

Speaker B:

Right.

Speaker B:

But her mother was abducted, they were pretty sure of that, but there was, you know, little information coming out here and there.

Speaker B:

Hadn't heard anything for, you know, 48 hours.

Speaker B:

But then this camera data comes out, right?

Speaker B:

That's leaked and the mother had a ring camera but didn't have a subscription.

Speaker A:

It was a Nest camera.

Speaker A:

Just, just.

Speaker A:

I'm sorry, it was a Nest camera.

Speaker B:

Yeah, Nest camera, but didn't have a subscription.

Speaker B:

Right.

Speaker B:

For keeping.

Speaker B:

Whether it was the data or the monitoring, etc.

Speaker B:

But somebody was able to go back in, find this data because the data was captured and it shows somebody showing up to the house, mask, backpack on, etc.

Speaker B:

And spray-painting the camera system.

Speaker B:

So now they, they're like, so there is a person that came to the house.

Speaker B:

There was somebody this height, this build, this size, spray painted the camera.

Speaker B:

Like there's some clear evidence that somebody came there, obstructed the camera and then came in.

Speaker B:

So at first this was like, this is great news, right?

Speaker B:

Like there's a camera.

Speaker B:

Well, the catch to that camera was that the person who owned it didn't pay for the subscription to keep that data.

Speaker B:

So how was it keeping the data and where was that data being kept?

Speaker B:

And now, how did people get access to that data, etc.?

Speaker A:

And it's been shared. And one other element is, it was claimed that the data was deleted. Like, this whole idea of data deletion.

Speaker A:

It raised the question to me: how deleted is deleted, and when is it actually deleted?

Speaker A:

I mean, those of us that have worked on forensics and recovery of drives know that deletion isn't deletion.

Speaker A:

It's just a removal of a pointer to a data element.

Speaker A:

Even in modern SSDs, the data is spread everywhere, but you can figure out how to put it back together with enough time, energy, and interest.
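
A toy model of the point being made here: a typical "delete" removes only the pointer (the directory entry), not the underlying bytes, which is why forensic carving can often recover "deleted" data. This is an illustrative sketch, not how any particular filesystem or camera cloud actually works:

```python
class ToyDisk:
    """Toy model: deletion removes the pointer, not the stored bytes."""

    def __init__(self):
        self.blocks = {}     # block_id -> raw bytes on "disk"
        self.directory = {}  # filename -> block_id (the pointer)
        self._next_block = 0

    def write(self, name, data):
        self.blocks[self._next_block] = data
        self.directory[name] = self._next_block
        self._next_block += 1

    def delete(self, name):
        # Only the directory entry goes away; block contents remain
        # until something overwrites them.
        del self.directory[name]

    def carve(self):
        # "Forensic" pass: walk the raw blocks, ignoring the directory.
        return list(self.blocks.values())

disk = ToyDisk()
disk.write("video.mp4", b"front-door footage")
disk.delete("video.mp4")

assert "video.mp4" not in disk.directory      # gone from the directory...
assert b"front-door footage" in disk.carve()  # ...but still recoverable
```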

Speaker A:

So all of these things come together to say that there's a lot of monitoring going on, and I don't think anybody denies that.

Speaker A:

And it's back to the, back to the balance of what is security versus privacy.

Speaker A:

And when will Americans say this is too much?

Speaker B:

And let's take this in conspiracy theory here, right?

Speaker B:

So that comes out, and immediately there's people like, thank gosh this camera was holding this data so that they were able to see this, there's some clear evidence, etc.

Speaker A:

And there are people that think that.

Speaker B:

Yeah, yeah.

Speaker B:

And it reminds me of, remember the attack that happened out in LA, right?

Speaker B:

And the shooting that took place, this was years ago.

Speaker B:

And the U.S. government said there is a possibility of additional terrorist cells.

Speaker B:

We need access to this terrorist phone.

Speaker B:

Apple's not giving us access.

Speaker B:

And they put on a great parade in front of America where it was like, I cannot believe Apple is not doing this and helping us.

Speaker B:

Right?

Speaker B:

And it was like, well, no, we put these things in place so nobody has access to that.

Speaker B:

You shouldn't.

Speaker B:

We did this.

Speaker B:

So there wasn't a back door or a loophole, right?

Speaker B:

And they stood by it and fought,

Speaker A:

but, well, there was nothing to fight about.

Speaker A:

They literally didn't have the keys.

Speaker C:

Right.

Speaker A:

They designed it without the ability to backdoor it.

Speaker A:

So it wasn't a fight.

Speaker A:

It was, I can't do that.

Speaker A:

I do not have the thing you need.

Speaker C:

These are not the droids you're looking for.

Speaker A:

Yeah.

Speaker B:

And the sad part of that story, too, is that the US government was able to use a different tech firm out of a different country to break into the phone anyways, and had access to it in a very meaningful amount of time, but still went on this parade for months, right?

Speaker B:

Trying to draw consumer advocacy and American attention to, well, Apple needs to do this, and this is why.

Speaker A:

And how did it work?

Speaker A:

I actually don't remember what the general sentiment was at the end of that.

Speaker A:

Like, did America, as though America ever speaks with one voice, did America say they should have opened it, or that they should have built it with a back door, or was it the other? I don't actually know.

Speaker C:

Another episode of the Kardashians came on and it just disappeared.

Speaker A:

Shiny.

Speaker C:

Yeah.

Speaker B:

So there was a sentiment, right, that, I can't believe Apple isn't doing this.

Speaker B:

And if you go back and watch some of it, I thought Apple handled it pretty well. They took a line in the sand, like, no, we're never going to go back and do this.

Speaker B:

What they didn't do was create marketing to explain to everybody why.

Speaker B:

And I think it was probably the right thing not to do, because it's almost like now we're going to fight you, the government.

Speaker B:

Right.

Speaker B:

And depending on who your leader is at the time, that can make life very difficult for you as a business.

Speaker B:

So they took their stance, explained why that's not going to be done.

Speaker B:

There was a little bit of fallout there over that next 12 months between the administration and Apple, but at the end of the day, I don't even think people remember that or think about it. But then this happens, right?

Speaker B:

And people, you know, right away were like, well. This is how I found out about it.

Speaker B:

A significant other and another person sitting together were like, Brian, did you see the news?

Speaker B:

And I'm like, what news?

Speaker B:

And they explained it and they're like, yeah, but the camera was able to catch all the information.

Speaker B:

Like, thank gosh, right?

Speaker B:

And I was like, huh?

Speaker B:

So I start to do a little bit of background research, and I'm like, actually, guys, that's not a good thing.

Speaker B:

And they're like, what do you mean that's not a good thing?

Speaker B:

Right?

Speaker B:

And I'm explaining it.

Speaker B:

But that first feeling you have, right, as a citizen, is thank gosh, right, that that worked and that the camera caught it, because now it's the first part of identifying who this could be.

Speaker B:

Which, I mean, I'm not sitting here saying I'm against catching who did it, or against getting the mother back. I'm all for that.

Speaker B:

But, like, at what point? Let's say your phone, that you've been taking videos of your kids and your family and events on, and every two years you decide to clean it up and delete some of it.

Speaker B:

Is it deleted?

Speaker A:

Depends on where you store it, right?

Speaker B:

And then, in what 500-page terms and conditions that I clicked to agree to for this app, did it somewhere in there say, even when you delete it, it's still going to be there, but we're not going to use it?

Speaker B:

But guess what?

Speaker B:

We updated our terms and conditions a year later, and we are able to use that.

Speaker B:

And we do have that.

Speaker B:

Like, and is this why there are so many data centers, compute centers, being built?

Speaker B:

Like, guys, we have all the data, and the new T's and C's give us access.

Speaker B:

Let's go, baby, let's dump it all in.

Speaker A:

But let's even talk about this without nefarious reasons.

Speaker A:

There's a trend in modern SaaS systems toward a deletion retention policy that says data will be deleted upon demand or at the end of your contract, notwithstanding backups, which will age out normally over the course of some months, 6 to 12 months, but which will contain the data until they've aged out, because backups are done in bulk.

Speaker A:

Backups are done, you know, whatever the reason.

Speaker A:

And so I think there is a legitimization of it there, which means data exists in most SaaS services beyond the deletion. Which, if you've spent time with lawyers, if you've spent time in any kind of discovery, should send you into the corner and make you start weeping gently and rocking.
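
The retention window described here can be made concrete. A hedged sketch, assuming (hypothetically) bulk backups taken periodically and a fixed backup retention period: data deleted on demand still lives in the last backup taken before the deletion, until that backup itself ages out:

```python
from datetime import date, timedelta

def backup_purge_date(last_backup_before_deletion: date,
                      retention_days: int) -> date:
    """Date on which the last backup still holding the deleted data ages out."""
    return last_backup_before_deletion + timedelta(days=retention_days)

# Hypothetical numbers: data deleted mid-January, last bulk backup taken
# January 1, 12-month (365-day) retention -> the bytes persist in backup
# media for nearly a year after the "deletion".
gone_on = backup_purge_date(date(2026, 1, 1), 365)
assert gone_on == date(2027, 1, 1)
```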

Speaker C:

But yeah, well, a lot of those contracts are usually written as, we have the right to delete it past this point. It doesn't definitively say we're going to delete it. It's just, it's our decision.

Speaker B:

And, Eric, using that in context.

Speaker B:

Right.

Speaker B:

If data is the new gold or the new oil, how many people do you see being like, yeah, I just want to get rid of this gold bar, it takes up space?

Speaker B:

Yeah, I want to get rid of this barrel of oil.

Speaker B:

Considering oil's at a good price, I'm just going to dump it.

Speaker B:

Right?

Speaker B:

Like, yeah, they have the ability to.

Speaker B:

But if that data is worth something to them to be able to monetize.

Speaker C:

Oh, totally.

Speaker B:

So, the conundrum: are all of you guys going out and buying Ring cameras?

Speaker A:

I haven't had an Alexa or a Ring or anything data-collecting that I can't ensure is encrypted. Sure, I had an Alexa in for about a week, went nope, and have not had one since.

Speaker A:

All the camera work that I have is local, and it records locally in the house.

Speaker B:

Eric.

Speaker C:

I'm the opposite.

Speaker C:

I got Nest everywhere.

Speaker A:

Well, actually, a quick question.

Speaker A:

Do you, Eric, do you have cameras in the house or only around the outside?

Speaker C:

Both.

Speaker A:

Okay.

Speaker A:

See, even that I won't put cameras in the house.

Speaker A:

There's a whole other level of that. There was an interesting piece on DJI's robot vacuum. Apparently vulnerabilities exist to the point where the camera was remotely accessible very easily.

Speaker A:

It's just, if you don't put the camera in the house, no one can see in the house.

Speaker B:

Yeah. And I have all UniFi now; I just replaced the Ring camera with my UniFi camera system.

Speaker B:

I don't have any cameras inside the house.

Speaker B:

I have cameras outside of the house.

Speaker B:

I do have Amazon Alexa, and, Dan, this one's tough. Because when I'm doing my cooking and I want to know what the weather is today, I can ask it and it tells me. Or if I'm making my shopping list, like, from a cook's perspective, when I'm running out of basil or realize there's only, you know, one head of garlic left.

Speaker B:

But I have plans to make X, Y and Z.

Speaker B:

It's like you tell yourself, oh, I need to do that.

Speaker B:

But when you're done cooking and you try to remember back all the things that you needed, you don't write it down, and you don't do it while you're cooking because your hands are greasy, messy, etc.

Speaker B:

So I just let the device know.

Speaker B:

Please add this to the shopping list.

Speaker B:

Please add this to the shopping list.

Speaker B:

Please add this to the shopping list.

Speaker B:

Right.

Speaker A:

That you can conveniently buy right from Amazon.

Speaker B:

Well, I don't buy right from Amazon, because, like, I have this grocery store I go to for the produce.

Speaker B:

This one for the fresh herbs, right?

Speaker B:

And it's funny because, like, every now and then when I'm at the store, I have to giggle because there's something on there that I know I didn't put on.

Speaker B:

Right?

Speaker B:

And it's a giant bag of marshmallows that the kids wanted on there.

Speaker B:

Right?

Speaker B:

And it's like, those guys almost got me.

Speaker A:

Right.

Speaker B:

So I do use it.

Speaker B:

Right.

Speaker A:

Yeah.

Speaker B:

And I have the same concerns, right.

Speaker B:

I.

Speaker B:

It's probably why I'm getting notifications to take counseling.

Speaker B:

Right.

Speaker B:

They're probably like, man, that guy's really loud inside the house.

Speaker B:

We should give him ads.

Speaker A:

Are you okay?

Speaker B:

Right?

Speaker A:

Well, yeah, no, there's this nanny state that comes along with this.

Speaker A:

And, by the way, there is a place for 988.

Speaker A:

There's a huge place for 988.

Speaker A:

And the suicide hotlines and things like that.

Speaker A:

But then the question of, we've sensed that you might be having a bad day.

Speaker A:

Are you okay?

Speaker A:

Automatically coming at you.

Speaker A:

And I think there's pros and cons to that, but, like, just does not make me want to use this stuff.

Speaker B:

Right.

Speaker B:

It's like, I'm always stressed.

Speaker A:

Leave me alone.

Speaker A:

No, I, So yes, I want to commit suicide on you, the camera.

Speaker B:

But going back to the whole ring situation, though, right?

Speaker B:

Like, ring is not the first.

Speaker B:

Right.

Speaker A:

No.

Speaker B:

Very surprising to put that commercial out there and to think that everybody was a dog owner and going to be, like, greatest thing ever.

Speaker B:

Now I can.

Speaker A:

Even dog owners thought the dystopian next step of it was there.

Speaker B:

Like, to me, like, from a business case, what was it?

Speaker B:

Yeah, we're gonna get buy in.

Speaker B:

Or was it like, we're gonna use this as a way to let everybody know because we see some ramifications coming?

Speaker B:

Because what's about to come out on the Guthrie case.

Speaker B:

Yes.

Speaker B:

We have all your data, but we're doing it for the dogs.

Speaker A:

Yeah, but that video was vetted. It was out and vetted. I saw it a week before and made the same comment. It was independent of the other issue.

Speaker B:

But yeah, more conspiracy theories, though.

Speaker B:

Dan, I am here for it.

Speaker C:

I'm not sure we're the customer on this one.

Speaker C:

I think in this case, and this is kind of going back to the last episode where we talked about that, the vast majority of people are willing to trade off privacy for functionality.

Speaker A:

Sure.

Speaker A:

Well, the same thing with Mark Zuckerberg, who was in court this week talking about social media.

Speaker A:

People won't stop using social media despite its very clear, you know, impact.

Speaker A:

And some of the people were stopped at the door because they were wearing those idiotic Meta Facebook Zuck camera eyeglasses.

Speaker A:

And he made mention in there about the intent to put facial recognition on those and have them tell you who someone is as you run across them.

Speaker A:

If they're in your network, or if they have a publicly posted Instagram account, they will show you who they are.

Speaker A:

And I've heard multiple people go, oh, yeah, I definitely want that.

Speaker A:

And would I love it from a utility perspective?

Speaker A:

You bet.

Speaker B:

Absolutely.

Speaker B:

But you've seen the videos of the guy who wears them, right?

Speaker B:

And just walks up, like someone passes and it's like, Eric, right?

Speaker B:

Yeah, yeah.

Speaker B:

Like I.

Speaker A:

But I take greater joy in having that recognition happen in my own head.

Speaker B:

But they're like, you work for, like, name the company, and it's pulling up all their information.

Speaker B:

Right.

Speaker B:

And he's doing it, and then when he explains it, they're like, whoa.

Speaker B:

And you have to imagine people see that, like, dude, that'd be cool to be able to do that right now.

Speaker B:

So not only was he in court though, right.

Speaker B:

And what was he in court for?

Speaker B:

The impact of social media, but specifically because there's a requirement, when you fill out your profile, anyone under the age of, is it 13, Dan?

Speaker A:

In the US it's 13.

Speaker A:

In other countries, it's other things, but.

Speaker A:

Right.

Speaker A:

COPPA in the US applies at 13.

Speaker B:

Okay, so let's just say it's 13.

Speaker B:

Right.

Speaker B:

They know the number of profiles that are filled out that have nothing in it.

Speaker B:

Right.

Speaker B:

Not known.

Speaker B:

Right.

Speaker B:

Well, that's a red flag.

Speaker B:

And the data showed that they absolutely knew they had a ton of underage kids using the platform and program.

Speaker B:

And because they did studies.

Speaker B:

They also knew that this impacted young kids.

Speaker B:

So it's.

Speaker B:

Okay.

Speaker B:

So you knew it impacted young kids.

Speaker B:

You knew you had young kids and you knew you had all these profiles.

Speaker B:

Without this, what did you guys do?

Speaker B:

Like, sold them more ads.

Speaker B:

Yeah.

Speaker B:

Right.

Speaker B:

So now you're talking the cameras.

Speaker B:

Right.

Speaker B:

Like, of course.

Speaker B:

Why wouldn't you allow the camera system into the judicial hearing or the.

Speaker A:

Where cameras are forbidden.

Speaker B:

Yeah, because cameras are forbidden.

Speaker B:

Right.

Speaker B:

But if someone's walking around with them capturing your data.

Speaker B:

But let's use the idea that they're capturing someone under the age of 13.

Speaker B:

Oh, does the camera automatically say, oh, sorry, that person's under the age of 13?

Speaker B:

Right.

Speaker A:

Well, in theory, they wouldn't know that unless they

Speaker B:

had information on the person to know that they were under the age of 13.

Speaker B:

Right.

Speaker A:

Like, and which of that kid's 15 fake Instagram profiles, which, this just in, for those of you with younger kids: they make them, and they lie about their age. Which of those do you go by?

Speaker A:

Which is the real version of them?

Speaker A:

There's all sorts of craziness in this.

Speaker A:

But.

Speaker A:

But the idea that people would actually buy this mishegas.

Speaker B:

Yeah.

Speaker C:

So, Eric, it is interesting.

Speaker C:

I mean when you go down that rabbit hole on these, all of these different platforms know this crap is going on and they don't do anything about it.

Speaker C:

Like, I think the same thing about Venmo.

Speaker C:

I think it's garbage. We got scammed on something at one point, and as I started looking, since everything's public, right.

Speaker C:

So you start looking at the transactions like holy.

Speaker C:

It was super apparent.

Speaker C:

Can I say that?

Speaker C:

Super apparent right away that you could see the pyramid of people passing money to the holders that were running all of the different scams and everything.

Speaker C:

Venmo doesn't care.

Speaker C:

They won't do anything about it.

Speaker A:

The Venmo public thing, by the way. I had Venmo for about 10 minutes and then quickly killed it because of this issue.

Speaker A:

There is a way to turn off public transactions.

Speaker A:

So if people are dumb enough to keep their transactions public, they should get what's coming to them in terms of that public view.

Speaker A:

I will post it.

Speaker A:

There was a great article, many years ago now, in which they tracked drug deals through there.

Speaker A:

The journalist found a whole set of drug deals, an affair, and a bunch of other things through the Venmo notes field and comments.

Speaker A:

It's hilarious.

Speaker B:

Like, again, you created a social media platform to send money, right?

Speaker B:

And people want everyone to know: yeah, I just paid my buddy Eric for a great night out for wine.

Speaker B:

How do I know that?

Speaker B:

Two dude smiley faces, two wines, one puke emoji.

Speaker B:

SL money, right?

Speaker B:

It's like, huh, I bet they had a great night out drinking a bunch of wine, right?

Speaker B:

Like, who needs to know this, right?

Speaker B:

And how much money you spent on doing it?

Speaker B:

Which is exactly what that is, right?

Speaker B:

Like, again, the social media aspect of wanting people to know everything you're doing, right?

Speaker B:

Like, it's almost as if Ring was like, well, this is kind of socially acceptable.

Speaker B:

You put all your pictures up anyways inside all these other platforms.

Speaker B:

All we're doing is trying to save the world, Eric.

Speaker B:

We're just trying to make our neighborhoods safer.

Speaker C:

I'll take that debate, though, because just because something's built doesn't mean people actually have to use it.

Speaker C:

I blame people on this one.

Speaker C:

The fact that people feel like they have to broadcast to everybody: this is what I made for dinner.

Speaker C:

This is what I'm doing right now.

Speaker C:

Look at my family.

Speaker C:

Where did we ever get to that point that we felt like we had to brag about everything online?

Speaker B:

And I'm the person who's extremely judgmental.

Speaker B:

When Kelly shows me somebody's dinner that got a bunch of likes, and I'm like, who are the people liking that?

Speaker B:

Right.

Speaker B:

There's no grated parm over the top of that.

Speaker B:

They're not using a great finishing olive oil.

Speaker B:

You can tell it was heated.

Speaker B:

The flavor profile's gone.

Speaker B:

Now, I'm taking that way over the top.

Speaker B:

I will agree.

Speaker B:

Yes.

Speaker B:

The people problem, right?

Speaker B:

But from an education standpoint, think about the age we were when MySpace came out, then Facebook, etc.

Speaker B:

And you liked using it, like, wow, what a way to stay connected with people that I haven't seen in years from college or high school, etc.

Speaker B:

But the age that we started using it was basically in our mid-20s, right?

Speaker B:

You know, late 20s.

Speaker B:

That user demographic and profile.

Speaker C:

Well, I was younger.

Speaker C:

Younger for me, since you guys are so much older.

Speaker A:

They were baked in.

Speaker A:

Yeah.

Speaker A:

Eric was born to a Facebook account.

Speaker C:

MySpace.

Speaker C:

My first friend was Tom.

Speaker C:

Thank you very much, Tom.

Speaker B:

Love Tom.

Speaker A:

It actually rings really true around generational studies.

Speaker A:

You know, I know I have to check my Gen X mindset at the door for a lot of this stuff.

Speaker A:

The generations that follow grew up with very different things, things that are interesting to them and that they will do and won't do.

Speaker A:

And I know this comes into play in the workplace.

Speaker A:

It comes to play in the personal life and in the use of technology as well.

Speaker A:

And it shifts from millennials, it shifts to Gen Y, it shifts to, you know, Gen Z more and more and more.

Speaker A:

But interestingly, Gen Z seem to be more closed, because I think they're seeing the mistakes of the millennials.

Speaker C:

And that's definitely a snapback.

Speaker A:

Yeah, yeah, yeah.

Speaker B:

I'd like to take the debate back to Eric here, though.

Speaker B:

So basically what you're saying, guns don't kill people, people kill people.

Speaker B:

And I'm using that distinction because my

Speaker C:

pencil doesn't spell words.

Speaker C:

I misspell words.

Speaker A:

No, it's Mr. Larson in Happy Gilmore that kills people.

Speaker C:

Now, is there an element here of the.

Speaker C:

Oh, my gosh, I forget.

Speaker C:

We had a neighbor who was a lawyer, and we got on this topic of a playset in the yard and a child coming over and getting hurt.

Speaker C:

And naturally my wife and I were like, oh, yeah, well, they're trespassing.

Speaker C:

Well, no, because it's an enticing thing that actually got them to come there.

Speaker C:

And so using that, you could make the argument about social media.

Speaker C:

Right, because we know that there is an addiction level that comes from the whole, hey, you haven't logged on in a while.

Speaker C:

I'm just going to drop you a little something to get you to reengage.

Speaker C:

So no doubt.

Speaker C:

I'm not arguing there shouldn't be guardrails around this, but the evidence has been there for so long now, the studies have come out, the impact it's having on society is generally known, and people still continue to go to it.

Speaker A:

This sounds so much like the debate on cigarettes before you were born, Eric.

Speaker A:

This sounds so much like the debate on cigarettes.

Speaker C:

Oh, yeah, yeah.

Speaker B:

This is where I'm going with it.

Speaker B:

Like the book Gunfight, it was a great book because, backing up to when laws were first put in place around guns, the volume of sales wasn't there.

Speaker B:

And this isn't, you know, politics.

Speaker B:

Red, blue, Republican, Democrat, because it shifted sides back and forth over the last 30 years.

Speaker B:

But what became very apparent was the volume that you could sell.

Speaker B:

And if you use fear as a reason, like, hey, they're going to take the guns away.

Speaker B:

Now's the time.

Speaker B:

And creating all different levels of guns that people could go buy.

Speaker B:

And people were like, yeah, we need to go do this now because we can have them and I have the right to have it right now.

Speaker B:

Really interesting read.

Speaker B:

But there's always that philosophical debate of do guns kill people or do people kill people?

Speaker B:

And when you said, you know, like, but this is a people issue, right?

Speaker B:

Like you're choosing to put all that out there, right?

Speaker B:

Like it's your choice.

Speaker B:

You did it.

Speaker B:

Right.

Speaker B:

Like, but there is, I will say that candy piece, right.

Speaker B:

Of getting people familiar with it, right.

Speaker B:

There's so much over here in the T's and C's.

Speaker B:

We know they're never going to read all of it, but this is what we want: we know what we want, and we want to be able to build a profile on you from all your metadata.

Speaker B:

We're not publicly saying that, but if I can get you hooked on the platform for six years, I basically know you better.

Speaker A:

You know, you want the social media

Speaker B:

so the.

Speaker C:

And it's tough when you totally over here.

Speaker A:

Yeah, exactly.

Speaker B:

So, yeah, I'm in your camp and I'm not in your camp.

Speaker B:

For me, it's the whole idea of, yeah, it'd be great to have the safest neighborhood ever, right?

Speaker B:

Like, why do people move into different neighborhoods to get to different school districts?

Speaker B:

Because that school is quite frankly a better school.

Speaker B:

Look at all the ratings, right.

Speaker B:

Etc.

Speaker B:

People do that same thing with private school, right?

Speaker B:

Like, yes, I would want my neighborhood safer, but at what cost?

Speaker C:

Right?

Speaker C:

Yeah, well, I look at this debate and I think it's an issue of accountability, right? We continue to point at the social media platforms, and I'm not saying they shouldn't be held accountable; we're talking about this in the context of children that are signing up.

Speaker C:

But at the same time, Apple and Android have both had the ability, on kids' phones, to control what apps are allowed to be on there in the first place.

Speaker C:

Where's the accountability on flipping parents, on being parents rather than just trying to be a friend or not being present?

Speaker C:

We can't solely just look at the company that's created this.

Speaker C:

Granted, we could go down the rabbit hole of, we have to keep earning more and more money.

Speaker C:

So we're building all these features, which means that we have to create this digital drug that's continuing to draw people in.

Speaker C:

But there is a lack of accountability in other areas that feeds into this problem.

Speaker C:

Now, in the case of, you know, as we're talking about the cameras.

Speaker C:

I don't know, for me there's a creep factor that's.

Speaker B:

Yeah.

Speaker C:

In there that I'm always being watched.

Speaker C:

And I feel like we use the case of, oh, well, look, because of this we have the video of somebody showing up to the house and blacking out the camera.

Speaker C:

It's an ends-justify-the-means argument, and it won't be very long before we see the reverse of this, where it's used in a nefarious way, and we go, oh, no, that shouldn't be done.

Speaker C:

And it's all of these kinds of myopic looks at these cases, rather than society standing up and going, no, we're tired of this.

Speaker C:

Stop monitoring us everywhere we go.

Speaker B:

There is a naivete in not thinking about all the, when I say, craziness that's out there in the world.

Speaker B:

Right.

Speaker B:

What the drug enforcement agencies are doing, right.

Speaker B:

The FBI, like all the different crime and fraud and everything that's being conducted.

Speaker B:

Right.

Speaker B:

And there is a whole article I just read on the different surveillance techniques across Arizona and New Mexico.

Speaker B:

Right.

Speaker B:

To try to track.

Speaker B:

Right.

Speaker B:

The way drugs are brought in.

Speaker B:

Right.

Speaker B:

And yeah, you could stop one shipment, but it's almost better to be able to get that traceability.

Speaker B:

Right.

Speaker B:

And continuously watch to see the progression and the spread.

Speaker B:

So we understand the model and the distribution, and then try to take it all down at once.

Speaker B:

Right.

Speaker B:

But then it just kind of builds itself back up again and the next bad person shows up. And it's the same thing we talk about in cybersecurity, like, why does it exist?

Speaker B:

Right.

Speaker B:

Like, well, wherever money's transacted.

Speaker B:

So eradicate crime 100 percent and we're good to go.

Speaker B:

And if you can eradicate war, then there's no command and control and it all comes back to humanity.

Speaker B:

Right?

Speaker B:

And the idea of the church, right.

Speaker B:

Instilling morals in people, but just going to church doesn't make you moral, right?

Speaker B:

Just going to school doesn't make you highly educated.

Speaker B:

There's still a fundamental, you know, responsibility as the parent, right.

Speaker B:

To help instill those morals, to help with reading and writing, to help educate on the technology that's being used.

Speaker B:

But if they lack the education, if they lack the morals, it makes it that much harder to instill those, which then brings us back to the technology that people use today.

Speaker B:

If they lack the understanding of even the basics of the technology or how it's monetized, then they have no idea.

Speaker B:

Right.

Speaker B:

And I guarantee, though, that most of those big SaaS companies are betting on that.

Speaker B:

Most of the people, A, have never read the T's and C's.

Speaker B:

I'm one of them.

Speaker B:

I do not have the time to read all of them.

Speaker B:

But with AI, buzzword alert, I can now plug some of that stuff in and be like, hey, let me know what I need to know.

Speaker A:

The AI written by the same people who are trying to get you to accept the terms and conditions.

Speaker B:

Yeah, they're like, be careful of Brian.

Speaker B:

He's really reading the T's and C's.

Speaker B:

Should we tell him?

Speaker A:

No, it's Zuckerberg who pays OpenAI, so that when you search that in OpenAI, it says, yeah, Facebook is a great company with no privacy issues.

Speaker B:

That goes into an even bigger.

Speaker B:

Oh, sorry, Dan.

Speaker B:

I was gonna say, a bigger scenario is when Elon came out about X and said some of the stuff he did, like:

Speaker B:

Well, we can fill in the gaps of some of the history lessons that aren't there and we can rewrite some of the other history lessons of what's actually appropriate.

Speaker B:

It's like, not really.

Speaker B:

Right.

Speaker B:

You really can't.

Speaker B:

Like, people need to be able to make their own decisions on what they read and have their own feelings on what that history tells them.

Speaker B:

Right.

Speaker B:

Rather than trying to demand that it write things the way you want them written.

Speaker B:

And I'm not saying that AI is just a brilliant search engine.

Speaker C:

It is.

Speaker B:

It is.

Speaker B:

But if that search engine is only going to spit back out the historical data that you think is appropriate, you've now created a bunch of convoluted robots.

Speaker B:

And the humans that are doing it.

Speaker A:

You bring up a good point, Brian.

Speaker A:

Two good points.

Speaker A:

Lots of good points, but two good points.

Speaker A:

One, around parents.

Speaker A:

Parents really do continue to hold the cards.

Speaker A:

Parental controls are there, but only if you use them.

Speaker A:

But combine that with, and again, American parents have ceded a lot of this to the technology, letting it be the babysitter.

Speaker A:

I mean, it's a TV.

Speaker A:

If the TV were interactive, this would have happened 60 years ago.

Speaker A:

It's the same problem just continued, but now in a way that's a lot more manipulatable.

Speaker A:

Maybe this is my new business venture: I will come into the house and scare your kids shitless about the data that goes out when they use these things.

Speaker A:

Still to this day, my son breaks out privacy policies and reads them with his friends before he'll let them sign up for technologies.

Speaker A:

I think that's great.

Speaker A:

Does it scale?

Speaker A:

Maybe not, but I think that's the kind of thing that parents should do.

Speaker A:

Maybe not to the same level, but parents should be informing their kids about the implications of their actions, and parents should be parents.

Speaker A:

The other thing you said, Brian, was around tying this back to information security and the idea of risk.

Speaker A:

The idea of taking some of the usage of this stuff along the risk model.

Speaker A:

And in infosec, we have a lot more control.

Speaker A:

We open up certain things, we allow for certain things, we don't allow for certain things, we recommend certain things, but we have control over the whole system.

Speaker A:

And that's the thing that doesn't seem to exist in these technologies.

Speaker A:

It happens that the data is well secured.

Speaker A:

So, you know, I'm not trying to say that the data isn't well secured, because it usually is.

Speaker A:

They got phenomenal security teams.

Speaker A:

But how it's used, how it's used changes.

Speaker A:

So when you make an assessment to start, it isn't necessarily how it's going to go.

Speaker A:

This applies to companies as well that use technology.

Speaker A:

How it starts isn't how it is five years from now.

Speaker A:

And if you're not re-examining the data usage, the data flows, the third-party partners.

Speaker A:

I actually, just before this, sent you guys a thing on LinkedIn's verification.

Speaker A:

Just today, somebody posted an analysis of all the places that LinkedIn, and LinkedIn is Microsoft, let's not forget, shares data back to when you do their verify, including OpenAI, just this massive list of places.

Speaker A:

And that includes your facial recognition.

Speaker A:

It includes passport information.

Speaker A:

It includes, you know, the size of each ass cheek.

Speaker A:

I mean there's.

Speaker A:

Everything is in this.

Speaker A:

So I have not done it.

Speaker A:

I choose not to do it.

Speaker A:

I try to keep my face private, you know, yet I go on YouTube as a podcaster. But I tried.

Speaker B:

Can you change the size of your butt cheeks by squeezing them together when it takes the photo for verification?

Speaker A:

Yes.

Speaker A:

Maybe if you're a security person they know because you're always clenched.

Speaker A:

But I mean, we don't have the control.

Speaker C:

A lump of coal, two weeks, and we'd have a diamond.

Speaker A:

That's right.

Speaker C:

Something along those lines.

Speaker A:

I never deny that.

Speaker A:

I mean, we have control over the system internally.

Speaker A:

And that's one of the things where there's a tie, but we can take some lessons learned.

Speaker A:

Like, you come off the edge, off the zero, the locked-in-a-closet, turned-off, no-data, no-connection, no-power extreme, and come do something useful, because there is some utility. But you still have to make the decision about what the trade-offs are.

Speaker A:

But you can't make that decision unless you're informed about what the trade-offs actually are.

Speaker A:

And that transparency is not there. Well, it is there, but it is hidden, it is obfuscated.

Speaker A:

That transparency, by and large, in modern consumer systems, especially when it comes to the cameras, to social media, and all these other scary places we've talked about, it is not clear.

Speaker C:

Maybe that's almost what we need.

Speaker C:

As much as it pains me to say that we need another government agency, it's almost like an SEC for regular people, right?

Speaker C:

Because you think about the SEC and the disclosures you got to have at a public company.

Speaker C:

It's all under the guise of transparency for investors, right?

Speaker C:

What about transparency for the normal person, or, in Dan's terms, his average friends that were watching the Super Bowl? Maybe that's where this needs to start heading: hey, these are the rules of play.

Speaker C:

This is the level of transparency that you have to provide.

Speaker C:

That it's not hidden in a 100-page document that nobody understands because it's all legalese.

Speaker C:

No, it's bolded out in normal language that people can understand.

Speaker C:

These are the trade offs.

Speaker C:

Here's what I'm giving you, here's what I'm taking from you.

Speaker B:

I think maybe I should save this for a private conversation, but imagine somebody came out with a SaaS application, free to use at first, that literally, if you were to download Facebook or download LinkedIn, would tell you 100% what's in the T's and C's: here's what you need to be aware of.

Speaker B:

Oh, and by the way, here's how you can turn off this, this and this and we can go ahead and do that for you.

Speaker B:

Can you give us a connection in.

Speaker B:

Right.

Speaker B:

And it went ahead and did those things, because the trouble people have is they're not going to read all that documentation.

Speaker B:

Then when someone says, well you can turn that off.

Speaker B:

It's like where, where do I do that?

Speaker B:

How do I go in?

Speaker B:

Like, the only thing I know how to do is post the photo, right?

Speaker B:

That's where people struggle, because everyone's different, meaning every person, but also every application's different.
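
The "tell you what's in the T's and C's" tool described above can be sketched as a simple keyword triage over a policy's text. Everything here, the red-flag phrase list, the plain-language warnings, and the sample clause, is an illustrative assumption for the sketch, not a real product and not any platform's actual terms:

```python
# Toy sketch of automated T&C triage: scan a terms-of-service text for
# red-flag phrases and report plain-language warnings. The phrases and
# warnings below are illustrative assumptions, not legal advice.
RED_FLAGS = {
    "third parties": "your data may be shared outside the company",
    "affiliates": "data can flow to related companies",
    "facial recognition": "biometric data may be collected",
    "retain": "data may be kept after you delete your account",
    "advertising partners": "your activity may be used for ad targeting",
}

def triage_terms(text: str) -> list[str]:
    """Return a warning for each red-flag phrase found in the text."""
    lowered = text.lower()  # case-insensitive matching
    return [f"Mentions '{phrase}': {warning}"
            for phrase, warning in RED_FLAGS.items()
            if phrase in lowered]

# Hypothetical sample clause, standing in for a real policy document.
sample = ("We may share information with third parties and advertising "
          "partners, and retain content after account deletion.")

for warning in triage_terms(sample):
    print(warning)
```

A real version would need much more than substring matching (legal clauses are written to evade simple keywords), but it shows the shape of the idea: turn a wall of legalese into a short, bolded list of trade-offs.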

Speaker C:

But if you just post on Facebook that you do not consent to Facebook's right to do whatever, it must be binding.

Speaker A:

Well, all I know is, recently I signed up for Instagram on an isolated phone with a phone number not registered to me, for this network, for Distilling Security.

Speaker A:

And I have never been so confused by a product's security settings, or settings in general.

Speaker A:

It is endless, endless, endless.

Speaker A:

And I'm a relatively informed and smart guy.

Speaker A:

Not particularly, just relatively, surrounded by average people.

Speaker B:

Eric, he said normal people.

Speaker A:

No, I said average first and then I said normal.

Speaker A:

The question is what is the Venn diagram of those two areas?

Speaker A:

And this thing is ridiculous.

Speaker A:

I don't know how somebody finds it unless you go looking for it. And then it just comes down to, like, oversight.

Speaker A:

Yes, I am a Democrat.

Speaker A:

Yes, I believe in oversight.

Speaker A:

Yes, I think more regulation is the only way some of this stuff is going to get solved.

Speaker A:

I think that regulations are probably what it's going to take.

Speaker A:

But God help us in this country getting any kind of anything passed is

Speaker B:

hard or consumer blowback.

Speaker A:

Yeah, I mean, the money, you follow the wallet.

Speaker A:

But I think there's a lot of people already so embedded in some of this stuff.

Speaker A:

It's going to be hard.

Speaker A:

But again, to bring it back around as we finish up here, back to the Ring doorbell.

Speaker A:

The fact that there was such hard blowback from a wide variety of populations is, I guess, encouraging to me.

Speaker A:

And on that note, we're out of time, guys.

Speaker A:

Thanks for joining us, listeners.

Speaker A:

Thank you, Eric.

Speaker A:

Thank you, Brian.

Speaker A:

And thanks to you, the listener.

Speaker A:

We love having you here.

Speaker A:

You can find us on our website, distillingsecurity.com. All the episodes of Great Security Debate are available, and we are adding back Mentor Core, which will be starting up a new season soon, and some of our new shows as well.

Speaker A:

You can find us on YouTube at youtube.com/@greatsecuritydebate.

Speaker A:

You can email us at securitydebate@distillingsecurity.com, or you can just shout into the void and I'll probably hear it.

Speaker A:

Thanks a lot and we'll see you again on the next Great Security Debate.

