The modern home is continually getting more connected. But as much as we love our virtual assistants, smart thermostats and cloud-enabled security cameras, are we really aware of the risks they invite into our homes? And how can we enjoy the latest digital technologies without compromising security and privacy? F-Secure’s Tom Gaffney joins us for episode 27 of Cyber Security Sauna to discuss why and how IoT makes us vulnerable, how we can protect ourselves, and what IoT device makers should be doing.

Listen, or read on for the transcript. And don’t forget to subscribe, rate and review!


Janne: Welcome, Tom. Why don’t you tell us what you do at F-Secure?

Tom: I work in the consumer side of our business. My main job is leading the presales teams, so engineering into operators. So a lot of that is articulating what the security threats are, how do we deliver security to them. And a lot is listening to the problems that they get from their customers and from their networks.

Fair enough. So what has the emergence of smart home devices meant for the infosec industry?

In general, it’s just exposed a bunch of new challenges in terms of Mikko Hypponen’s old line that everything that’s smart is vulnerable. It’s absolutely true. So we’re just seeing a ton of new devices coming into customers’ homes. It depends which of your favorite analysts you listen to. I think Gartner says 75 billion devices by the year 2025. But the upshot is there are more of these smart home things in the world than there are human beings.

So why is the security in IoT such a tire fire?

Tire fire is a good expression. It comes from a few different areas. The main one, really, is that the barrier to entry for making these devices is lower than it’s ever been.

So anybody can make them.

Relatively speaking, you don’t need huge amounts of skills. The computing’s really cheap. Connectivity’s really cheap, so you can Wi-Fi anything you want. And silicon’s getting more powerful and again, cheaper depending who you beat up for price. And then on top of that, you’ve got a lot of cross scripting capability, so loads of people can just create stuff in Python these days. It’s relatively trivial compared to how it used to be. You know, if you wanted to make a consumer good way back when, you’d have to have different divisions that go into engineering. You know, unless you’re a deep dive innovator yourself, you don’t need to be there anymore.

Right. So you think it’s the whole sort of cottage industry-ness of it all?

Yeah. I mean, I can go into a local shop – in the UK we have an outfit called Maplin. I can go in there and buy some temperature sensors, put them in my fridge, connect them with a board to my Wi-Fi, and I’ve got something that tells me if the fridge is suddenly defrosting when it shouldn’t.

Does that mean that there are people making IoT devices that maybe shouldn’t be making devices at all?

Yeah. The short answer is yes. I mean, I’ll tell you the other thing that we see, is you’ve got really well made goods. And as soon as you get a really well made good, you get somebody who makes a cheaper version of it. And one of the big compromises there is security. So they’ll cut costs to get the product out as quick as they can, and they’ll pare back on quality and they’ll pare back on security. Cause that’s not really a main consideration.

So what kind of vulnerabilities are we seeing in these devices?

A lot of the vulnerabilities are really basic things. So, our labs did a study, came out at the end of last year or so. A third of the vulnerabilities are open ports and another third of the vulnerabilities are default passwords. So your standard admin one, two, three, four, and all your other favorite passwords.

Yeah. Hard coded into the device so you can’t even change it.

Well, some of them are like that. Some of them have no password. Actually that’s also an option. You don’t need special tools to find this stuff. You just need a web browser.
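To make concrete how low the bar is here, this is a minimal sketch in Python of the kind of default-credential check an attacker’s scanner (or a defender’s audit) performs. The credential list is purely illustrative, not drawn from any specific malware or device:

```python
# Illustrative factory-default credential pairs of the kind commonly found
# on consumer IoT devices. A real scanner would try each pair against an
# exposed login prompt; here we just model the lookup itself.
COMMON_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "1234"),
    ("root", "root"),
    ("user", "user"),
    ("admin", ""),  # no password at all, as Tom notes, is also an option
}

def is_default_credential(username: str, password: str) -> bool:
    """Return True if this username/password pair is a well-known default."""
    return (username, password) in COMMON_DEFAULTS
```

The point of the sketch is how little sophistication is required: a simple lookup over a handful of widely published pairs is enough to cover the “default password” third of the vulnerabilities mentioned above.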

And once you bring these devices into your home, what’s the risk you’re exposing yourself to?

You can look at the exposure in different ways. There’s the vulnerability it brings into your network – let’s say a device that harvests your Wi-Fi credentials, which raises the chances of an attacker then moving laterally to other devices on the network. So there’s the kind of standard security flaws like that. There’s the information that device might hold about you that it shares out. That can be everything from, again, credentials, and we all know people reuse credentials. So if you’ve got the same username and password everywhere and somebody can get the username and password for one device, then they might go and try your eBay account or your Amazon account. That’s a common one. And I’m very interested also in the impact on privacy of too many of these devices. Not just from a kind of an adult perspective but also children’s privacy.

What kind of concerns do you have?

Well, to give a tangible for instance, there’s an Amazon Echo Dot Kids Edition coming out. And there are various groups that have looked at this, like really looked at it hard and studied it. There’s a group called the Campaign for a Commercial-Free Childhood in the US. And they actually got a bunch of privacy experts and legal counsel to try and study the terms and conditions around it. And what they concluded after a few months is, “Oh, we don’t know what’s happening with this data.” Just no idea whatsoever. There’s no distinction between it being a child’s data versus an adult’s data. It’s marketed specifically for children, but it behaves in the same way as other Amazon devices – Alexa, for example. So it’s harvesting that data. What it’s going to be used for, you have no idea, because the terms and conditions are so opaque.

Yeah. One of my personal favorite nightmares was the smart teddy bear a couple of years ago that would talk back to kids, and somebody was able to hack the remote access to the teddy bear and make it say horrible things to these kids.

Would that be the Cloud Pets, do you recall?

I think so, yeah.

Yeah, Cloud Pets. That was the Australian thing. So Cloud Pets was exactly as you say. I think you touch the left paw for the kid to record a message and the right paw to sort of download messages. And the parent would have an application. A kid could touch the bear and send a message to the parent and vice versa.

Oh, okay.

And that had just a ton of horrible things going on in it. The application itself had some vulnerabilities in it. And its cloud backend did get hacked, actually. I can’t recall the exact numbers, but about three quarters of a million people’s credentials and also voice conversations were exposed. And then the actual toy itself was also vulnerable in a separate hack, which some guys played around with and found that you could expose the Bluetooth LE connection. And you could take control and use a device to upload and download recordings from it.

That’s not what you want.

No, that’s not, it’s crazy.

So let’s talk about these attacks a little bit. What else do we remember we’ve seen?

If you want to talk specifics, I mean, it’s literally a case of you name it, something out there has had a good hack on it. I think my favorite, it’s not a recent one, is the thermostat on a fish tank which led to a casino in the US being hacked. That’s just an absolutely amazing story. We have everything from the Mattel Hello Barbie, with its backend that was exposed. Again, you know, the idea being Mattel asked their customers – asked the children – “What is it that you really want to do with Barbie?” And it’s like, “Well, I’d like to have a conversation with her.” So, you know, they put a chatbot and a mic into the doll, and also Wi-Fi to connect it to the network so kids could have a to and fro. And the chatbot’s learning the behaviors, it’s going off to a cloud backend, but then that backend gets hacked. And again, you’ve got the parents’ credentials.

So really we could – and F-Secure has done a good job in mapping some of these things. So a colleague, Mark Barnes at MWR did the Alexa basic hack from last year, which has since been fixed. And a lot of these things do get fixed, as they do when you’re talking to responsible companies. It’s just the other things.

You mentioned the fish tank in a casino, and I guess Target’s HVAC system is another good example. These devices, not only are they vulnerable in themselves, but they’re network devices with a foot in both the internet and your internal network, so they’re also gateways into your network.

Yeah, that’s absolutely true. So hackers generally are at this point looking to own the compute power on the devices. They want to make them part of some massive botnet, either to run distributed denial-of-service attacks or to do bitcoin mining. But the next generation of that, and we’re starting to see this happening with the mutation of the different kinds of vulnerabilities that are out there, is how can you do something more personal? How can you get onto somebody’s network? So we’ve seen that more in a business context.

Me personally, I’m looking more at the consumer side of it, but there’s nothing to stop you, once you’ve infected one device, then getting into other devices within the network. Another good example of this would be the iKettle, which is by default programmed to connect to the strongest Wi-Fi network it can find. So if you can spoof a Wi-Fi network, then it will connect to you and also share the credentials for the other networks it’s connected to.

Wait, we’re talking about a device made for boiling water.

Yeah, that’s right.

Why would you want that connected to the internet?

Well, it’s a big job to get out of bed, walk downstairs and turn the kettle on, clearly.

I see. So, okay, let’s talk about –

That’s not my favorite. My favorite IoT object – and you can look for your own internet of sh*t object, you can Google that term – is the umbrella that gives you an alert when it’s raining.

(Laughing)

So, you can’t make this stuff up. (Laughing) Janne has to take his glasses off to laugh. I mean, I live in the UK. The answer is, it’s raining all the time.

Hence umbrellas. So, you mentioned botnets as a threat to these devices. Are we seeing a lot of malware that is aimed at taking control of IoT devices?

Broadly, that’s what most of it’s trying to do at this stage.

So these are automated attacks, not hackers going specifically after your device.

No, but I mean it’s interesting because that is absolutely theoretically possible. It’s just that in the consumer world, we’re not that interesting. Yet. That’s different if you’re a high net worth target, you know, if you’re a rich celebrity or whatever, then that’s definitely a vulnerability that I would worry about. But I’m not that guy so I don’t have to worry about it. But from a hacker’s point of view, yeah. They want the computing power.

Yeah. Okay. Who’s behind these attacks? Are we talking like criminals, or nation states, or hacktivists?

We see a mix of those. So I mean, Mirai, which I won’t go on about because this audience will know it too well, but the most interesting thing about Mirai is that, you could say it was criminals, but it wasn’t really. It was kids trying to make a bit of money off of Minecraft users. Yes, that is criminal behavior, but it wasn’t what we’d consider organized crime at the time. So that is definitely it. But then Mirai was the game changer, because obviously it was so successful. So we know that criminals will then start putting their heavy resources into adapting that code. So Mirai, for example, is more of a framework now rather than a specific virus, because it’s got so many forks with different infection vectors.

And then we have the nation states. So VPNFilter, which I don’t think you’ve covered here, but it was another one from last year. So that affected about half a million devices globally. Chiefly routers, but also some NAS drives and a few other devices. And that was pretty much the work of the Russian state. So we see exactly the same parallels we’ve seen with other forms of crime. You know, wherever there’s money to be made or wherever there is surveillance to be made, then you’ll see those organizations that have the capability and the power moving in that direction.

Right, right. And we live in a world where some nations, I’m not gonna mention North Korea by name, are sort of financing the existence of the state through cyber crime.

Indeed. Hey, you’ve got some governments who see this as an opportunity to help protect their citizens. We have the story with the Japanese government, which, in advance of hosting the 2020 Olympics and with concerns about their cyber infrastructure, wants to hack IoT devices.

Sort of pre-hack them.

Yeah. So the idea being of course that if they hack the devices, they can serve up some kind of warning to the user to say, Hey, you should do something about it. Which poses all sorts of questions. I mean, from a technical point of view, actually it’s probably a good thing.

Yeah, not that I’d maybe encourage that, because patching these devices and hacking them can be tricky.

It can be really tricky. Consumers don’t know how to patch devices. I mean, one of the fundamental problems we have is that the manufacturers often don’t build a cycle for updating the software, the firmware, whenever vulnerabilities are there. And one of the things that we’ve learned over time is that, well, time is the problem. The longer something’s out there, at some point, whether it’s some tin hat from F-Secure who’s digging around inside it or whether it’s a criminal, somebody is going to discover some way of exploiting vulnerabilities in that device. So if your device is out there for three to five years, or 10 to 20 years, which is what it is with a lot of goods – I don’t know about you, but a TV should last three, five, 10 years.

Absolutely.

So you need to have a mechanism to be able to update that somehow.

Yeah. And then you’re only hoping that consumers will go and update their devices. And we’ve seen some vigilantes who break into devices only to install updates on them.

So that’s, yeah, a slightly bigger question of whether or not you should always automatically update and patch. Wow, that’s a tough one, because you are correct. There are times when that is abused. But you’ve got to look at it in a slightly bigger picture, which is if people don’t update things at all, there are more vulnerabilities.

Sure.

So I think there’s a more basic thing, which is just getting the update mechanisms built into these, let’s say dumb smart objects, in the first place. We have another issue with the replacement cycles, which is, you know, increasingly people are hanging on to mobile phones longer. And I know this from talking to the operators; these statistics are out there freely. There’s less innovation in the mobile platforms than there is in the world of IoT. So we’re seeing, and the operators are seeing, consumers starting to spend more money on peripherals – the smart gadgets – than on their mobile phones, which means you’re hanging onto all of these devices longer and longer and longer.

So we’re talking about companies supporting their own products for the foreseeable lifecycle of that product. But that’s not something that gets printed on the tin. Like, if I’m looking at a smart device in the shop, how do I know if that one’s going to get updates or not? How do I even know if it’s secure or not? How do I know anything about it?

It’s really difficult from a consumer’s perspective to figure out. We can maybe come on in a bit to talk about some things that consumers should do. But the other question I think you’re asking there is, what’s the responsibility of the provider to keep updating, and what happens if you don’t do that today? So there are examples where if you don’t update the products, or you don’t buy in to everything about the ecosystem of that product, then the product will stop working. Sonos, to name one, is a kind of classic example. You stop updating your Sonos or you don’t buy into the full Sonos terms, then you can’t use your speakers. Which is kind of crazy.

Sure. But also how do we as a society make sure that these companies are living up to their responsibilities? What should we do to make them sort of sit up and pay attention?

So in the past it’s always been a kind of carrot and stick approach. Lots of governments around the world prefer not to regulate where they can. And they prefer codes of conduct. In the UK for example, we had a code of conduct announced last year, which requires IoT manufacturers to have some kind of basic security. Changing passwords, firmware updates, these kinds of things. Which is, you know, it’s all really good. But it’s not enough.

You know, we’ve been through this before, in the world of normal security – securing devices and getting companies to actually pay attention to security. The seismic change there of course was GDPR, which makes companies do something more than just pay lip service. And you need a similar kind of thing in the world of IoT. We are starting to see some moves in that direction. Not to shout them out again, cause I’m not generally a fan, but the UK government has now announced legislation. We’ve had the state of California and the state of Oregon, who both recently passed laws which allow them to kind of enforce that IoT manufacturers have to do something.

But the trouble with all of these things is that they’re local, they’re piecemeal. If I’m buying something on Alibaba, I don’t care what Oregon thinks about it. It doesn’t matter what laws they’ve got in place, because it’s not something that you can enforce legally. So it has to be a bigger framework than that. If you’re doing it on a state or an interstate basis, it has to be something supranational, you know, which means that it has to go to the standards bodies.

We’re talking laws and regulations here. Does that mean that you’re ready to sort of give up on companies picking up the slack by themselves?

It’s not that I’m giving up, it’s just, first of all, if you put the responsibility onto companies, you’re often really putting the responsibility onto consumers. So consumers, you know, by using the internet today, you’ve got a binary choice of accepting the terms and conditions, yes or no, and you accept them, right?

Absolutely. But I always read the EULA, every time.

(Laughing) And so customers will automatically just do the thing that allows them to use the service. The current state we’re in, in terms of accepting cookies on websites, is just not acceptable, really. And so if we kind of go down the route of just saying to companies, “You’ve got to do something about it,” you’re just by extension gonna have a big set of terms and conditions whenever you turn on your Philips Hue lightbulb. So I do think it needs to come in at a level above that, which means governments and regulation, standards bodies, global things like the IETF and ETSI and all the guys that make the good standards – they need to be enforced. And that is happening. I’m not saying it isn’t, we’re seeing moves pretty much everywhere towards some standardization, but it’s a bit slow.

On the subject of EULAs and privacy policies. The companies providing these devices are also collecting a ton of data about the use of the devices, but also through things like microphones and cameras about the users and the environments where these devices are being used. Where is all this data going?

I haven’t got a clue. I was hoping you’d tell me. (Laughing) So going back to the guys in the US who are looking at protecting children, the Campaign for a Commercial-Free Childhood. Those guys, as I said, put lawyers and experts to try and decode the Amazon terms and conditions, and it is just opaque. Now, Amazon and Google both explicitly state that they do not sell your data. But this is kind of the Facebook response. We’ve been here before. So they may not directly monetize the data in the sense of “I’m selling it to you, Janne, for whatever, 10 bucks per address,” but they absolutely monetize it in other ways by collating and organizing it. And that is a huge concern. And the ecosystems that particularly those two companies are creating around their digital assistants are increasingly compelling manufacturers of IoT devices to share the data from their devices with these companies.

Yeah. And these digital assistants, they’re in our homes listening, well, potentially listening in on private conversations. Do we know what companies are doing with this data and how exactly they’re going to monetize it?

So it’s all about building that bigger picture. I guess you’re obviously familiar with the phrase, “if something’s free, you’re the product.” A better, more elegant phrasing of that would be “surveillance capitalism.” There’s an excellent book of the same title by Shoshana Zuboff, which I recommend to everybody who’s got the time, and she goes on to describe how the collection of this data is leading to a commodification of our reality. So all of the little bits and pieces of data that we have are interesting to somebody for the purposes of testing products on us and then selling products to us. And that’s how it’s monetized. It’s not monetized in a direct “I’m going to pay you for that” way, but if I can test things against you that make my product development and my product manufacturing and lifecycle process leaner and slicker, all that’s worth money to me. And if I know from enough data about you that I’ve tested that product with you and you’re more likely to buy it, then down the line again that’s interesting to other companies as well.

Hmm, yeah. So we’ve had cases where there are TV ads that have these sounds that you won’t hear, but your Alexa will. And so somebody is making sure that they know which ads you’re looking at and which not. I suppose there’s nothing stopping these companies from doing like A/B testing on commercials. Like these 30,000 consumers did see this ad. Did any of them buy it through their Alexa assistant? And these 30,000 didn’t. So what was their purchasing behavior like?

You can even weave a bigger web than, did they just go and buy a product immediately? It’s what was the behavior of that person after that? Absolutely. If I’ve got a Philips Hue, did whatever happened on TV make me stay up longer? Did it make me go to the toilet more often? Because if you’ve got Alexa in your bedroom it knows how often you go to the toilet.

I didn’t think of that.

Well, you also probably want to not have sex too often. (Laughing) Or more often, I don’t know, whatever. Alexa knows all of these things. So where you put your digital assistants in your house is kind of important, if you choose to have them in the first place. You might also think about where you get them from. Now, this is not an Amazon or Google bashing, but those boxes ship everything to the cloud. And that’s a design decision, cause the basic boxes don’t have the compute. So they have to send things to the cloud. But they don’t have to. I mean, there are alternatives, and there’s a company called Snips. I have no involvement in Snips, but it’s interesting technology. They run the voice algorithms locally, and they have some cloud relationship to update the algorithms. But it’s separate from your data and what you’re saying to the device. So I’m quite intrigued by the rush to push things to the cloud when it’s not absolutely necessary. It’s not hugely more cost effective to do that. And you get a lot of privacy benefits if you can do certain functions on the device locally rather than sending everything to the cloud.

Can you give us like a ballpark understanding of how much more computing power do you have to have in that device? Like you’re saying, it’s not a huge cost thing.

Well, it means a slightly bigger chipset and maybe some more memory.

Right. That’s it?

Yeah.

Well, that doesn’t sound too expensive.

Well, again, I guess from Google and Amazon’s point of view, the cost would have been a consideration, but of course they want that data in the cloud. What I’m saying is that you can come at it from another perspective and not send all of that data to the cloud.

Yeah. So the explanation that we need to have this information to provide the service to you, that doesn’t fly.

Well, you can technically argue that yes, listening to more conversations can be used to optimize your bot. There’s nothing wrong with that statement, but back hauling everything to the cloud to do that, that doesn’t quite fly.

So what can people do to keep their digitally enabled homes both secure and private?

It is very difficult for consumers to understand fully what’s going on. But yeah, there are definitely some basic steps we can take. You had Steve Lord on here about a year ago talking about it, and I think he mentioned a couple of really good ones. One is, you can download the application that comes with a device and just have a look at the reviews. That’s always a good sign. I would say that the most basic thing that somebody should do when they get their new internet-enabled trainers – LED trainers, which I have, my kids love them – is change the password on the device. So literally when you get it out of the box, just pause for thought and change the password. That wipes out a third of the vulnerabilities straightaway. It’s the most basic thing that I as a user can do to help protect myself. You can and should also create a separate network on your home network. So you’ve got your Wi-Fi network. If you are a geek, then I suspect you probably already have a guest network for people who come to visit. Then just make a third network that’s for your IoT or your smart devices. And maybe don’t call it smart or IoT, call it something else, but essentially connect all of those devices to that network.
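If you want to check what a device on your own network actually exposes, you don’t need specialist tools either. Here is a rough, standard-library Python sketch of a basic open-port check – the port list is an illustrative sample of services often left open on IoT gear, not an exhaustive one:

```python
import socket

def open_ports(host, ports=(23, 80, 443, 554, 8080), timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection.

    The defaults are an illustrative sample: 23 = telnet, 80/8080 = web
    admin interfaces, 443 = TLS, 554 = RTSP video streaming.
    """
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

# Example (only ever scan devices you own):
# print(open_ports("192.168.1.23"))
```

A telnet or web-admin port answering on a freshly unboxed gadget is exactly the “open ports” third of the vulnerabilities mentioned earlier, and a good prompt to change that default password.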

Okay. So if an attacker gains a foothold in one of those devices, they’re sort of stuck in that network segment.

You can isolate that network. Yes, it’s easier.

All right. Anything else you’d like to give as an advice to our listeners?

So from the privacy perspective, think long and hard about the level of trust you have with companies. A nice simple play is to have a look at the company’s privacy policy. Now I’m not saying go read 70 pages of Google’s terms and conditions or Amazon’s. But – and I’m not doing an ad for them – I’m very interested in the direction Apple are taking at the moment. They are going very clear with a kind of privacy manifesto. They’re making big claims that they won’t share that data, that everything they do is anonymized. That’s a clear indication to me of a company that actually engages with the topic and takes it seriously. So regardless of what you think of Apple products and what have you, they do seem to be differentiating themselves at the moment from others. Now, if I were Google or Amazon or any other, then I would kind of try and take a similar route. But the reality is it’s very difficult to find out what those companies do with your data, because they’re deliberately unclear about it. So if privacy is your thing and you are concerned about it, then do a basic Google search and have a look: what is the company’s overall approach to privacy?

Yeah, we have acknowledged that privacy policies tend to be a little bit long and may be hard to read, and EULAs certainly are. And the tendency, in my view, has been that these have been getting longer and more complicated all the time. Do you think there’s anything that will stop that development? Like, is there a line that we’ll draw and say, “No, nobody can be expected to read this, you have to find another way to inform your customers”?

There’s the push and the pull. The push needs to come from industry. That’s where companies can take the lead. In the past, F-Secure has had a very clear digital manifesto about the usage of customers’ data. And so there’s a benefit. Lots of companies could do that. EULAs are necessarily long, unfortunately, because they have a lot of legal content. But you can distill that to, let’s say, a dozen principles or 10 principles about how you’re gonna use the data, and just be much more open and transparent about that. That’s a clear opportunity. But a lot of companies won’t do that or make the effort, partly because of their whole business model – coming back to the big American guys again, it’s not in their interest to do that. So the pull needs to come from the regulatory bodies, from the governments. And again, it depends where you live. We’re fortunate in Europe, for those of us that are still in Europe, that you have lots of really good regulation around that. And I do think that’s the next stage on top of what we’ve had with GDPR – a bit more compulsion around how the data is used.

All right. Time will tell how that plays out. So what would you like to see companies doing more? Like what does good behavior in IoT devices look like?

That’s a good question. I think the answer depends where you live in that value chain. So, do you make the things, or do you make the networks these things run on? If you make the things, if you make IoT or smart goods, then there’s a long list, but you can distill it really to, I’d say, six things. The first thing is, if your device connects to the internet, you’ve gotta be able to update it. If you can’t update your device, don’t connect it to the internet. Secondly, force a default password change. So no in-built hard-coded passwords. Don’t let the device get taken into use until the customer’s actually gone in and changed it. Have a patch mechanism. So we spoke about replacement cycles earlier. Your device is increasingly going to be out there longer and longer, so you need to have a mechanism for patching it. You need to run bug bounties. There is an army of – literally, an army of informed geeks, not just professionals like people at F-Secure, but independent researchers, who will pick the bones apart in your product and find the vulnerabilities. Too often in our experience, we’ve seen companies not pay attention, not respond. So you need to embrace that community out there, incentivize them to come to you with problems before they go public and it damages your business.

Well that’s the other thing. Bug bounties are an incentive, but they’re also just a channel for whoever stumbles upon a vulnerability to disclose it to you.

They are, but that’s a positive thing, right?

Absolutely.

You should be encouraging that. And we know from the F-Secure perspective, because our guys do dig around and find vulnerabilities, we’ve got companies who would rather put their head in the sand about it and not publicly acknowledge. Whereas obviously the good thing to do is work with a security company, find a fix and it’s a kind of win-win situation.

Okay.

Then you need to do an appropriate mapping of your attack surface. Because of the fragmented nature of these products, you might buy a bit of hardware from somewhere, you might buy a chip, and you might take software from somebody else to make these IoT devices work. And each one of those components might have some seal of security around it. You know, you take an ARM chip, they’re pretty good and solid. But when you’re putting it all together as a service, you might introduce new vulnerabilities into those links. So mapping the attack surface is something that should happen once you’ve created whatever your internet of thingy is, to understand its vulnerabilities. And then the last one is: only collect and use appropriate data, and make that data anonymous and specific to the service that you’re providing, rather than some generic, massive thing in the background.

Right. So what about the networks these devices run on?

Okay. So if you are providing networks to these kinds of smart devices, particularly consumer networks – I mean, in the past you would have had some kind of internet security; obviously F-Secure would make endpoint security for your tablets and your laptops and your mobile phones. But that just doesn’t scale in the world of IoT. It’s just not physically possible. You can’t get customers to install it, and it’s difficult for us to make it for those multiple devices. So if you can’t do it at the endpoint level, then you need to do it at a network level. Now for years, lots of operators have deployed network security for two reasons. One, to protect their core, which they continue to do, and increasingly, to protect their customers from going to bad places – you know, what we call DNS-based security. Now, DNS-based security is kind of suffering a challenge in the sense that we are increasingly moving all of our traffic to encrypted communications, so it becomes more difficult to block that kind of traffic. So we think that the longer term fix for that is to do something on the local network, in the actual home gateway itself. Cause you can do lots of traffic analysis at the home gateway level, which doesn’t have the privacy implications of intercepting people’s requests upstream. You can keep it local to the device. And it’s also more effective.
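The DNS-based security described above boils down to refusing to resolve known-bad names, so an infected device never learns where to phone home. A minimal Python sketch of the idea, with entirely made-up blocklist entries and a placeholder address standing in for a real upstream DNS lookup:

```python
from typing import Optional

# Hypothetical blocklist entries. Real deployments pull feeds of known
# command-and-control and malware-distribution domains from threat intel.
BLOCKLIST = {"evil-c2.example", "malware-update.example"}

def resolve_or_block(domain: str) -> Optional[str]:
    """Return an address for `domain`, or None if it (or a parent domain)
    is on the blocklist. A blocked client simply gets no answer."""
    if domain in BLOCKLIST or any(domain.endswith("." + d) for d in BLOCKLIST):
        return None
    return "203.0.113.10"  # placeholder for a real upstream DNS resolution
```

This also illustrates the limitation mentioned above: if the client encrypts or bypasses DNS resolution, a filter like this never sees the query, which is one reason the home gateway is suggested as the longer-term enforcement point.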

Okay. Thanks for being on the show, Tom.

Thanks for having me.

That was our show for today. I hope you enjoyed it. Make sure you subscribe to the podcast and you can reach us with questions and comments on Twitter @CyberSauna. Thanks for listening.