Eufy’s Privacy Blunder – Don’t Promise What You Don’t Provide
Eufy made a name for itself as a video baby monitor company that provided peace of mind – in the form of top-of-the-line security to protect your privacy. It turns out their promises were more than a little bit hollow.
- Resilience Cybersecurity & Data Privacy
- Anker’s Eufy Lied to Us About the Security of its Security Cameras – The Verge
- Eufy’s “No clouds” cameras upload facial thumbnails to AWS – Ars Technica
- Eufy’s “local storage” cameras can be streamed from anywhere, unencrypted – Ars Technica
- The 5 Core Principles of the Zero-Trust Cybersecurity Model – Imperva
- IP Cameras, VoIP and Video Conferencing Revealed as Riskiest IoT Devices – Infosecurity Group
Brian: Hey, welcome to the Fearless Paranoia podcast. Here we are demystifying the complex world of cybersecurity. I’m Brian, the cybersecurity lawyer.
Ryan: And I am Ryan, a cybersecurity architect.
Brian: And we’re here to try to make those difficult cybersecurity concepts understandable. Sometimes you run into stories, cases you can use to help explain basic concepts. But other times you run into stories that remind you that there are things you rely on, things you understand, or at least think you understand, and it turns out your trust and your faith were misplaced. There are any number of situations and cases that could apply to in the world, but this week it applies to the company Eufy. Eufy is a manufacturer of a lot of different products, but one of the ones they’re most known for and notable for is their video surveillance and video monitoring systems. Among the biggest things they monitor are infants: they produce one of the most popular lines of video baby monitors, and there’s been a lot going on with them in the past couple of weeks. I want to start by reading directly from a November 30 article from The Verge. I also want to give The Verge a shout-out, because it’s a phenomenal publication co-founded by an acquaintance of ours from high school, Mr. Dieter Bohn. He’s no longer affiliated with The Verge, having moved on a year or two ago, but it’s a publication we both really enjoy reading, it’s very authoritative, and, more importantly, they do their due diligence. So I’m going to read this:
“Eufy’s commitment to privacy is remarkable. It promises that your data will be stored locally, that it never leaves the safety of your home, that its footage only gets transmitted with end-to-end military-grade encryption, and that it will only send that footage straight to your phone. So you can imagine our surprise to learn you can stream video from a Eufy camera, from the other side of the country, with no encryption at all.
Worse, it’s not yet clear how widespread this might be, because instead of addressing it head on, the company falsely claimed to The Verge that it wasn’t even possible.
“I can confirm that it’s not possible to start a stream and watch live footage using a third-party player such as VLC,” Brett White, a senior PR manager at Anker, the parent company of Eufy, told me via email. But The Verge can now confirm that’s not true. This week, we repeatedly watched live footage streaming from two of our own Eufy cameras using that very same VLC media player, from across the United States, proving that Anker has a way to bypass encryption and access these supposedly secure cameras through the cloud.”
Now, the article goes on to say that it appears this has not been exploited in the wild, and that the streams do rely on the serial numbers of the devices, which are long. But there are other problems, including the fact that the address is just your serial number encoded in Base64, which can be easily reversed online, plus a Unix timestamp that can easily be recreated, plus a token that the servers don’t actually seem to be validating. The funny thing The Verge did was change their token to “arbitrarypotato,” and it still worked. A whole lot of additional worrying stories have come out since this story was first released, and here’s where I want to bring Ryan in, because this is a pretty critical security flaw in a company whose reputation was built on security.
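To see how little protection that address scheme offers, here is a minimal Python sketch. The serial number, URL layout, and token below are all invented for illustration, not Eufy's actual format; the point, as The Verge reported, is that Base64 is an encoding, not encryption, so anyone who sees the address can recover the serial in one call, and a token the server never checks adds nothing.

```python
import base64

def encode_serial(serial: str) -> str:
    """Base64-encode a device serial, as the reported stream URLs appear to."""
    return base64.b64encode(serial.encode()).decode()

def decode_serial(encoded: str) -> str:
    """Anyone who sees the URL can reverse the encoding instantly."""
    return base64.b64decode(encoded).decode()

# A made-up serial and token, purely for demonstration.
serial = "T8410P1234567890"
token = "arbitrarypotato"  # The Verge's substituted token still worked,
                          # suggesting the server never validated it.

encoded = encode_serial(serial)
# Hypothetical illustration of the kind of address described in the article.
url = f"https://example-stream-host/{encoded}?token={token}"

print(decode_serial(encoded))  # recovers "T8410P1234567890": no secrecy at all
```

The takeaway is that Base64 exists to make binary data safe to put in text, not to hide it; treating it as a security layer is security through obscurity at best.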
Ryan: Yeah, you and The Verge hit it on the head on pretty much all the major topics. There was a variety of misfires when it comes to general cybersecurity best practices in this instance. The first one is flat-out trust and messaging to your customers. Obviously, the last thing you want is to be caught with your pants down when you say, “Hey, this can’t happen,” and then come to find out that it is actively happening. Something as simple as “your data is only stored locally,” followed up by a claim that it’s only able to get to and from your phone through our end-to-end encrypted transmission methods, and then you come to find out that there are snapshots of these video feeds being stored in AWS buckets used to trigger alerts, some of them not very securely locked down. The encryption key, or encryption passphrase, they were using is a standard hard-coded one that is the same across all their devices, and it was located in plain text in a GitHub or GitLab repository. So that’s a commonly known passphrase now, which means it’s probably effectively worthless for creating proper encryption. And then, to tie it all together, their video streams are actually available over the internet using something as simple as a serial number, which is printed on the outside of the box as it sits on the store shelf. That means anybody can effectively walk through a Best Buy, or any place that sells these products, pick up the whole list of serial numbers for products about to leave the shelf, and could probably get pretty close to the timestamp being used just by watching when they actually leave the shelf. To me, that’s a whole variety of “don’t do this” items that have all been strung together into one big mess.
And from a company that has done its best to market itself as very adherent to security policies and practices, this is a big black eye for Eufy in particular. The fact that they denied the allegations after security researchers brought this to them, and the researchers were then able to pull together a solid proof of concept in short order, just shows that they were doing their best to cover their tail rather than actually deal with the fact that there might be a security issue in their products. And that right there is reason to pull back, or at least second-guess, the trust that you’ve got in a company like Eufy. Very rarely do I ever come down on a company and just say “don’t trust them” because of security practices. Even with Uber, I tried to back up their security team pretty well; I think there were a lot of institutional problems there, and I think we’re seeing something very similar here. But the fact that both companies decided that covering their own PR image, at the potential expense of all of their customers, was more important is one good reason why a lot of customers really should second-guess their trust.
Brian: And one of the things that truly amazed me about these stories is the promises that Eufy made. There’s an image The Verge article copies directly from Eufy’s Privacy Commitment that includes the local storage “for your eyes only,” the end-to-end encryption preventing anyone from looking in, and the on-device AI that supposedly does all of the handling and analysis of your video on the device so it doesn’t need to be sent elsewhere. The crazy thing you read in all these articles is that the researchers, the security people, everyone they’ve interviewed, all use Eufy devices. This is not one of those situations where a company made security promises that were vaporware, clearly more than they could possibly deliver. This is a company that had developed a reputation around its promises, and they make significant promises about security, advertising well beyond general best practices, such that security experts believed them and relied on them. This was not simply easy to discover. So I’ll just ask: as a general consumer user, what the hell are we supposed to do?
Ryan: Well, that’s a phenomenal question, and there’s not really a good, solid answer to it either. The average consumer doesn’t have the knowledge or the wherewithal to do the testing on their own that would be required to really validate these claims from a company like Eufy. So in the past they would rely on one of two things. Either you’re going to have a third party that does audits and validates, “Yes, all these measures are in place, and it’s secure,” which is probably what these security researchers were doing at some point. They just said, “Hey, let’s validate these claims. I’m going to bring Eufy equipment into my house, and I’d like to make sure it’s as secure as what they say and what they’re promising.” Or, if you don’t have that third-party validation, you really just have to trust what they’re telling you.
And it’s really easy for these companies to throw out a lot of buzzwords and mislead people nowadays, because cybersecurity is a complex industry and it changes very rapidly. The average consumer doesn’t really have a major piece in it other than protecting their own identity and their own access to services. They don’t take that next step to really look at what’s under the hood in these devices, because that’s not their job. That’s the manufacturer’s job, or the regulator’s job, or somebody else’s job. But the impact on the average user is immense. So realistically, I think people need to take note of what happened here with Eufy and remember it as they engage with things like IoT devices in the future, and start to ask themselves: okay, let’s just assume that every one of these IoT devices I pick up in the future has security nightmares behind it. How important is this device, and do I really need it? Far be it from me to tell somebody not to buy something they think they need, but in this particular instance, I think we’re getting to a point where the negative downsides that come with some of these tools might start outweighing the benefits. And I say that as a technology enthusiast. I’ve got cameras on the outside of my house, but I’ve got them locked down pretty well. With these ones, the only way to really get the feed to your phone easily, without a more intense setup, is to trust some sort of cloud connector that the company behind it supports. In this case, it was Eufy using AWS for things like their alerts, sending those snapshots up, or using their end-to-end encryption to send those connections directly back to a person’s phone.
Brian: You’re listening to the Fearless Paranoia podcast. For more information on keeping yourself, your family, and your company protected against cyber threats, check out the Resilience Cybersecurity and Data Privacy blog. If you’re enjoying this podcast, please like and subscribe using any of your favorite podcast platforms.
Brian: Let’s walk through the things Eufy is accused of in this particular case. We’ll start with the first thing you just mentioned: they used AWS, which is Amazon’s server farm, basically Amazon’s cloud. Eufy was using it while making promises that there were no clouds and that no one had access to your data. It turned out they were uploading thumbnail snapshots whenever they sent you an alert. So if you got an alert from your Eufy camera with a snapshot, just an image, in order to get that alert it was coming through Amazon’s servers. They were in fact uploading an image to the cloud. The big problem there is that they promised they weren’t using a cloud, and they were. Now, there’s no evidence I’ve seen that that particular cloud service has been breached, but it is still contrary to what they promised. So that’s the first thing: cloud-based snapshots sent for alerts, absolutely done. The second was the promise that, one, the video stream was only conveyed to your phone, that your phone was the only device that could get it, and two, that the streams were end-to-end encrypted so that no one could eavesdrop. That’s the story we discussed at the very top, about being able to stream from a VLC player. For those of you who don’t know, VLC is a phenomenal, open-source media player, designed for everybody to be able to use for any basic media. And that means that not only was the “only going to your device” claim not accurate, since you could stream it from anywhere across the country through this media player, it also meant they could access the stream, penetrating the supposed end-to-end encryption. So those are the three, or, you know, one, two, and two-A, major issues that we’re talking about.
And Ryan, you mentioned something I was thinking about earlier. I remember back in 2013, when the Snowden leaks first came out, and people were all shocked and surprised that the NSA was spying on literally everything, yet anyone who had been involved in tech said, “Look, if you’re putting something in the digital space, you have to assume it can be observed.” I think most Americans now operate from that perspective. But once you touch baby monitors, you’re touching a nerve that can’t be undone. So what does it say to you that a company felt this comfortable selling a product that sits at the core of every innate urge to protect your children, the metaphor of the mother bear, and was willing to make these statements about security, and apparently willing to make false statements to, among other publications, The Verge? And by the way, The Verge is going to follow up on a tech case like this; you’re not going to get away from an investigative journalist from that publication on a story like this. What are your thoughts on that?
Ryan: Yes. Going back to the beginning of what you said there, I think that over the last 20 years especially, in the modern internet era, we’ve been really willing to give up a lot of things. We’ve been willing to give up a lot of privacy and a lot of general freedom for the sake of usability and access to information. With the prevalence of the internet, and especially things like social media nowadays, that’s just something we’ve come to accept as we put these things online. And if you put it on the internet, it’s going to live there forever. That’s what I tell everybody, and a lot of people chuckle and laugh about that, but effectively there’s a good possibility that it’s true. It’ll live there as long as somebody feels the need to hold that information, and there are a lot of companies out there doing nothing but aggregating information.
Brian: Let me ask you this: what do you draw out of these stories that small and medium-sized businesses, and even individuals, need to take into account when it comes to the Internet of Things generally? Technology like this may in theory make your life easier, but it is by its very nature invasive.
Ryan: I think the couple of big takeaways start with zero trust: don’t just trust what people are telling you. Make sure there’s validation behind it. Even with companies like Eufy that present a strong security stance, make sure other people have validated it, that there’s been security research done against it to some extent. Now, granted, it might be difficult to find that for every manufacturer or every product offering out there. But the more important the data the device is collecting, or capable of collecting, the higher the level of scrutiny that’s really required in order to remain safe. So again, I’ve got exterior cameras on the outside of my house, and I look at those as slightly lower risk. But something like a baby monitor, where I’ve got a device inside my house listening to all of our private conversations, that we’re turning on and pointing right at our children, and then trusting that we’re able to view it remotely and that the manufacturer has guaranteed that level of security? There’s an awful lot of trust occurring inside that equation. One of the reasons zero trust has become such a big thing in the industry nowadays is that we’ve learned that trust can be abused and exploited, especially if it’s not well founded. And in this case with Eufy, the distrust that’s going to be slapping them in the face for the foreseeable future is well deserved and well earned, because they came out and said that they hold security high, that it’s of utmost importance to them, that they would point to it in their mission and make promises about it, and that they would back up those promises; and then, when told they were wrong, they didn’t.
And now they’ve been put on the hot seat because of everything they said. I think they had good intentions from the start, and I think they believed their products were as secure as they claimed. But the practices they were following obviously hadn’t been audited by professional security researchers or any third-party testing.
Brian: You’re listening to the Fearless Paranoia podcast. We’re here to help make the complex language of cybersecurity understandable. If there are topics or issues you’d like Ryan and me to break down in an episode, send us an email at email@example.com or reach out to us on Facebook or Twitter. For more information about today’s episode, be sure to check out FearlessParanoia.com, where you’ll find a post for this episode containing links to all the sources and research we’ve cited. Also check out our older posts and podcasts, as well as additional helpful resources for learning about cybersecurity. Now, back to the show.
Ryan: Any sort of item like this, especially in the IoT space, with all of the problems we’ve had institutionally with these devices over the last 10-15 years, should go through secondary audits of some sort. You should be getting a third party coming in and doing some kind of pen testing against your software and your devices before you ever release them to the public, to make sure the security promises you’re putting in place will actually hold true when they’re tested. And they will get tested at some point. I think in this case Eufy is very fortunate that it was honest security researchers who followed proper disclosure and reached out to them when they came across this, saying, “Hey, we’re going to give you a legitimate chance to try and do something about this. But this is a big one. You guys really, really messed up here.” A lot of researchers wouldn’t have done that. If a nation-state actor had found this out, the first thing they would do is just start collecting data.
Brian: Well, and importantly, the researcher who first flagged all this is a guy named Paul Moore, based in the UK. At least one of the articles discusses how, in his Twitter posts about this topic, he said he had contacted Eufy’s legal department, informed them he was going to give them a chance to investigate and take appropriate action, and said he would not comment further. By all accounts I’ve seen, he has kept his word: he has not responded to requests for further information. And you’re right, they’re getting very lucky. One thing I wanted to say before we wrap up this topic, and Ryan, you’ve started touching on it, is this idea of your promises and your practices being separate. One of the big things I do in my work with small businesses is help draft policies and procedures. There are multiple sets, and one of them is what I refer to as digital properties: your public-facing terms of service, privacy policies, disclaimers, sales notices, return policies, all the various things you see when you go to a website. I also draft internal policies and work with companies on refining them to comply with various privacy and cybersecurity principles and laws. What amazes me is the disconnect between the two. For most companies, it’s not their first thought to make sure their outward-facing privacy policies match their internal practices, and that appears to be exactly what happened here. The problem is that the outward-facing privacy policies were written by a very security-conscious, sales-oriented team that wanted to market the company as something different from what the internal programmers were actually building. So while it’s important to have audits that help you figure all this out, you also have to make sure your company’s overall philosophy is the same from department to department.
And I will say, one of the most difficult things is convincing a group of programmers and a group of marketers to adopt the same ethos. But when you’re selling products like this, you absolutely have to. I think that’s required.
Ryan: Yeah, maybe I’m giving them too much benefit of the doubt here, but I think they had good intentions, and I think they tried to accomplish some of what they said. They obviously failed miserably when they said none of this was cloud-accessible, because they were actively using a cloud service to piggyback some of their service offerings instead of doing the processing locally on the camera and sending it straight to the end device. There was a middle ground in there that they said didn’t exist. As for their end-to-end encryption: technically the streams were encrypted, but the problem is the encryption was crap. And if you have crap encryption, that’s not doing anybody any favors either.
Brian: Yeah, calling it military grade. And it turns out that it’s Russian infantry military grade.
Ryan: Well, it comes down to the whole, like, there was a meme that went around back in the day about security, where they’ve got this old door held shut with one of those slide locks, and there’s a Cheeto sticking through the middle of it. That’s what’s holding the door locked.
And that’s effectively what this kind of bad encryption is: your key is sitting out on a publicly hosted code repository, and you’re using nothing more than a serial number and a token that you’re not even validating. That’s the level of your end-to-end encryption. It’s pretty terrible. You might as well have been holding the lock on your door shut with a Cheeto; that’s about as effective as this encryption was. So while I think they probably had good intentions, the deployment behind it was bad, and it was obvious it was never tested, because any good software pen tester would have ferreted that out in moments and said flat out, “Hey, this is inexcusable.” It’s no good, and definitely not good enough to be putting our kids, our future, our offspring on the other side of it without at least the base-level protections that exist for tons of other software packages and tools nowadays. It’s just inexcusable.
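For contrast, here is a minimal Python sketch of the kind of server-side token validation that, judging by The Verge's "arbitrarypotato" test, appears to have been missing. All names and the secret below are illustrative assumptions, not Eufy's actual design: the server signs the serial and timestamp with a secret it never ships to devices, and rejects anything it didn't sign or that has expired.

```python
import hashlib
import hmac
import time

# Hypothetical server-held secret: it never leaves the server, so it can't
# be scraped from firmware or a public code repository.
SERVER_SECRET = b"server-side-demo-secret"

def make_token(serial: str, timestamp: int) -> str:
    """Issue a token cryptographically bound to one device and one moment."""
    msg = f"{serial}:{timestamp}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

def validate(serial: str, timestamp: int, token: str, max_age: int = 300) -> bool:
    """Reject stale requests and any token the server didn't actually sign."""
    if time.time() - timestamp > max_age:
        return False  # expired: recreating the timestamp later buys nothing
    expected = make_token(serial, timestamp)
    return hmac.compare_digest(expected, token)  # constant-time comparison

now = int(time.time())
good = make_token("T8410P1234567890", now)
print(validate("T8410P1234567890", now, good))               # True
print(validate("T8410P1234567890", now, "arbitrarypotato"))  # False
```

With a check like this in place, knowing the serial number off the box and guessing the timestamp is not enough: the attacker would also need the server's secret to forge a valid token.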
Brian: Yeah, at this point, I have a hard time feeling anything other than that Eufy set themselves up to be in this bad spot. I don’t know if hubris is the right word in this case, but clearly they had not handled everything in their own house before they decided to advertise what they were selling, and then chose to advertise it as being something different than it actually was. All right, well, that’s all for this episode of Fearless Paranoia. I want to thank you all for joining us. We will continue to have these conversations, trying to demystify cybersecurity and let you know what’s going on in the cybersecurity world. Hopefully, at some point, we can find some happy stories about companies that, it turns out, were doing exactly what they claimed they were doing, and it was a good thing. But until then, we’ll just keep praying. Thanks for joining us again. You can see the information in this post at FearlessParanoia.com, and don’t forget to subscribe to Fearless Paranoia on any of your favorite podcasting platforms. I’m Brian.
Ryan: And I am Ryan
Brian: and we’ll see you next time.
Fearless Paranoia exists to make cybersecurity understandable and digestible, and to guide you through what you and your business need to focus on in order to get the most benefit from your cybersecurity spend.
©2024 Fearless Paranoia