Let this Yahoo hacking case remind you that nothing is truly safe in the cloud

You might think that the private files and photos that you save to the cloud are safe and secure, available only to you and the people you give permission to. Don't be so sure. While most companies do have policies that limit any sort of access to your information, the system is still built on trust, and that trust can be violated. That's exactly what happened with a former Yahoo engineer, who pleaded guilty this week to illegally accessing and stealing photos and videos from the accounts of about 6,000 people.

Reyes Daniel Ruiz, a 34-year-old California man who worked at Yahoo, is the nightmare scenario that should be in the back of your mind any time you upload a sensitive file or photo. According to the Department of Justice, Ruiz spent part of 2018 cracking the passwords of Yahoo user accounts in order to sift through their files and find compromising content. He searched through people's emails, and used permissions associated with the hacked accounts to access files on linked services including iCloud, Facebook, Gmail and Dropbox. When he found things like nude photos or videos with sexual content, he copied and saved them to his personal hard drive. Ruiz apparently started by targeting his personal friends and some of his work colleagues, but eventually branched out to other accounts as well. In most cases, his targets were young women.

What is perhaps most troubling about Ruiz's behavior is that it isn't clear exactly how he gained access. The sealed indictment provides little insight, but it seems as though Ruiz's position at Yahoo at the very least enabled the hack. Whether he had direct access to user passwords and other security measures is unclear, though. It's possible that his insider knowledge of the company's security practices allowed him to break into the accounts, rather than his position granting him direct, unfettered access to them. Neither option is a good one.

This is not the first time something like this has happened. Employees using their positions at tech firms to violate users' privacy is common enough that it should give you pause before you place anything particularly sensitive in the clutches of any company. In 2010, Google fired an employee who used his clearance at the company to improperly access Gmail and GTalk accounts; he allegedly used that access to harass users, including minors, on the platforms. Multiple companies have built "god's eye" tools that provide access to people's content and actions without explicit permission and can easily be abused by employees. Uber allegedly had a tool that let employees track users in real time, which Uber workers supposedly used to spy on ex-partners, politicians and celebrities. Similarly, Snapchat reportedly has an internal tool, referred to as SnapLion, that gives detailed access to user profiles. It was ostensibly created to fulfill law enforcement requests, but without proper protections it can be misused, and those abuses have allegedly happened, according to a report from Vice.

It's not just the tech industry that is guilty of this. Many reports have found hospital and healthcare employees regularly violating the privacy of patients. According to a report from ProPublica, the Department of Health and Human Services deals with more than 30,000 reports of privacy violations per year. These instances of unapproved access often happen when employees look up the records of people they know, or of celebrities or people in the news. More than 50 employees were fired from Northwestern Hospital for illegally viewing the profile and medical records of actor Jussie Smollett. Similarly, Sutter Health in California fired employees for looking at medical records associated with Joseph DeAngelo, the alleged Golden State Killer.

These instances, just like stories of tech companies listening to recordings of conversations with voice assistants, should serve as a reminder: while what you interact with may be faceless bits of technology and code, there are always humans behind it. No matter what rules and protections have been put in place to guard against unauthorized access, at the end of the day, the key to all of these systems is trust in those humans — and sometimes they simply do not deserve that trust.