In the wake of the devastating WannaCry and NotPetya ransomware campaigns, it was hard to imagine that things could get more embarrassing for the IT profession.

That double whammy was possible because IT administrators left firewall ports 445 and 139 open, which allowed the EternalBlue exploit to take hold. Thousands of companies around the world paid the price for IT's negligence.

Despite all the attention, many organizations still haven't taken the simple step of closing the obviously open ports. Once they get hit, regulators and litigators will likely have a field day. Nobody can say IT wasn't warned.

And now, just a few short weeks later, we learn that security researchers have discovered numerous preventable data leaks that exposed sensitive personal data of hundreds of millions of users. Where did they find this data?

On Amazon, where else? The go-to web service for storing large amounts of data. Multiple organizations across industries were impacted.

In each case, the data was posted to an S3 bucket on Amazon's storage service. And the access permissions were set to (you guessed it) "public."

[Illustration: AWS S3 bucket leaks, embarrassing for IT. Image shows a leaky bucket. Credit: Authentic8]

A bit of background: Amazon S3, part of Amazon Web Services, provides utility storage for a variety of web applications. S3 is cost-effective, globally available and massively scaled. S3's architecture allows users and applications to access data in a seamless and efficient manner. It's no overstatement to say that S3 has transformed the storage industry.

To use S3, you define "buckets" where you want to store data. Unlike a traditional file system with objects living in directories, buckets hold data without a hierarchical directory structure. Since buckets live in the cloud, there's no set limit on how much data one bucket can contain. Each bucket gets its own URL, which users or automated processes can use to access it.

When a bucket is created, the default sharing permissions are set to private, meaning only the account owner who created the bucket can access it. But data is meant to be shared, so Amazon lets the owner extend access to others.

Making buckets available to other parties requires an explicit configuration setting.  I don’t know how it happened or why, but in each of these data breach instances, the affected buckets were set to allow public access. And the rest, like our data, is history.
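To make the misconfiguration concrete, here is a minimal sketch of how an auditing script could flag a publicly readable bucket. It mirrors the grant structure that S3's GetBucketAcl API returns; the sample ACL below is hypothetical, not taken from any of the breached buckets.

```python
# Sketch: flag public grants in an S3 bucket ACL.
# A grant to the "AllUsers" group URI means anyone on the internet
# gets that permission.

ALL_USERS_URI = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_grants(acl):
    """Return the permissions granted to everyone (the AllUsers group)."""
    return [
        grant["Permission"]
        for grant in acl.get("Grants", [])
        if grant.get("Grantee", {}).get("URI") == ALL_USERS_URI
    ]

# Hypothetical ACL, shaped like a get_bucket_acl() response,
# for a misconfigured bucket
acl = {
    "Owner": {"ID": "example-owner-id"},
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "example-owner-id"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS_URI},
         "Permission": "READ"},
    ],
}

if public_grants(acl):
    print("WARNING: bucket is publicly readable")
```

A periodic job running a check like this against every bucket in the account would have caught these leaks long before any researcher did.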

The stunning revelation here isn't the level of sophistication of the exploit, or a lack of security in cloud-delivered services. Rather, a simple IT error led to the leak.

Imagine leaving your house and not locking the front door. Sure, you can blame the burglar if you get robbed, but perhaps you should have taken a little more care in securing the house.

A contributing factor here may be that the method for managing bucket permissions, while powerful, is not as intuitive as it could be. Within the AWS console, the only other sharing option available is "Public," an option that Amazon tags as "Not Recommended."

[Screenshot: AWS S3 bucket permission settings]

The middle ground, where bucket access is restricted to a known set of users, requires that the owner define Identity and Access Management (IAM) policies. Once a policy is defined, you create an IAM user for the person who should have access to the data.

Next, create an Access Permission for that user.  If you want to share the bucket with a computer process rather than a user - like allowing a web server to pull images from a bucket to display on a page - you'd allow authorized services to access that user account, which would then have access to the data in the bucket.
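As a rough sketch of that middle ground, the policy granting a single user read-only access to one bucket might look like the following. The bucket name and user ARN are hypothetical placeholders, and a real deployment would attach this through the IAM console or API.

```python
import json

def read_only_bucket_policy(bucket, user_arn):
    """Build an S3 bucket policy granting one IAM principal read-only
    access. Bucket name and user ARN are placeholder values."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowSingleUserRead",
            "Effect": "Allow",
            "Principal": {"AWS": user_arn},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",       # the bucket itself (for listing)
                f"arn:aws:s3:::{bucket}/*",     # the objects inside it
            ],
        }],
    }

policy = read_only_bucket_policy(
    "example-bucket", "arn:aws:iam::123456789012:user/alice")
print(json.dumps(policy, indent=2))
```

The point is that scoping access takes a few deliberate lines, whereas "Public" takes one click, which is exactly the shortcut this post is warning about.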

Conceptually, this isn't an overly complicated process. Still, it requires thoughtful configuration and a few extra steps. I suspect that someone took a shortcut in either the AWS console or a third-party tool, and set the bucket permissions to "Public."

To learn more about S3 management and avoid future headaches, I recommend reading this post, in which the author takes a deep dive into AWS S3 access controls.

As with access permissions, a cohesive naming convention for buckets is critical. If path names are inconsistent and buckets are created by different users, they get scattered across the infrastructure, making them harder to track and rendering audit and security checks more challenging.

The default URL naming scheme likely contributed to the problem. Any outsider can scan S3 URLs to test buckets for publicly available data. That's how these discoveries were made.
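The scanning technique is trivially simple: request a candidate bucket URL and interpret the HTTP status code. This sketch keeps the testable logic offline; the bucket name in the comment is hypothetical.

```python
def classify_bucket(status_code):
    """Interpret the HTTP status returned when fetching an S3 bucket URL.
    200 -> the bucket listing is readable by anyone;
    403 -> the bucket exists but denies anonymous access;
    404 -> no bucket by that name."""
    if status_code == 200:
        return "public"
    if status_code == 403:
        return "private"
    if status_code == 404:
        return "nonexistent"
    return "unknown"

# In a real scan you would issue a request such as
#   urllib.request.urlopen("https://example-bucket.s3.amazonaws.com/")
# (bucket name hypothetical) and feed the resulting status code in:
print(classify_bucket(403))  # -> private
```

Researchers simply iterate this over dictionaries of likely bucket names, which is why guessable names plus "Public" permissions are such a dangerous combination.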

Amazon has followed up with a security alert for S3 users, warning them of the issue and encouraging them to review bucket permissions. That should help raise awareness, but it won't stop an IT admin from taking a shortcut or an outsider from discovering a breach.

In fact, the possibility of an outsider discovering company data should be front and center on IT's mind when publishing data to the web or managing permissions.  When buckets are created, the URLs are available to anyone who can script a query.  That alone should force administrators to alter their practices and formalize checks and balances.

Why can’t IT assume the same mindset of the hacker or researcher who is trawling bucket URLs?

IT Department, or Department of Holes & Leaks?

Shouldn’t the company itself try to identify weaknesses in its own systems before it gets beaten to the open port or bucket by the bad actors? Once a service is published or a configuration change is made, wouldn’t it be prudent to run outside-in vulnerability scans to validate settings?

Tools to perform outside-in assessments of systems are readily available and simple to use. Just search for “pen test toolkits” or “vulnerability scanner,” and you’ll find numerous utilities for scanning ports, identifying OS versions and patch levels, even for crawling Amazon buckets for open access permissions.
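Even without a full toolkit, a basic outside-in check takes a few lines. Here is a minimal sketch of a TCP port probe, the kind of check that would have flagged the open SMB ports (139 and 445) exploited by WannaCry; it is no replacement for a real scanner, and should only be run against hosts you own.

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of ports that accept a TCP connection on host.
    connect_ex() returns 0 on a successful connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

# Example: check whether the SMB ports behind WannaCry are reachable
# (run against your own infrastructure only).
print(open_ports("127.0.0.1", [139, 445]))
```

Scheduling a check like this against your public-facing hosts after every configuration change is exactly the "outside-in validation" argued for above.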

IT managers with less confidence in their own hacking skills can turn to external service providers that will perform pen tests and security assessments for hire.

Organizations not able or not willing to take basic steps to prevent such exploits may want to invest in a new door sign for IT. I suggest “Department of Holes and Leaks.”


Full disclosure: The author is an investor in and a member of UpGuard's Board of Directors, the company that identified some of these data breaches.