
You Don’t Have to Give Up Your Crown Jewels in Hopes of Better Cloud Security

If you’re like me, you love a good heist film. Movies like The Italian Job, Inception, and Ocean’s 11 are riveting, but outside of cinema these types of heists don’t really happen anymore, right? Think again. In 2019, the Green Vault Museum in Dresden, Germany, reported a jewel burglary worthy of its own film.

On November 25, 2019, at 4 a.m., the Berlin Clan Network started a fire that destroyed the museum’s power box, disabling some of the alarm systems. The clan then cut through the iron bars of a window and broke into the vault. Security camera footage published online shows two suspects entering the room with flashlights and crossing a black-and-white-tiled floor. After grabbing 37 sets of jewelry in a couple of minutes, the thieves exited through the same window, replacing the bars to delay detection. They then fled in a car that was later found torched.[1]

Since then, there have been numerous police raids and a couple of arrests, but an international manhunt is still underway and none of the stolen jewels have been recovered. What’s worse, the museum didn’t insure the jewelry, resulting in a loss of up to $1.2 billion. Again, this is a story ripe for Hollywood.

Although we may not read about jewelry heists like this one every day, we do see daily headlines about security breaches resulting in companies losing their own crown jewels: customer data. In fact, the concept of protecting crown jewels is so well known in the cybersecurity industry that MITRE has created a process called Crown Jewels Analysis (CJA), which helps organizations identify their most important cyber assets and create mitigation processes for protecting those assets.[2] Today, exposed sensitive data has become synonymous with cloud storage breaches, and there is no shortage of victims.

To be fair, these breaches have a common factor: the people in charge of managing cloud storage misconfigured the service or didn’t enable the correct settings. At the same time, we can’t always blame people when security fails. If robbers can so easily access the crown jewels again and again, you can’t keep blaming the security guards. Something is wrong with the system.

Some of the most experienced cloud native companies, including Netflix, Twilio, and Uber, have suffered security breaches involving sensitive data stored in cloud storage.[3] It has gotten to the point that the 2020 Verizon Data Breach Investigations Report listed Errors as the second-highest cause of data breaches, “in large part, associated with internet-exposed storage.”[4]

So why is securing cloud storage services so hard? Why do so many different companies struggle with this? When we’ve asked our customers what makes protecting sensitive data in the cloud so challenging, many have told us they simply don’t know whether they have sensitive data in the cloud, or that they struggle to manage the countless permissions and overrides available for each service.[5] Most of them have accepted that someone, whether an internal employee, a third-party contractor, or a technology partner, will eventually set the wrong permissions on their data, and that they need a solution that continuously checks for sensitive data and prevents it from being accessed regardless of location or service-level permissions.

Enter the Cloud Native Application Protection Platform (CNAPP). Last month, our new CNAPP service dedicated to securing hybrid cloud infrastructure and cloud native applications became generally available. One of the core pillars of CNAPP is Apps & Data: along with Cloud Security Posture Management (CSPM) and Cloud Workload Protection Platform (CWPP), CNAPP provides a cohesive Data Loss Prevention (DLP) service.

Figure 1: CNAPP Pillars

Typically, security vendors perform DLP scans of cloud storage by copying customer data down to their own platform: to scan for sensitive data, the vendor needs access to your data from a platform that can run its DLP engine (a sketch of this copy-out flow follows the list below). However, this approach presents some challenges:

  • Costs – copying down storage objects means customers incur charges for every bit of data that crosses the wire, including but not limited to request charges, egress charges, and data transfer charges. For some customers, these charges are significant enough that they have to pick and choose which objects to scan instead of protecting their entire cloud data store.
  • Operational burden – customers who aren’t comfortable sending the data over the public internet have to create tunnels or direct connections to vendor solutions. This means additional overhead, architectural changes, and sometimes backhauling large amounts of data across those connections.
  • Defeats the Purpose of DLP – this was a lesson learned from our MVISION Cloud DLP scanning: for some customers, performing DLP scans over network connections was convenient, but for others it was a huge security risk. Essentially, these solutions require customers to hand over their crown jewels just to determine whether that data contains crown jewels. Ultimately, we arrived at the conclusion that data should stay local, but DLP policies should be global.
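To make the copy-out model concrete, here is a minimal sketch (not any particular vendor’s implementation) of what off-tenant scanning implies: every object is listed, downloaded out of the customer’s bucket, and posted to an external DLP API. The bucket name and vendor endpoint are hypothetical placeholders.

```python
# Minimal sketch of the "copy-out" DLP model described above.
# Every object is downloaded from the customer's bucket and shipped to an
# external engine, so each object incurs request, egress, and data transfer
# charges -- and the raw content leaves the customer's tenant.
import boto3
import requests

s3 = boto3.client("s3")
BUCKET = "customer-data-bucket"                  # hypothetical bucket
DLP_API = "https://dlp.example-vendor.com/scan"  # hypothetical vendor endpoint

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        # Each GetObject is a billable request, and the response body is
        # transferred out of the tenant (egress/data transfer charges).
        body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
        # The object's full content is handed to the vendor just to find
        # out whether it contains anything sensitive.
        requests.post(DLP_API, data=body, timeout=30)
```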

This is where we came up with the concept of in-tenant DLP scanning. In-tenant DLP scanning works by launching a small software stack inside the customer’s AWS, Azure, or GCP account. The stack is a headless microservice (called a Micro Point of Presence, or Micro PoP) that pushes out workload protection policies to compute and storage services. The Micro PoP connects to the CNAPP console for management purposes but allows customers to perform local DLP scans within each virtual network segment using direct access. No customer data ever leaves the customer’s tenant.

Figure 2: In-tenant DLP Scanning
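For contrast, here is a minimal sketch of the in-tenant idea, assuming a simple regex-based detector running inside the customer’s own account. It illustrates the concept rather than the Micro PoP’s actual code; the bucket name, console URL, and detection patterns are hypothetical. The key point is that objects are scanned locally and only findings metadata, never object contents, is reported back to the console.

```python
# Minimal sketch of in-tenant DLP scanning: scan locally, report only metadata.
import re
import boto3
import requests

s3 = boto3.client("s3")
BUCKET = "customer-data-bucket"                         # hypothetical bucket
CONSOLE = "https://cnapp-console.example.com/findings"  # hypothetical console API

# Example patterns only; a real DLP engine uses far richer classifiers.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
        text = body.decode("utf-8", errors="ignore")
        # Scan inside the tenant; the object content never leaves it.
        findings = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if findings:
            # Only metadata about the match is sent to the management console.
            requests.post(
                CONSOLE,
                json={"bucket": BUCKET, "key": obj["Key"], "matches": findings},
                timeout=30,
            )
```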

Customers can also choose to connect multiple virtual network segments to a single Micro PoP using services like AWS PrivateLink if they want to consolidate DLP scans for multiple S3 buckets (see the sketch below). There’s no capacity limit or license limitation on how many Micro PoPs customers can deploy. CNAPP supports in-tenant DLP scanning for Amazon S3, Azure Blob, and Google Cloud Storage today, with on-prem storage coming soon. Lastly, customers don’t have to pick only one deployment model: they can use our traditional DLP scans (called API scans) over network connections or select our in-tenant DLP scans for more sensitive workloads.
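As an example of the consolidation pattern mentioned above, the sketch below attaches an additional VPC to an existing Micro PoP through an AWS PrivateLink interface endpoint. All identifiers, including the endpoint service name that a Micro PoP deployment would expose, are hypothetical placeholders; the exact wiring in a real deployment is defined by the product, not this snippet.

```python
# Sketch: connect another VPC to a shared scanning service via PrivateLink
# by creating an interface VPC endpoint in the consumer VPC.
import boto3

ec2 = boto3.client("ec2")

response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",            # hypothetical consumer VPC
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-0example",  # hypothetical service
    SubnetIds=["subnet-0123456789abcdef0"],   # hypothetical subnet
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```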

In-tenant DLP scanning is just one of the many innovative features we’ve launched with CNAPP. I invite you to check out the solution for yourself. Visit https://mcafee.com/CNAPP for more information or request a demo at https://mcafee.com/demo. We’d love to get your feedback and see how MVISION CNAPP can help your company stay out of the headlines and make sure your crown jewels are right where they should be.

Disclaimer: this blog post contains information on products, services and/or processes in development. All information provided here is subject to change without notice at McAfee’s sole discretion. Contact your McAfee representative to obtain the latest forecast, schedule, specifications, and roadmaps.

[1] https://www.dw.com/en/germanys-heist-that-shocked-the-museum-world-the-green-vault-theft/a-55702898

[2] https://www.mitre.org/publications/systems-engineering-guide/enterprise-engineering/systems-engineering-for-mission-assurance/crown-jewels-analysis

[3] https://www.darkreading.com/cloud/twilio-security-incident-shows-danger-of-misconfigured-s3-buckets/d/d-id/1338447

[4] https://enterprise.verizon.com/resources/reports/dbir/

[5] https://www.upguard.com/blog/s3-security-is-flawed-by-design
