Paid feature

Here’s the irony of ransomware data breach stories that gets surprisingly little attention: cybercriminals enthusiastically encrypt and steal sensitive data to extort money, and yet their victims rarely bother to defend themselves using the same obviously effective concept.
It should be a no-brainer. If sensitive data such as IP is competently encrypted, that not only means attackers can’t access or credibly threaten to leak it; in many cases they won’t even be able to see it in the first place, because all encrypted data looks alike.
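The point that all encrypted data looks alike can be illustrated with a toy sketch. The hash-based counter-mode stream cipher below is strictly illustrative (a real deployment would use a vetted cipher such as AES-GCM from an audited library); the `keystream` and `xor_encrypt` helper names are invented for this example.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key + nonce + counter (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call decrypts, since XOR is symmetric."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, nonce, len(data))))

key = secrets.token_bytes(32)
secret_doc = b"ACME patent draft: alloy ratio 7:3"
public_doc = b"ACME press release: open day Friday"

# Pad both to the same length so even ciphertext size leaks nothing.
width = max(len(secret_doc), len(public_doc))
c1 = xor_encrypt(key, b"n1", secret_doc.ljust(width))
c2 = xor_encrypt(key, b"n2", public_doc.ljust(width))

# Without the key, both ciphertexts are indistinguishable noise of equal length.
print(len(c1) == len(c2))                    # True
print(xor_encrypt(key, b"n1", c1).rstrip())  # recovers the secret document
```

An attacker who exfiltrates `c1` and `c2` has no way to tell which one is the patent draft, which is exactly the property that defuses the "pay up or we leak it" threat.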
Ransomware is like a tap on the shoulder, telling everyone they have a problem. It’s not that criminals are able to reach the data – perhaps that’s inevitable – but that when they get there, the data is defenceless, exposed. You could even argue that ransomware wouldn’t exist if encryption and data classification had been widely adopted in the Internet’s early days.
Historically, the calculation has always been less clear-cut. Using encryption (or tokenisation) across an organisation’s data is seen as adding complexity and expense, and imposing a rigour that few beyond elite regulated industries and government departments are willing to take on. It’s an issue that’s not lost on Thales UK cybersecurity specialist Romana Hamplova and IAM pre-sales solutions architect Chris Martin.
“Ransomware targets sensitive data. But if the attackers can’t see the contents of the file because of encryption, they can’t see that it’s sensitive,” says Hamplova. “On the other hand, there is no need to encrypt all data, only the data that qualifies as worth protecting. Just as you don’t want sensitive data left exposed and unprotected, you also don’t want maximum security applied to public data, because that just slows down the infrastructure.”
The catch, she says, is that organisations often aren’t certain where that sensitive data is in an increasingly complex world where data gets moved around, deleted, changed, and re-classified. In many cases, they don’t have any easy way to identify what is and isn’t sensitive. What they’re left with is a form of data paralysis in which organisations default back to trying to block access to sensitive data rather than protecting the data itself.
The first job for organisations is to understand what data they have. “We enable them to discover the data in both structured and unstructured format and scan those locations and find out what data is there. For instance, perhaps they want to understand what GDPR data they have, or to adhere to PCI-DSS or HIPAA,” says Hamplova.
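The discovery step Hamplova describes can be sketched as a simple scan: walk a directory of unstructured files and flag those containing patterns that suggest regulated data. Real classification tools are far more sophisticated; the two patterns and the `scan_for_sensitive_data` helper below are illustrative assumptions, not any vendor's detection rules.

```python
import re
import tempfile
from pathlib import Path

# Illustrative patterns for data that might fall under GDPR or PCI-DSS.
PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card-like number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan_for_sensitive_data(root: Path) -> dict:
    """Return {filename: [pattern names found]} for text files under root."""
    findings = {}
    for path in root.rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if hits:
            findings[path.name] = hits
    return findings

# Example: build a tiny corpus and scan it.
root = Path(tempfile.mkdtemp())
(root / "hr.txt").write_text("Contact: jane.doe@example.com")
(root / "notes.txt").write_text("Nothing sensitive here.")
print(scan_for_sensitive_data(root))  # {'hr.txt': ['email address']}
```

The output of a scan like this is what lets an organisation apply strong protection selectively, per Hamplova's point that not everything warrants maximum security.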
The ongoing chaos surrounding data and what to do with it was confirmed by the 2021 Thales Data Threat Report, which found that three-quarters of the 2,600 global IT respondents questioned weren’t certain where all their organisation’s data was located.
Less than a third said they were able to classify or categorise it according to sensitivity. Interestingly, on the data protection side, despite 42 per cent saying they’d experienced a data breach within the previous 12 months, half of those victims were able to avoid notifying information commissioners because the stolen data had been encrypted.
In terms of near-term spending priorities, 37 per cent of respondents mentioned encryption, just one percentage point behind data loss prevention. An identical 37 per cent rated tokenisation as the most effective technology for protecting data, followed by data discovery and classification at 36 per cent, with encryption seen as the most effective by 34 per cent.
Working from home has made organisations aware of the data risks they have been taking, says Martin. “When people are in an office, there is an implicit amount of security. With working from home, the implied security is lost. You don’t have the visibility of that person sitting in front of their computer.”
Architectural changes such as cloud access exacerbate this. “What’s happened in the last 18 months is that companies are protecting their VPN. But employees are using applications that are not internal, so VPN access won’t necessarily control access to the applications or data. They are now separate.”
Another anxiety was the burden of software complexity itself, with organisations securing themselves using a mesh of overlapping tools. For example, 40 per cent of organisations admitted to using between five and seven different key management systems, with 15 per cent putting the number at between eight and ten. Much of this headache is caused by the growing importance of diverse cloud platforms.
The company’s 2021 Access Management Index uncovered a similar picture with authentication: 34 per cent of UK respondents admitted to using three or more authentication tools, with 26 per cent using three to five and 8 per cent more than five. That level of complexity makes management harder, and significantly raises the likelihood of misconfiguration and error.
By coincidence, just as the pandemic sent everyone scurrying to their spare rooms to work in early 2020, the US standards body NIST published its first draft of SP 1800-25, which for the first time offered specific advice on coping with ransomware. This was followed in June this year by NISTIR 8374, which relates anti-ransomware strategy to the organisation’s risk-oriented Cybersecurity Framework, first published in 2014.
Built around the overarching Framework, everything NIST publishes these days is quickly funnelled into best practice presentations the world over. Its influence is being felt across an industry that can’t pretend it hasn’t been warned, agrees Martin.
“The significance of this is huge. We are used to regulations such as PCI-DSS and GDPR, but NIST is trying to raise the profile of ransomware. It affects the supply chain. NIST is trying to use its weight to do something about this sooner rather than later. The urgency has been raised.”
Frameworks work in a different way to rules. Rules create boundaries, a narrow focus, and the risk of the infamous tick box mindset that says that if the rule has been followed, the job is done. Twenty years of cybersecurity failure says rules aren’t enough. It could be that frameworks encourage more nuanced, long-term thinking.
“Even though companies don’t necessarily have to comply with the NIST recommendations, they still like to follow them because they understand that they are best practice,” says Hamplova. “We have been recommending best practice for years, but unless there is a third-party body like NIST it doesn’t always have enough strength. Having a guideline like this can help companies to focus.”
A wider challenge remains the need to translate best practice into something which can be understood and implemented under real-world conditions. Thales currently offers a wide range of data protection products and technology across the cybersecurity stack, bolstered by acquisitions including Alcatel-Lucent’s cybersecurity division (2014), Vormetric (2016), and Gemalto (2017).
The Thales portfolio covers a large proportion of the data protection stack, starting with data classification and encryption, addressed by the CipherTrust platform. This maps to the risk assessment category (ID.RA) within the NIST Framework’s Identify function. A critical element of CipherTrust is its transparent encryption approach, meaning encryption and decryption happen automatically, without manual intervention.
“In our systems, encryption should always be transparent to an authorised user or application, to ensure business processes run uninterrupted,” comments Hamplova.
As well as file encryption, CipherTrust allows organisations to apply and manage encryption and tokenisation for applications and databases using APIs. The second layer is access control and authentication, provided by SafeNet Trusted Access, which corresponds to the access control category (PR.AC) within NIST’s Protect function. In the context of home working, SafeNet adds a layer of security more reliable than relying naively on VPNs alone.
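Tokenisation can be sketched as a vault that swaps a sensitive value for a random surrogate, with the real value recoverable only by asking the vault. The toy in-memory `TokenVault` class below is a hypothetical illustration of the concept, not the CipherTrust API; production systems persist the mapping in a hardened, encrypted store.

```python
import secrets

class TokenVault:
    """Toy in-memory token vault; real systems encrypt and persist this mapping."""
    def __init__(self):
        self._forward = {}   # sensitive value -> token
        self._reverse = {}   # token -> sensitive value

    def tokenise(self, value: str) -> str:
        if value in self._forward:          # same value always maps to same token
            return self._forward[value]
        token = "tok_" + secrets.token_hex(8)
        self._forward[value] = token
        self._reverse[token] = value
        return token

    def detokenise(self, token: str) -> str:
        return self._reverse[token]

vault = TokenVault()
pan = "4111 1111 1111 1111"
token = vault.tokenise(pan)

# The application and database store only the token; the card number
# itself never leaves the vault, so a stolen database holds nothing usable.
print(token.startswith("tok_"))        # True
print(vault.detokenise(token) == pan)  # True
```

Unlike encryption, the token bears no mathematical relationship to the original value, which is why the report's respondents rated tokenisation so highly for data protection.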
“This must go beyond simply identifying the user,” says Martin. “It’s also about the context, for example where they are located. We can geo-locate with an IP address or mobile phone. If someone is doing something from the same IP address as their home, we have a greater degree of confidence about their identity. It’s about taking authentication to the next level.”
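Martin's point about context can be sketched as a simple risk score that weighs the login's network location and device against what is already known about the user. Real adaptive-authentication engines combine many more signals; the thresholds and the `login_risk` helper below are invented purely for illustration.

```python
def login_risk(ip: str, known_ips: set, new_device: bool) -> str:
    """Toy contextual scoring: more unfamiliar signals => stronger auth demanded."""
    score = 0
    if ip not in known_ips:
        score += 2           # unfamiliar network location
    if new_device:
        score += 1           # unrecognised device
    if score == 0:
        return "allow"       # high confidence: let the session proceed
    if score <= 2:
        return "step-up"     # medium confidence: require a second factor
    return "deny"            # low confidence: block and alert

home_ips = {"203.0.113.7"}
print(login_risk("203.0.113.7", home_ips, new_device=False))  # allow
print(login_risk("198.51.100.9", home_ips, new_device=True))  # deny
```

The design choice worth noting is that context changes the *strength* of authentication demanded rather than flatly granting or refusing access, which is what distinguishes this from a VPN-style perimeter check.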
Both Hamplova and Martin are cautiously optimistic about the latest cybersecurity bandwagon, zero trust (ZT), which can be thought of as a software-defined perimeter. The idea is a good one – assess users, credentials, or applications before allowing them access – but there are still practical difficulties in implementation. It would be perverse if an attempt to reform the naïve trust in credentials that has caused so many cybersecurity problems simply created new layers of complexity.
“Our society innovates built on trust. When we talk of zero trust, it’s not about being unable to trust anything, but about establishing the right element of trust and building from there,” says Hamplova.
Martin agrees: “Is zero trust impossible? Ultimately, you have to trust someone or something in your organisation, or externally when accepting trust certificates.”
The issue of complexity remains a lurking worry with too many trust gateways being used to manage poorly integrated technologies. If authentication becomes too complex, trust becomes impossible to deliver. The Thales perspective is that the acid test for cybersecurity is whether it can protect data.
Says Hamplova: “As all cybersecurity specialists know, there is no nirvana! It’s always about making it harder for the cyber criminals to reach the critical data and ensuring your organisation is resilient enough to continue operating, should the worst happen.”
This article is sponsored by Thales.