Cloud technology has revolutionized IT infrastructure and the way businesses store and protect their data. Yet because the technology is still relatively young, many companies apply outdated thinking when securing it against the modern threat landscape. We’ve identified the ten most common mistakes organizations make in their cloud cybersecurity posture.
Mistake #1: Overestimating Perimeter Security Tools
Businesses tend to overestimate the ability of their perimeter security tools to identify threats. These tools are designed to look for evidence of compromise, but cybercriminals have become quite sophisticated and design their payloads to lie dormant (“sleep”) for extended periods to evade detection. If you need evidence of this, ask yourself: if perimeter tools were fail-safe, why would we ever hear of successful attacks, breaches, and companies ultimately paying ransoms? When threats slip through the cracks of perimeter security, they make their way into backups, which are a critical last line of defense against ransomware.
Mistake #2: Believing 3-2-1-1-0 Is Enough in the Cloud
Cloud customers mistakenly believe that the 3-2-1-1-0 rule they’ve been using in their on-prem environments is enough in the cloud. This rule states that there should be three copies of the data, stored on two different media, with one copy off-site, one copy offline or immutable (air gapped), and zero errors found when the backups are verified.
There is a high degree of focus on immutability and air gapping. However, an immutable, air-gapped backup that contains ransomware, malware, or corruption is useless at recovery time and, in fact, could send the organization into a constant cycle of reinfection.
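The 3-2-1-1-0 rule can be expressed as a simple invariant. The sketch below is illustrative only; `BackupCopy` and its fields are hypothetical names, not part of any real backup product’s API.

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str                  # e.g. "disk", "tape", "object-storage"
    offsite: bool               # stored outside the primary site?
    offline_or_immutable: bool  # air gapped or immutable copy?
    verified_error_free: bool   # passed scanning and recovery testing?

def meets_3_2_1_1_0(copies: list[BackupCopy]) -> bool:
    """Check the 3-2-1-1-0 rule: 3 copies, 2 media types, 1 off-site,
    1 offline/immutable, 0 verification errors."""
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite for c in copies)
        and any(c.offline_or_immutable for c in copies)
        and all(c.verified_error_free for c in copies)
    )
```

Note that the final “0” only holds if backups are actually verified; an unscanned copy trivially fails the check, which mirrors the article’s point that immutability alone is not enough.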
Mistake #3: Deploying Security Tools Built for On-Premises Environments in the Cloud
Cloud offers the advantage of being able to spin up (and subsequently spin down) environments, along with incredible elasticity through autoscaling. However, when security tools designed for an on-premises environment are used in the cloud, ephemeral instances and new workloads are not automatically captured in backups. Capturing them with on-premises tooling requires a complex and slow process that demands a large team, and even then it does not operate in real time. Only cloud native security tools can detect new workloads in real time and ensure that they are captured in backup data.
Mistake #4: Not Understanding the Security Risks of Ephemeral Instances
Most businesses, even those using cloud native security tooling, tend to dismiss the threat posed by ephemeral instances (also referred to as “spot instances”). The thought process is that because these instances “die” by design, there is no need to capture them in backups or scan them for threats. In reality, these instances can provide an entry point for ransomware, and because they “die,” all of their data is gone forever. If these instances do become infected, no data will be available for forensic analysis, and the organization often cannot determine the root cause of a persistent threat.
Mistake #5: Trading One Security Risk For Another
If a cloud security tool views, accesses, or in any way takes custody of data, it is trading one security risk for another. There’s a saying in technical risk management: “Who watches the watchers?” If your security tool is compromised, so is your data. A more modern and secure approach is to use a security tool that does not access your actual data in any way.
Mistake #6: Mistaking Cloud Availability For Cyber Resilience
While cloud providers offer a level of protection against threats like unauthorized access through robust IAM features, ransomware attacks pose a unique challenge and fall squarely under the customer’s purview in the Shared Responsibility Model.
While the cloud can offer high availability and redundancy for data and applications, it does not automatically guarantee cyber resilience. Cyber resilience involves more than just having access to data or services; it encompasses the ability to prevent, detect, respond to, and recover from cyber threats or incidents.
Mistake #7: Using Security Tools With A Bad ROI
Most security tools require a robust team with specialized knowledge to achieve maximum protection. With the cloud industry being relatively new, employees with this level of expertise can be extremely costly. Couple this with the need to investigate endless false-positive alerts, and the overhead of operating these tools becomes untenable. Look for tools that have a human component behind the scenes to investigate alerts and ensure time is not wasted on false positives. This helps ensure your team does not go “alert blind” and responds only to alerts that must be taken seriously.
In addition to the overhead factor, the ROI of most tools is upside down. They typically justify their ROI solely on the claim that they help organizations avoid the cost of data loss and ransom payments. However, they are very expensive and, in practical terms, represent a hit to the budget. Organizations should instead turn to tools that can pay for themselves in actuality through features such as global deduplication and compression (see Mistake #9: Paying Too Much For Backups).
Mistake #8: Telling a White Lie on Cyber Insurance Applications
Most cyber insurance policy applications ask whether backups are being scanned for ransomware and malware. Security leaders wince at this question and often find a way to justify checking the “yes” box. This is a big mistake. While this is technically insurance fraud and insurers are unlikely to pursue legal action, it is grounds for them to deny your claim in the future and/or terminate your coverage.
Mistake #9: Paying Too Much For Backups
Backups often represent 30% of a company’s total cloud bill. In an age where FinOps is becoming standard practice, backups are often overlooked as a source of substantial cost savings. Instead of taking advantage of tools with global deduplication and compression capabilities, organizations often make the mistake of shortening their retention periods. Combined with a lack of threat scanning, and knowing that cybercriminals “sleep” their payloads to outlast retention periods, this practice is a recipe for disaster.
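To see why deduplication and compression can offset backup cost, consider a toy content-addressed store: repeated backups of largely unchanged data add almost nothing to storage. This is a simplified sketch, not any vendor’s implementation; real systems use variable-size chunking and stronger pipelines.

```python
import hashlib
import zlib

class DedupStore:
    """Toy content-addressed store: identical chunks are stored once,
    and each unique chunk is compressed before storage."""

    def __init__(self) -> None:
        self.chunks: dict[str, bytes] = {}

    def put(self, data: bytes, chunk_size: int = 4096) -> list[str]:
        """Store a backup image; return the chunk keys that reference it."""
        keys = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            key = hashlib.sha256(chunk).hexdigest()
            if key not in self.chunks:              # global deduplication
                self.chunks[key] = zlib.compress(chunk)
            keys.append(key)
        return keys

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.chunks.values())
```

Storing the same image twice costs nothing extra beyond the chunk references, which is why long retention periods need not mean a proportionally larger bill.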
Mistake #10: Not Scanning Backups for Threats or Continually Testing Recovery
The convenience and scalability of cloud backups have made them increasingly popular. It’s critical to maintain timely backups as they provide a last line of defense against ransomware attacks and a mechanism to roll back other undesired changes to the environment.
However, a crucial security gap often goes unnoticed: the failure to implement robust ransomware, malware, and corruption scanning of backup data, along with regular recovery testing of those backups. Oftentimes, the first time backups are tested is during an actual recovery event, under duress to get all systems back online.
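In its simplest form, scanning a backup means checking its contents against known-bad signatures before the restore point is trusted. The sketch below shows only that hash-matching core; the signature value is a hypothetical stand-in, and production scanners combine hash sets with YARA rules, entropy analysis, and behavioral detection.

```python
import hashlib

# Hypothetical signature set; a real scanner would load a threat feed.
KNOWN_BAD_SHA256 = {
    hashlib.sha256(b"EICAR-test-payload").hexdigest(),
}

def scan_backup(files: dict[str, bytes]) -> list[str]:
    """Return paths of files in a backup image whose hashes match
    known-bad signatures. Any hit means the restore point is not clean
    and should not be used for recovery."""
    return [
        path
        for path, blob in files.items()
        if hashlib.sha256(blob).hexdigest() in KNOWN_BAD_SHA256
    ]
```

Running a check like this on every restore point, rather than during an incident, is what turns a backup from a hopeful artifact into a verified last line of defense.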
Modern Cloud Infrastructures Require Modern Cybersecurity Thinking
Using thinking, processes, and tools built for traditional infrastructures leaves organizations open to threats from cybercriminals. This path can be costly: tools that cost too much to deploy and operate, data loss, customer loss, reputational damage, and ransoms paid in the hope of having company data returned.
Cybersecurity in the cloud requires a cloud native approach that acknowledges the differences between cloud and traditional architectures and environments.
About Elastio
Elastio detects and precisely identifies ransomware in your data and assures rapid post-attack recovery. Our data resilience platform protects against cyber attacks when traditional cloud security measures fail.
Elastio’s agentless deep file inspection continuously monitors business-critical data to identify threats and enable quick response to compromises and infected files. Elastio provides best-in-class application protection and recovery and delivers immediate time-to-value.