Misconfigured Storage Services Contributed To More Than 200 Cloud Breaches
Misconfigured storage services in 93 percent of cloud deployments have contributed to more than 200 breaches over the past two years, exposing more than 30 billion records, according to a report from Accurics. The firm predicted that cloud breaches will increase in both velocity and scale.
The researchers found that 91 percent of the cloud deployments analyzed had at least one major exposure that left a security group wide open, while in 50 percent, unprotected credentials were stored in container configuration files. The latter finding is significant because 84 percent of organizations use containers.
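In practice, this kind of exposure can often be surfaced with a simple scan. The sketch below illustrates the idea; the file extensions and regex patterns are illustrative assumptions rather than an exhaustive secrets scanner (the AKIA prefix is the documented format of an AWS access key ID).

```python
import re
from pathlib import Path

# Illustrative patterns only: an AWS access key ID (AKIA + 16 characters)
# and a generic "password: ..." assignment of the kind often found in
# container manifests and .env files.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "plaintext_password": re.compile(r"(?i)\bpassword\s*[:=]\s*\S+"),
}

def scan_configs(root: str) -> None:
    """Flag lines in config files that look like embedded secrets."""
    for path in Path(root).rglob("*"):
        if path.suffix not in {".yml", ".yaml", ".json", ".env"}:
            continue
        for lineno, line in enumerate(
                path.read_text(errors="ignore").splitlines(), 1):
            for name, pattern in PATTERNS.items():
                if pattern.search(line):
                    print(f"{path}:{lineno}: possible {name}")

if __name__ == "__main__":
    scan_configs(".")  # scan the current directory tree
```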
“While the adoption of cloud native infrastructure such as containers, serverless, and service mesh is fueling innovation, misconfigurations are becoming commonplace and creating serious risk exposure for organizations,” said Accurics Co-founder and CTO Om Moolchandani.
Highly privileged private credentials were embedded in code in deployments at 41 percent of the organizations that responded to researchers. And in 100 percent of deployments, an altered routing rule exposed a private subnet containing sensitive resources, such as databases, to the internet.
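A routing exposure of that kind is detectable with a few API calls. Here is a minimal sketch using boto3, the AWS SDK for Python, that flags route tables sending all traffic (0.0.0.0/0) to an internet gateway; it assumes AWS credentials are already configured in the environment.

```python
import boto3

# A minimal sketch: flag route tables that send all traffic (0.0.0.0/0)
# to an internet gateway, which makes every associated subnet
# internet-facing.
ec2 = boto3.client("ec2")

for table in ec2.describe_route_tables()["RouteTables"]:
    for route in table.get("Routes", []):
        if (route.get("DestinationCidrBlock") == "0.0.0.0/0"
                and route.get("GatewayId", "").startswith("igw-")):
            subnets = [a.get("SubnetId")
                       for a in table.get("Associations", [])
                       if a.get("SubnetId")]
            print(f"{table['RouteTableId']} routes to the internet; "
                  f"associated subnets: {subnets or ['main/implicit']}")
```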
Respondents apply automation sparingly, even though a manual approach creates alert fatigue: only six percent of cloud-security risks are being addressed by automated technology, the report found. Hardcoded keys, meanwhile, are present in 72 percent of deployments.
“The high percentage of cloud deployments with network exposure is concerning but not a surprise,” commented Brian Soby, CTO and co-founder of AppOmni.
“In more than 95 percent of [the] risk assessments [AppOmni conducts], we find exposures of highly sensitive data (frequently including insecurely stored credentials) to the public internet or high-risk / low-privilege users such as BPOs or vendor integrations,” Soby said. “So, seeing those statistics closely align isn’t surprising.”
Chris Morales, head of security analytics at Vectra, said the findings were believable.
“Cloud capabilities are developed at a rapid pace and it is near impossible for anyone to keep up with all of those features and capabilities and the impact they have on data access,” Morales said. “Much of the problem is due to a lack of understanding of how cloud configuration works, and of its potential pitfalls, by an industry historically versed in securing access to physical systems.”
While errors and misconfigurations exist in physical data centers, they are hidden behind a layer of controls and segregation from external factors. “In the cloud, we strip that layer away and a few keystrokes can unintentionally take a system from internal only to external facing,” Morales explained.
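A few lines of automation, of the kind the report found in short supply, can catch exactly that misstep. Below is a short sketch, again assuming configured boto3 credentials, that lists security group rules open to the entire internet.

```python
import boto3

# List security group ingress rules that admit the whole internet
# (0.0.0.0/0) -- the "wide open security group" exposure the report
# found in 91 percent of deployments.
ec2 = boto3.client("ec2")

for group in ec2.describe_security_groups()["SecurityGroups"]:
    for perm in group.get("IpPermissions", []):
        for ip_range in perm.get("IpRanges", []):
            if ip_range.get("CidrIp") == "0.0.0.0/0":
                port = perm.get("FromPort", "all")  # absent for all-traffic rules
                print(f"{group['GroupId']} ({group['GroupName']}) "
                      f"is open to the world on port {port}")
```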
Any massive cloud security breach also carries a larger impact footprint, or blast radius.
“I do believe such events will become more and more prevalent as the adoption of public cloud continues, with individuals and companies taking a shortcut approach to meet time-to-market deadlines without executing on the shared security model of the public cloud,” said Rajiv Kanaujia, vice president of operations at CloudCheckr.
Over time, IaaS vendors will make certain areas of security non-negotiable, restricting the success of bad actors, but a lack of awareness or funding to execute on the shared security model of the public cloud will continue to expose customers to such vulnerabilities, Kanaujia said.
“Now, the IaaS consumer (user of the cloud) has a big role to play in configuring and managing these layers,” he said, noting that application developers never had to deal with such responsibilities in the past.
Kanaujia agreed that a better approach is to move toward Infrastructure as Code (IaC), where such configuration changes become transparent to internal teams and go through a stronger change-management process, including peer review. The industry will encourage concepts like encrypted data bags, which will slowly eliminate the need to keep credentials in cleartext anywhere in the system, he added.
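Encrypted data bags are a Chef concept; the underlying pattern, fetching credentials at runtime from an encrypted store rather than committing them in cleartext, can be sketched with any managed secret service. The example below substitutes AWS Secrets Manager as an analogue, with a hypothetical secret name.

```python
import json
import boto3

# Rather than embedding credentials in code or config, fetch them at
# runtime from a managed secret store. The secret name "prod/db" is
# hypothetical, and json.loads assumes the secret was stored as JSON.
def get_db_credentials(secret_id: str = "prod/db") -> dict:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])

creds = get_db_credentials()
# creds might look like {"username": "...", "password": "..."}; no
# credential ever appears in source code or configuration files.
```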