Ensure data security at the edge

Sponsored Feature Securing the corporate network has never been a simple process, but years ago it was at least a bit more straightforward. Back then, the network perimeter was clear and well defined, and everything inside it was considered trusted and safe. The security team defended against everything outside, established security protocols and deployed security tools, monitored the network gateways, and kept sensitive data as safe as possible.

The entire concept of the network perimeter has all but dissolved. Employees are increasingly working at the edge, on all sorts of devices and in virtually any location. Remote work was already well underway prior to the pandemic, but the surge in activity accelerated as COVID-19 took hold and has endured to this day. According to a December 2022 report released by IDG/Foundry, an average of 35 percent of computing resources in the United States now reside at the network edge. Respondents to that survey cited the need to process data from edge devices as a primary objective during 2022, with the share naming it rising from 27 percent to 36 percent over the previous year.

So what once worked – the traditional castle-and-moat approach to security architecture – is no longer effective, as malware, ransomware and insider threats now target workloads in hybrid clouds as well as the physical infrastructure within datacenters. Organizations need a scalable and effective way to secure their data and applications wherever they are stored – whether in their own datacenter, a co-located datacenter, or on hybrid cloud platforms. They also need to ensure security when that data is in transit, when it is being accessed and used at the edge, and throughout the data lifecycle.

In its report – What Edge Computing Means For Infrastructure And Operations Leaders (gartner.com) – industry research firm Gartner predicts that by 2025, 75 percent of data generated by modern enterprises will be created outside of centralized datacenters. Moving that much data to support time-sensitive operations can lead to increased network congestion and disruption and, without careful selection of the right security controls, can increase the likelihood of a breach occurring.

While those challenges exist, moving data efficiently and effectively between the edge and the cloud facilitates flexible, scalable data access and puts insights where and when they're needed. Having immediate access to this data helps the workforce generate actionable insights that better inform strategic decisions. Establishing a seamless connection also helps reduce data migration risks.

“We’ve progressed to a point where compute devices can do a lot more data processing at the edge, outside of the corporate datacenter,” says Simon Leech, Director in HPE’s Cybersecurity Center of Excellence. “In the past, we had to bring the data back into the datacenter. Now with edge computing, we have the capability to do data processing close to the data source.”

Edge computing has enabled use cases such as autonomous driving, security camera deployments, and real-time monitoring and management of industrial controls. “This provides the capability to deliver real-time insights, while reducing the impact on bandwidth,” says Leech.

Keep data safe at the edge

Companies working at the edge, operating out of hybrid cloud platforms, and supporting a fully mobile workforce using a myriad of mobile devices need an equally flexible and powerful answer to their security challenges. The primary data security risks in this environment are centered on three aspects: the decentralized perimeter, cyber hygiene, and physical risks to the hardware itself. “The network perimeter has dissolved. I can’t afford the same security controls on my employees working remotely compared to when they were in the office,” says Leech.

Cyber hygiene is a critical factor. “With edge perimeter devices and Internet of Things (IoT) devices, it’s very difficult to patch these, so they’re always a challenge,” he adds. “Cyber criminals are exploiting IoT devices [for illicit purposes], for example by enrolling them into botnets, or using them as a stepping stone to gain access.”

There are simple physical risks to the hardware just because it’s easier to access. “Some are outdoors, such as a mesh of security cameras around a city. All are connected to a network. Someone could just cut the wire and steal the camera or use the connection for other purposes,” explains Leech.

Many organizations will inevitably look for any assistance they can get in provisioning, managing and securing those IoT networks and devices, particularly in exposed edge locations.

The HPE GreenLake platform can help here, says Leech, by securing data migration between the cloud and the edge and providing application and data access to the workforce wherever they’re located and on whatever device they’re using.

That’s because it operates on three fundamental principles: shared responsibility, the Zero Trust framework, and continuous data protection. In terms of shared responsibility, it comes down to the choice of outsourcing operational responsibilities. “Customers have migrated from cloud never to cloud maybe, and now to cloud always,” says Leech.

As the public cloud has matured and become better defined, thinking has come full circle to the point where the public cloud isn’t always the right answer. Most companies need a hybrid cloud model, says Leech. “That’s what HPE GreenLake is trying to address: providing a public cloud experience on the customers’ own terms in their own datacenter or in a co-located datacenter,” he points out.

Whilst the decision to outsource operations to a service provider is typically made for financial or productivity reasons, it’s important to remember that you can never fully outsource your organizational risk. “Ultimately, the company makes the decision to place workloads with a third party, and has to fully understand the boundaries of responsibility and security,” advises Leech.

Zero Trust extends beyond networking

The Zero Trust framework relates back to the issue of the dissolved network perimeter. A traditional datacenter has a secure network perimeter. With the new “perimeter” no longer as well defined as it was in the past, companies almost have to treat their internal network as untrusted as well.

There are different definitions of the Zero Trust framework throughout the IT industry. While initially developed as a networking concept, Zero Trust now extends beyond networking. “We can extend Zero Trust past the network and all the way into the infrastructure stack,” says Leech. “The whole principle behind Zero Trust is never trust, and always verify. That means you always assume everything has been breached, perform continuous monitoring and checks to ensure that breach has been contained, and reduce your blast radius.”
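As a rough illustration of that "never trust, always verify" principle – not a description of any HPE GreenLake component – the short Python sketch below shows how a Zero Trust policy gate might evaluate every request against identity, device health and least-privilege authorization, rather than trusting anything based on network location. The users, devices and resources are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_id: str
    resource: str
    mfa_verified: bool

# Illustrative policy data: which users may reach which resources,
# and which devices are currently attested as healthy and patched.
AUTHORIZED = {("alice", "billing-api"), ("bob", "telemetry-db")}
HEALTHY_DEVICES = {"laptop-042", "edge-cam-007"}

def authorize(req: Request) -> bool:
    """Zero Trust gate: every request is verified, nothing is trusted
    by default, and any failed check denies access (small blast radius)."""
    if not req.mfa_verified:                        # verify identity explicitly
        return False
    if req.device_id not in HEALTHY_DEVICES:        # verify device posture
        return False
    if (req.user, req.resource) not in AUTHORIZED:  # least-privilege check
        return False
    return True

# Every call is evaluated independently -- there is no
# "inside the perimeter, therefore trusted" shortcut.
print(authorize(Request("alice", "laptop-042", "billing-api", True)))   # True
print(authorize(Request("alice", "laptop-042", "billing-api", False)))  # False
```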

HPE GreenLake handles those functions by creating a chain of trust that extends through all infrastructure layers. HPE has developed a silicon root of trust, based on a hardware-validated boot process, which ensures that its compute systems can only be started using code from an immutable source. This involves an anchor for the boot process rooted in hardware that cannot be updated or modified in any way. Combining this foundation with a cryptographically secured signature leaves no easily accessible gaps for hackers to exploit, and forms a critical foundation for the trusted environment the server boots into.
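To illustrate the general chain-of-trust idea – a conceptual sketch only, not HPE's silicon implementation, and using plain digests where a real root of trust would verify cryptographic signatures – each boot stage below is measured and compared against expected values standing in for an immutable hardware anchor. The first mismatch halts the boot.

```python
import hashlib

# Hypothetical "immutable" reference measurements, standing in for values
# anchored in hardware that cannot be modified after manufacture.
EXPECTED_DIGESTS = {
    "firmware":   hashlib.sha256(b"firmware v1.0").hexdigest(),
    "bootloader": hashlib.sha256(b"bootloader v2.3").hexdigest(),
    "os_kernel":  hashlib.sha256(b"kernel v5.15").hexdigest(),
}

def verify_chain(stages):
    """Verify each boot stage in order; refuse to continue past the first
    stage whose measurement deviates from the anchored expectation."""
    for name in ("firmware", "bootloader", "os_kernel"):
        digest = hashlib.sha256(stages[name]).hexdigest()
        if digest != EXPECTED_DIGESTS[name]:
            print(f"measurement mismatch at {name}: boot halted")
            return False
        print(f"{name} verified")
    return True

# Untampered images boot; a modified bootloader stops the chain.
good = {"firmware": b"firmware v1.0",
        "bootloader": b"bootloader v2.3",
        "os_kernel": b"kernel v5.15"}
bad = dict(good, bootloader=b"bootloader v2.3 + implant")
verify_chain(good)
verify_chain(bad)
```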

“We’ve investigated how to extend that environment to other layers in the stack. We extend that trust into the OS layer and past that into the apps for user access,” says Leech. “We need trust in all those components to extend the integrity verification model.” 

That is how the Zero Trust framework concept extends in the HPE GreenLake model. “We’re working to ensure the secure hardware environment also extends to provide a secure control plane. So in theory, the HPE GreenLake environment is almost self-protecting,” he adds.

The third area is continuous data protection – the idea of continually backing up data so that, if something does happen, users can restore from a backup as close as possible to the point where it’s needed. “The old backup schedule was once a month or once a week,” says Leech. “The idea with continuous data protection is you perform an ongoing backup in real time, so you could jump back just a couple of minutes if you needed to.”

There’s still the enduring challenge of cleaning up a backup copy in the event that ransomware is replicated and remains dormant in the backup data. “The way we handle that is to create an immutable copy, so the data backup itself can’t be changed,” he continues. “We also use journaling to see if anything has changed, which makes it difficult to corrupt the backup environment.”
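A minimal sketch of the journaling idea, assuming a simple key-value dataset: every change is appended to a journal that is never rewritten, so the data can be reconstructed as of any earlier point in time and a pre-infection copy recovered. This is purely illustrative rather than the mechanism HPE GreenLake uses.

```python
import time

class JournaledStore:
    """Append-only journal of (timestamp, key, value) entries. Entries are
    never rewritten, only appended, so the dataset can be reconstructed
    as of any earlier point in time."""

    def __init__(self):
        self._journal = []                    # immutable history: append-only

    def write(self, key, value):
        self._journal.append((time.time(), key, value))

    def restore(self, as_of):
        """Rebuild the dataset as it looked at time `as_of`."""
        state = {}
        for ts, key, value in self._journal:
            if ts <= as_of:
                state[key] = value
        return state

store = JournaledStore()
store.write("invoice-1", "draft")
checkpoint = time.time()                      # a known-good recovery point
time.sleep(0.01)
store.write("invoice-1", "ENCRYPTED-BY-RANSOMWARE")   # later, malicious change
print(store.restore(checkpoint))              # {'invoice-1': 'draft'}
```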

Pay as you go

And finally, the HPE GreenLake platform is built around a consumption-based model. Traditionally, one of the primary barriers to buying physical hardware was the need to pay for it up front. This is the most significant change introduced by the public cloud model: companies don’t have to purchase hardware, they just pay for it as they use it.

“This cloud-like experience is what we wanted with the HPE GreenLake model,” says Leech. “You only pay for what you use. If you use a little bit less, your monthly charge is adjusted accordingly. So it becomes an OpEx instead of a CapEx.”

As HPE monitors what a customer is using, it can appropriately size the environment as it goes along. Concludes Leech: “We always provide the customer with a bit of a buffer. If it turns out they need more processing power, we can adjust.”
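As a back-of-the-envelope illustration of that consumption model (the rates, units and buffer policy here are invented for the example, not HPE pricing), a usage-based charge and a capacity buffer might be worked out along these lines:

```python
# Hypothetical metered usage, sampled daily (fraction of provisioned capacity used).
daily_utilization = [0.42, 0.55, 0.61, 0.38, 0.47]   # one entry per day

PROVISIONED_UNITS = 100          # capacity installed on-site, including headroom
RATE_PER_UNIT_DAY = 1.20         # illustrative $ per unit per day actually consumed
BUFFER = 0.20                    # spare capacity kept above observed peak usage

# OpEx-style charge: pay only for what was consumed, not what was installed.
charge = sum(u * PROVISIONED_UNITS * RATE_PER_UNIT_DAY for u in daily_utilization)
print(f"Usage-based charge for the period: ${charge:.2f}")

# Capacity planning: keep a buffer above peak demand so workloads can grow
# without waiting for new hardware to be procured.
peak_units = max(daily_utilization) * PROVISIONED_UNITS
recommended_capacity = peak_units * (1 + BUFFER)
print(f"Peak usage: {peak_units:.0f} units; keep at least {recommended_capacity:.0f} provisioned")
```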

And since it’s a managed environment, that level of operational responsibility is taken care of: all systems run the latest code, are patched appropriately, and run at the proper scale. This takes a significant burden of operational responsibility off the modern enterprise, so it can focus on its core business functions.

You can read more about how HPE can help organizations adopt a zero trust framework here.

Sponsored by Hewlett Packard Enterprise (HPE).
