Three ways to optimize edge computing in hybrid cloud

Answer these critical questions to ensure greater value and better results when deploying edge computing.

Enterprise efforts to disperse computing resources through hybrid cloud deployments reveal a separate but related strategy: the use of edge computing, in which organizations leverage local data center resources at remote locations or colocation facilities.

Two general principles define edge computing. First, it is distributed, with computation and processing occurring far away from a centralized data center or cloud. Second, it is location-specific, with key computing elements physically placed where the data is created or used.

Why deploy edge computing?

Dave McCarthy, research vice president of IDC's cloud and edge infrastructure services practice, said four business needs generally drive IT leaders to make the leap into edge computing: accessing data faster and reducing latency; improving security and compliance or achieving data sovereignty; controlling costs; and ensuring business continuity or resiliency.

Whether looking to control costs or overcome some of the barriers to data movement, enterprises are increasingly choosing to move processing and computing activities to where the data is generated. Doing so often eliminates the costs associated with consumption models of cloud storage, as data that does not need to be stored can be used for immediate or real-time insights and then discarded.
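The process-then-discard pattern described above can be sketched as follows. This is a minimal illustration, not a real edge platform API: the sensor readings, the alert threshold, and the function names are all assumptions made for the example. The point is that only a small aggregate ever becomes a candidate for cloud storage; the raw stream is dropped after the real-time insight is extracted.

```python
from statistics import mean

# Hypothetical alert threshold for illustration only.
TEMP_ALERT_C = 80.0

def process_at_edge(readings):
    """Derive an immediate insight locally, then let the raw data go.

    Only this compact summary would be forwarded to the cloud, avoiding
    per-reading consumption charges for data with no long-term value.
    """
    alerts = [r for r in readings if r > TEMP_ALERT_C]
    summary = {
        "count": len(readings),
        "mean_temp_c": round(mean(readings), 2),
        "alert_count": len(alerts),
    }
    # The raw readings go out of scope here -- nothing is persisted.
    return summary

summary = process_at_edge([72.5, 79.9, 85.2, 70.1])
print(summary)
```

In a real deployment the readings would arrive from local equipment and the summary would be uploaded on a schedule, but the cost logic is the same: store the aggregate, not the stream.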

Edge computing also ensures that businesses can leverage their valuable data more immediately, rather than simply storing it for use at some point in the future, which McCarthy equates to "putting money into a mattress." Manufacturing or industrial use cases especially require faster data processing when milliseconds are critical to quality or safety outcomes.

McCarthy said that while edge computing should not be viewed as a replacement for the cloud, it is a complementary technology or approach that can address some of the limitations of centralized cloud architectures.

How to optimize edge computing in a hybrid cloud environment

Edge computing architecture consists of multiple layers of infrastructure. For example, a colocation facility can serve as an edge location. Many telecom providers are also creating deployment locations to support edge computing, an example of a provider-managed edge.

Enterprises can also choose to operate their own data centers in retail stores, factories or satellite locations. Regardless of the scenario, proper planning and consideration of these three priorities will ensure that any infrastructure deployed at the edge can be optimized to execute and deliver mission-critical priorities.

1. Determine the appropriate location of edge assets. Optimizing the edge requires understanding exactly where each required application runs best within the organization's security, budget and performance requirements. Many enterprises start with pilot deployments, experimenting with a few different scenarios and then optimizing key elements based on those initial results. The answer isn't always the same, McCarthy suggests.

2. Don’t assume that cloud-native applications will run the same way at the edge. Cloud-native applications often operate differently when they are run on edge computing assets. This may lead to adverse results. Understand whether an application can scale to the edge and if and how it can optimize data flow. What data must be kept on-premises, and what data can be sent to the cloud? "Many vendors now understand the need to complement cloud native," McCarthy said. "This might be called edge native, which uses the same constructs, but maybe only certain elements or functionality within the application will run on the edge rather than in the cloud."
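The on-premises-versus-cloud question raised in point 2 can be sketched as a simple routing rule. The record structure and the `regulated` flag are hypothetical, invented for this example; a real system would classify data against actual compliance or sovereignty policies. The sketch only shows the shape of the decision: some data must stay at the edge site, the rest may flow to the cloud tier.

```python
def route_records(records):
    """Split records into edge-resident and cloud-eligible sets.

    Records flagged as regulated (e.g. subject to data-sovereignty
    rules) stay local; everything else may be forwarded to the cloud.
    """
    keep_local, send_to_cloud = [], []
    for rec in records:
        (keep_local if rec.get("regulated") else send_to_cloud).append(rec)
    return keep_local, send_to_cloud

local, cloud = route_records([
    {"id": 1, "regulated": True},   # must remain on-premises
    {"id": 2, "regulated": False},  # eligible for the cloud tier
])
```

An edge-native application would apply a policy like this continuously, so that only the cloud-eligible portion of the data flow ever leaves the site.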

3. Avoid deploying fragmented components and custom-built solutions. When edge computing first started to gain traction in business, many users "cobbled together their own Frankenstein solutions," McCarthy said. "Bundled solutions now include hardware and software and are more commonly offered as turnkey services."

Turnkey solutions can be deployed horizontally (partners provide the required infrastructure and enterprises add their own applications on top, as with HPE GreenLake) or vertically (with industry-specific applications, such as Microsoft Cloud for Financial Services on Azure).

Ultimately, an enterprise's primary goal when deploying edge computing is to minimize the gap between collecting data and achieving business outcomes, and these latest edge solutions can achieve that goal.

Most of the same infrastructure and equipment commonly found in data centers will be deployed as part of an edge computing environment: servers, storage, connectivity and platforms on which applications run. The main difference is that edge infrastructure will be deployed in smaller configurations. Rather than figuring out how to scale a cluster in a data center, teams need to figure out how to scale to hundreds or even thousands of locations.
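The scaling problem described above, stamping out the same small configuration across hundreds of sites rather than growing one large cluster, can be sketched like this. The stack contents and site-naming scheme are assumptions for illustration; real fleets would be driven by an orchestration or configuration-management tool rather than a hand-rolled loop.

```python
# A deliberately small, identical footprint for every edge location.
BASE_STACK = {"servers": 2, "storage_tb": 4, "apps": ["telemetry", "inference"]}

def render_site_configs(site_ids):
    """Produce one identical, small-footprint config per edge site.

    Scaling the edge means repeating this compact unit across many
    locations, not enlarging a single central cluster.
    """
    return {site: {**BASE_STACK, "site_id": site} for site in site_ids}

# Hypothetical fleet of 300 retail stores.
configs = render_site_configs([f"store-{n:04d}" for n in range(1, 301)])
print(len(configs))
```

The operational challenge then shifts from capacity planning for one cluster to keeping hundreds of identical units consistent, which is why turnkey, centrally managed offerings are attractive at the edge.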

Origin blog.csdn.net/leyang0910/article/details/132955995