5 best use cases for edge computing


Translator | Cui Hao

Planning | Xu Jiecheng

Source | 51CTO technology stack

01

Introduction

If you agree that time equals money, or that time equals safety, or if you face data compliance requirements, edge computing may be your best choice. This article presents five application scenarios for edge computing to help you think about when to architect for the edge.

Edge computing means locating infrastructure close to where data is produced or consumed. Rather than pushing data to a public or private cloud for storage and computation, data is processed in place at the "edge," which can range from simple commodity servers to complex platforms such as AWS for the Edge, Azure Stack Edge, or Google Distributed Cloud.

A second aspect of edge computing concerns the performance, reliability, security, and compliance of operations. To support these requirements, edge computing moves compute, storage, and bandwidth onto edge infrastructure when a centralized cloud architecture cannot deliver them.

"Edge computing offers business leaders a new way to develop deeper relationships with customers and partners and gain real-time insights," said Mark Thiele, CEO of Edgevana.

When a development team is still small and working on an early proof of concept, it can be difficult to identify the optimal infrastructure. As the team grows and the project progresses, the need for edge infrastructure becomes apparent, which can force the team to re-architect or even refactor the application, increasing development costs, slowing progress, and even jeopardizing delivery.

As applications become more modern and integrated, enterprises should consider edge technologies and integrations early in development to head off the performance and security challenges that arise in enterprise-grade applications. Devops teams should be watching for these signals before the platform's infrastructure requirements have been accurately modeled. Here are five reasons to consider architecting for the edge.

02

Improve efficiency and safety

In manufacturing, what is the value of a few seconds when a delay could result in worker injury? And what if manufacturing requires expensive materials, where finding defects a few hundred milliseconds earlier can save a great deal of money?

In manufacturing, effective use of edge computing can reduce waste, increase efficiency, reduce workplace injuries, and increase equipment availability.

A key factor for architects to consider is the cost of a failed or delayed decision. Where the risks or costs are significant, as with manufacturing systems, surgical platforms, or self-driving cars, edge computing can provide the added performance and reliability that safety-critical applications require.
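As a rough illustration (not from the article), the following Python sketch compares an edge-local stop decision with one that waits on a simulated cloud round trip, checking each against a hypothetical 50 ms safety budget; the threshold and timings are assumptions.

```python
import time

# Hypothetical latency budget for a safety-critical stop decision (ms).
SAFETY_BUDGET_MS = 50

def decide_locally(sensor_value: float) -> bool:
    """Edge-local rule: stop the line if the reading exceeds a threshold."""
    return sensor_value > 0.8  # hypothetical defect threshold

def decide_with_cloud_round_trip(sensor_value: float) -> bool:
    """Simulates sending the reading to a central service and waiting."""
    time.sleep(0.120)  # assume roughly 120 ms WAN round trip
    return sensor_value > 0.8

def stop_line_if_needed(sensor_value: float, decide) -> None:
    start = time.perf_counter()
    should_stop = decide(sensor_value)
    elapsed_ms = (time.perf_counter() - start) * 1000
    within_budget = elapsed_ms <= SAFETY_BUDGET_MS
    print(f"stop={should_stop} latency={elapsed_ms:.1f} ms "
          f"within_budget={within_budget}")

if __name__ == "__main__":
    stop_line_if_needed(0.93, decide_locally)                 # edge decision
    stop_line_if_needed(0.93, decide_with_cloud_round_trip)   # cloud decision
```

With these assumed numbers, only the local decision lands inside the budget, which is the point about decision delay.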

03

Reduce latency

Sub-second response times are a baseline requirement for most financial trading platforms, and many other applications now want the same kind of performance: shortening the time from sensing to detecting a problem, and from spotting an opportunity to acting on it. In short, the goal is to keep speeding up the decision cycle.

"If real-time decision-making is important to your business, improving speed or reducing latency is key, especially as businesses use all connected devices to collect data," said Amit Patel, senior vice president at the consulting firm.

Providing low-latency capabilities matters even more when there are thousands of data sources and decision nodes. Examples include connecting thousands of tractors and farm machines and deploying machine learning (ML) on the edge devices, or enabling the metaverse and other large-scale business-to-consumer experiences.
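As one illustration of ML at the edge (a sketch with assumed names and thresholds, not the article's design), an edge node can score readings locally and forward only anomalies plus a compact summary instead of streaming raw data upstream:

```python
from statistics import mean

# Hypothetical edge node: score readings locally and only forward
# anomalies plus a periodic summary, instead of streaming raw data.

ANOMALY_THRESHOLD = 0.9  # assumed model score cutoff

def score(reading: dict) -> float:
    """Stand-in for an on-device ML model; a real node would load a
    quantized model and run inference here."""
    return min(reading["vibration"] / 10.0, 1.0)

def process_batch(readings: list[dict]) -> dict:
    anomalies = []
    scores = []
    for r in readings:
        s = score(r)
        scores.append(s)
        if s >= ANOMALY_THRESHOLD:
            anomalies.append({"device_id": r["device_id"], "score": s})
    # Only this compact payload leaves the edge.
    return {"count": len(readings),
            "mean_score": mean(scores),
            "anomalies": anomalies}

if __name__ == "__main__":
    batch = [{"device_id": f"tractor-{i}", "vibration": v}
             for i, v in enumerate([2.1, 3.4, 9.7, 1.2])]
    print(process_batch(batch))
```

Only the returned summary needs to cross the network, which is what keeps thousands of such nodes from flooding a central service.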

"If you need to take action in real time, start with edge computing," said Pavel Despot, senior product manager at Akamai. "Edge infrastructure suits low-latency, high-resilience, high-throughput scenarios, handling the workloads of users spread across different geographic locations in fields such as streaming media, banking, e-commerce, and Internet of Things devices."

Cody De Arkland, director of developer relations at LaunchDarkly, said that companies with global offices or those that support large-scale hybrid work are a typical example. The value of working at the edge is that workloads can be placed closest to the people who use them, spreading the load across locations. If your application is sensitive to data transfer times, you should consider edge infrastructure and which workloads should run at the edge.
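One simple way to place work near the people using it is to route each user to the nearest edge site. The sketch below does this with great-circle distance; the site names and coordinates are purely illustrative assumptions, and a production system would use measured latency rather than geography.

```python
import math

# Hypothetical routing helper: pick the edge site closest to a user so
# latency-sensitive work runs near the people generating it.

EDGE_SITES = {
    "fra": (50.11, 8.68),    # Frankfurt
    "sin": (1.35, 103.82),   # Singapore
    "iad": (38.95, -77.45),  # Northern Virginia
}

def haversine_km(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_site(user_location: tuple[float, float]) -> str:
    return min(EDGE_SITES, key=lambda s: haversine_km(user_location, EDGE_SITES[s]))

if __name__ == "__main__":
    print(nearest_site((48.85, 2.35)))    # Paris user  -> "fra"
    print(nearest_site((35.68, 139.69)))  # Tokyo user  -> "sin"
```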

04

Improve application reliability

Jeff Ready, CEO of Scale Computing, said the company is seeing strong interest in edge infrastructure from manufacturing, retail, and transportation, industries where equipment downtime is not an option and where real-time access to data, and the ability to act on it, has become an element of competitive differentiation.

Therefore, edge infrastructure should be considered when downtime costs are high, repair times are long, and failures of centralized infrastructure affect multiple businesses.

Ready shared two examples: a cargo ship in the middle of the ocean that cannot rely on an intermittent satellite connection to run its onboard systems, and a grocery store that needs to collect in-store data to create a personalized shopping experience. If a centralized system fails, it can affect many ships and logistics operations at once; highly reliable edge infrastructure reduces the risk and impact of downtime.
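A common pattern for sites with an unreliable uplink is store and forward: act on data locally, buffer outbound records, and sync when the link returns. The following sketch is a minimal illustration with made-up readings and a randomly failing link, not the systems Ready describes.

```python
import json
import random
import time
from collections import deque

# Hypothetical store-and-forward loop for an edge node (e.g. a ship's
# onboard system): decide locally, buffer outbound records, and flush
# the buffer whenever the uplink happens to be available.

outbox: deque[str] = deque()

def link_is_up() -> bool:
    """Stand-in for a real connectivity check; here it fails randomly."""
    return random.random() > 0.6

def handle_reading(reading: dict) -> None:
    # The local decision happens regardless of connectivity.
    if reading["fuel_level"] < 0.1:
        print("local alarm: low fuel")
    outbox.append(json.dumps(reading))

def flush_outbox() -> None:
    while outbox and link_is_up():
        record = outbox.popleft()
        print(f"uploaded: {record}")  # a real system would POST this

if __name__ == "__main__":
    random.seed(7)
    for level in (0.8, 0.5, 0.08):
        handle_reading({"fuel_level": level, "ts": time.time()})
        flush_outbox()
    print(f"still buffered: {len(outbox)}")
```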


05

Local data processing and regulatory support

If performance, latency, and reliability are not primary design considerations, edge infrastructure support may still be required depending on regulations regarding where data is collected and consumed.

According to Yasser Alsaied, vice president of Internet of Things at AWS, edge infrastructure is important for local data processing and data residency requirements. It benefits companies that operate workloads in remote locations, that cannot upload data to the cloud because of connectivity constraints, that must keep data within a specific region under strict governance requirements, or that have large volumes of data that need to be processed locally.

A fundamental question development teams should answer is: where will the data be collected and consumed? Compliance teams should provide guidance on data regulations, and operational leadership should be consulted on physical and geographic constraints.
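Once those answers exist, they can be encoded as a residency check that runs at the edge before any record leaves a site. The sketch below is hypothetical; the data classes and region names are invented for illustration.

```python
# Hypothetical residency guard: before a record leaves an edge site,
# verify that the destination region is allowed for that data class.

RESIDENCY_POLICY = {
    "patient_record": {"eu-west-1"},                   # must stay in-region
    "telemetry_summary": {"eu-west-1", "us-east-1"},   # may be replicated
}

class ResidencyViolation(Exception):
    pass

def check_transfer(data_class: str, destination_region: str) -> None:
    allowed = RESIDENCY_POLICY.get(data_class, set())
    if destination_region not in allowed:
        raise ResidencyViolation(
            f"{data_class} may not be sent to {destination_region}; "
            f"allowed regions: {sorted(allowed) or 'none'}")

if __name__ == "__main__":
    check_transfer("telemetry_summary", "us-east-1")  # passes silently
    try:
        check_transfer("patient_record", "us-east-1")
    except ResidencyViolation as err:
        print(f"blocked: {err}")
```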

06

Cost-optimized bandwidth for large datasets

Smart buildings with video surveillance, facility management systems, and energy tracking systems capture vast amounts of data every second. It is much easier to process this data locally in the building than centrally in the cloud.
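To see why, the sketch below compares the upstream bytes needed to ship one minute of raw once-per-second readings against a single per-minute aggregate computed in the building; the sensor values and sampling rate are assumptions for illustration.

```python
import json
import random

# Hypothetical bandwidth comparison for a building energy sensor sampled
# once per second: raw readings versus one edge-computed aggregate per minute.

def raw_payload(readings: list[float]) -> bytes:
    return json.dumps([{"t": i, "kw": r} for i, r in enumerate(readings)]).encode()

def aggregated_payload(readings: list[float]) -> bytes:
    return json.dumps({
        "samples": len(readings),
        "min_kw": min(readings),
        "max_kw": max(readings),
        "avg_kw": sum(readings) / len(readings),
    }).encode()

if __name__ == "__main__":
    random.seed(1)
    one_minute = [random.uniform(40.0, 60.0) for _ in range(60)]
    raw = len(raw_payload(one_minute))
    agg = len(aggregated_payload(one_minute))
    print(f"raw: {raw} bytes/min, aggregated: {agg} bytes/min "
          f"({raw / agg:.0f}x less upstream traffic)")
```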

According to JB Baker, vice president of marketing at ScaleFlux, all industries are experiencing a surge in data, and adapting to this complexity requires a completely different way of thinking to harness the potential of huge data sets. Edge computing is part of the solution as it brings computation and storage closer to the origin of data.

AB Periasamy, CEO and co-founder of MinIO, offers this advice: "Data generated at the edge of the network creates unique challenges for application and infrastructure architecture. Bandwidth is the most costly item to account for in the model, and capital and operating expenses work differently at the edge."

In conclusion, modeling edge infrastructure early in development lets the team consider smarter architectures once it is clear that an application needs advantages in performance, reliability, latency, security, governance, or scale.

Original link:

https://www.infoworld.com/article/3683290/when-to-architect-for-the-edge.html

