Edge Computing

I recently ran into this concept at work, so I am collecting some reference material here.

Overview

Edge computing pushes applications, data and computing power (services) away from centralized points to the logical extremes of a network. Edge computing takes advantage of microservices architectures to allow some portion of applications to be moved to the edge of the network. While Content Delivery Networks have moved fragments of information across distributed networks of servers and data stores, which may spread over a vast area, Edge Computing moves fragments of application logic out to the edge. As a technological paradigm, edge computing may be architecturally organized as peer-to-peer computing, autonomic (self-healing) computing, grid computing, and by other names implying non-centralized availability.

To ensure acceptable performance of widely dispersed distributed services, large organizations typically implement edge computing by deploying server farms with clustering and large-scale storage networks. Once available only to very large corporate and government organizations, edge computing has spread as technology advances and cost reductions from large-scale implementations have made it accessible to small and medium-sized businesses.[7] Small, low-cost cluster hardware and freely available cluster management software have increased accessibility.

Edge computing targets any application or general functionality that needs to be closer to the source of the action, where distributed systems technology interacts with the physical world. Edge computing does not need contact with any centralized cloud; it uses the same or a similar distributed systems architecture as centralized clouds, but closer to or directly at the edge.

Edge computing imposes certain limitations on the choices of technology platforms, applications or services, all of which need to be specifically developed or configured for edge computing.[8]

Advantages

Possible advantages of edge computing are:

  1. Edge application services significantly decrease the volumes of data that must be moved, the consequent traffic, and the distance the data must travel, thereby reducing transmission costs, shrinking latency, and improving quality of service (QoS).
  2. Edge computing eliminates, or at least de-emphasizes, the core computing environment, limiting or removing a major bottleneck and a potential single point of failure.
  3. The ability to ride the same cost curves and improvements by exploiting the same architecture and fundamental underlying computing technologies as other clouds, whether centralized fee-for-service clouds or closed private clouds (which are also centralized). Cost-accounting models based on how shared resources are billed in fee-for-service clouds (timesharing), often expressed by the phrase "as a Service", should not be confused with the common architectural basis of centralized clouds, edge clouds and, increasingly, edge nodes as well. Ultimately all IT systems, distributed or not, must provide viable services regardless of how or where they are implemented. Clouds, however, share a common distributed-system architecture and technology, forming three modes defined by distance from the edge: centralized clouds, edge clouds, and edge nodes, which taken collectively are also known as fog computing.

ISO/IEC 20248 provides a method whereby the data of objects identified by edge computing using Automated Identification Data Carriers (AIDC), a barcode and/or RFID tag, can be read, interpreted, verified and made available to the "Fog" and the "Edge" even when the AIDC tag has moved on.

Challenges

  1. Edge computing requires applications to be built for horizontal scalability. A common recommendation is to build applications that follow the 12-factor application guidelines (a minimal sketch follows this list).[9]
  2. Edge computing requires operations to be able to deploy to a distributed set of edge nodes, coordinate cross-node state and storage, or handle inconsistent state gracefully.
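As a rough sketch of the first point, the example below (Python, with illustrative variable names that are assumptions rather than anything from the cited guidelines) shows one way an edge service can follow the 12-factor style: every node-specific setting comes from the environment and the handler keeps no local state, so the same artifact can be deployed unchanged to any edge node.

```python
import os

# 12-factor style: node-specific settings come from the environment, so the
# identical artifact can be rolled out to every edge node without rebuilds.
# All names and defaults below are illustrative assumptions.
CENTRAL_ENDPOINT = os.environ.get("CENTRAL_ENDPOINT", "https://central.example.com/ingest")
NODE_ID = os.environ.get("EDGE_NODE_ID", "edge-unknown")
BATCH_SIZE = int(os.environ.get("BATCH_SIZE", "100"))

def handle_reading(reading: dict) -> dict:
    """Stateless handler: the result depends only on the input, never on
    node-local state, so any replica on any node can process any reading."""
    return {"node": NODE_ID, "value": reading.get("value"), "unit": reading.get("unit")}
```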

Reference: https://en.wikipedia.org/wiki/Edge_computing

Edge computing allows data produced by internet of things (IoT) devices to be processed closer to where it is created instead of sending it across long routes to data centers or clouds.

Doing this computing closer to the edge of the network lets organizations analyze important data in near real-time – a need of organizations across many industries, including manufacturing, health care, telecommunications and finance.

“In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that’s just not realistic,” says Helder Antunes, senior director of corporate strategic innovation at Cisco.

What exactly is edge computing?

Edge computing is a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet,” according to research firm IDC.

It is typically referred to in IoT use cases, where edge devices would collect data – sometimes massive amounts of it – and send it all to a data center or cloud for processing. Edge computing triages the data locally so some of it is processed locally, reducing the backhaul traffic to the central repository.

Typically, this is done by the IoT devices transferring the data to a local device that includes compute, storage and network connectivity in a small form factor. Data is processed at the edge, and all or a portion of it is sent to the central processing or storage repository in a corporate data center, co-location facility or IaaS cloud.
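As a minimal illustration of that triage step (the temperature payload and threshold below are purely hypothetical), an edge node might decide per reading whether it is handled locally or queued for backhaul:

```python
TEMP_ALERT_THRESHOLD = 80.0   # illustrative threshold, not from the article

def triage(reading: dict) -> str:
    """Decide at the edge how each reading is handled: most are processed
    and kept locally; only significant readings are queued for backhaul."""
    if reading["temperature_c"] >= TEMP_ALERT_THRESHOLD:
        return "forward"   # send on to the corporate data center or IaaS cloud
    return "local"         # aggregate, log, or discard on the edge device

readings = [{"temperature_c": t} for t in (21.5, 22.0, 85.3, 23.1)]
upstream = [r for r in readings if triage(r) == "forward"]
print(f"{len(upstream)} of {len(readings)} readings leave the site")
```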

Why does edge computing matter? 

Edge computing deployments are ideal in a variety of circumstances. One is when IoT devices have poor connectivity and it is not efficient for them to be constantly connected to a central cloud.

Other use cases have to do with latency-sensitive processing of information. Edge computing reduces latency because data does not have to traverse a network to reach a data center or cloud for processing. This is ideal for situations where latencies of milliseconds can be untenable, such as in financial services or manufacturing.

Here’s an example of an edge computing deployment: An oil rig in the ocean that has thousands of sensors producing large amounts of data, most of which could be inconsequential; perhaps it is data that confirms systems are working properly.

That data doesn’t necessarily need to be sent over a network as soon as it’s produced, so instead the local edge computing system compiles the data and sends daily reports to a central data center or cloud for long-term storage. By only sending important data over the network, the edge computing system reduces the data traversing the network.
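A minimal sketch of that pattern (hypothetical field names, once-per-second readings assumed) shows how a day of routine data can collapse into one small record before it ever touches the network:

```python
from statistics import mean

def build_daily_report(readings):
    """Collapse a day of routine sensor readings into a single summary record,
    so only the summary, not the raw stream, is sent for long-term storage."""
    values = [r["value"] for r in readings]
    return {"count": len(values), "min": min(values), "max": max(values), "mean": round(mean(values), 3)}

# Example: 86,400 once-per-second readings become one small daily report.
readings = [{"value": 20.0 + (i % 10) * 0.1} for i in range(86_400)]
print(build_daily_report(readings))
```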

Another use case for edge computing has been the buildout of next-gen 5G cellular networks by telecommunication companies. Kelly Quinn, research manager at IDC who studies edge computing, predicts that as telecom providers build 5G into their wireless networks they will increasingly add micro-data centers that are either integrated into or located adjacent to 5G towers. Business customers would be able to own or rent space in these micro-data centers to do edge computing, then have direct access to a gateway into the telecom provider’s broader network, which could connect to a public IaaS cloud provider.

Edge vs. Fog computing

As the edge computing market takes shape, there’s an important term related to edge that is catching on: fog computing.

Fog refers to the network connections between edge devices and the cloud. Edge, on the other hand, refers more specifically to the computational processes being done close to the edge devices. So, fog includes edge computing, but fog would also incorporate the network needed to get processed data to its final destination.

Backers of the OpenFog Consortium, an organization headed by Cisco, Intel, Microsoft, Dell EMC and academic institutions like Princeton and Purdue universities, are developing reference architectures for fog and edge computing deployments.

Some have predicted that edge computing could displace the cloud. But Mung Chiang, dean of Purdue University’s School of Engineering and co-chair of the OpenFog Consortium, believes that no single computing domain will dominate; rather there will be a continuum. Edge and fog computing are useful when real-time analysis of field data is required.

Edge computing security

There are two sides of the edge computing security coin. Some argue that security is theoretically better in an edge computing environment because data does not travel over a network and stays closer to where it was created. The less data in a corporate data center or cloud environment, the less data there is to be vulnerable if one of those environments is compromised.

The flip side is that some believe edge computing is inherently less secure because the edge devices themselves can be more vulnerable. In designing any edge or fog computing deployment, therefore, security must be a paramount concern. Data encryption, access control and use of virtual private network tunneling are important elements in protecting edge computing systems.
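As one hedged illustration of the first two of those elements (the endpoint URL and token below are placeholders, and only the Python standard library is used), an edge node might upload its data over verified TLS with a bearer token; VPN tunneling would normally be handled at the network layer rather than in application code.

```python
import json
import ssl
import urllib.request

CENTRAL_URL = "https://central.example.com/reports"   # hypothetical endpoint
ACCESS_TOKEN = "replace-with-a-real-token"            # illustrative placeholder

def upload_report(report: dict) -> None:
    """Encrypt in transit (TLS with certificate verification) and
    authenticate the edge node to the central repository with a token."""
    context = ssl.create_default_context()            # verifies the server certificate
    request = urllib.request.Request(
        CENTRAL_URL,
        data=json.dumps(report).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {ACCESS_TOKEN}",
        },
    )
    urllib.request.urlopen(request, context=context)  # in practice: retries and error handling
```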

Edge computing terms and definitions

Like most technology areas, edge computing has its own lexicon. Here are brief definitions of some of the more commonly used terms:

  • Edge devices: Any device that produces or collects data, such as sensors or industrial machines.
  • Edge: What the edge is depends on the use case. In a telecommunications field, perhaps the edge is a cell phone or maybe it’s a cell tower. In an automotive scenario, the edge of the network could be a car. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop.
  • Edge gateway: A gateway is the buffer between where edge computing processing is done and the broader fog network. The gateway is the window into the larger environment beyond the edge of the network.
  • Fat client: Software that can do some data processing in edge devices. This is opposed to a thin client, which would merely transfer data.
  • Edge computing equipment: Edge computing uses a range of existing and new equipment. Many devices, sensors and machines can be outfitted to work in an edge computing environment simply by making them Internet-accessible. Cisco and other hardware vendors offer lines of ruggedized network equipment with hardened exteriors meant to be used in field environments. A range of compute servers, converged systems and even storage-based hardware systems like Amazon Web Services’ Snowball can be used in edge computing deployments.
  • Mobile edge computing: This refers to the buildout of edge computing systems in telecommunications systems, particularly 5G scenarios.

Reference: https://www.networkworld.com/article/3224893/internet-of-things/what-is-edge-computing-and-how-it-s-changing-the-network.html

Reposted from blog.csdn.net/weixin_42631919/article/details/81481772