What is Serverless Architecture?

As time goes by, the serverless architecture has become more and more popular. With its extreme elasticity, pay-as-you-go billing, and low operation and maintenance costs, it is playing an increasingly important role in many fields. Machine learning has likewise grown rapidly in recent years and is being applied in more and more industries. In practice, however, machine learning projects have always faced two major challenges:

  • High resource occupancy with low utilization: especially in projects with large gaps between traffic peaks and troughs, resource waste is significant;
  • High complexity of deployment, updates, and maintenance.

Serverless offers extreme elasticity, pay-as-you-go billing, and low operation and maintenance costs. Applying the serverless architecture to machine learning projects, so as to reduce costs and improve resource utilization while preserving performance, is therefore a topic well worth researching and exploring. Based on this, four experts from Alibaba Cloud and Ant Group, Liu Yu, Tian Chudong, Lu Mengkai, and Wang Renda (in no particular order), have systematically organized Alibaba's AI experience under the serverless architecture and jointly launched a new book, **"AI Application Development under Serverless Architecture: Introduction, Practice and Performance Optimization"**, officially produced by Alibaba Cloud.

The new book will be serialized for free on the Serverless WeChat official account (serverlessdevs). Everyone is welcome to subscribe and follow!

Foreword

This book is a hands-on technical guide to machine learning under the serverless architecture. Through a basic introduction to the serverless architecture, a summary of project development experience, and a study of common machine learning algorithms, models, and frameworks, it explores how machine learning projects can be applied to the serverless architecture, how different machine learning projects can be combined with it, and how machine learning applications can be developed on top of it.

We hope to introduce readers to the basics of serverless architecture and machine learning through clear language, real-world examples, and open source code. We also hope that through this book readers can truly understand the value of combining the serverless architecture with machine learning, and can successfully develop and launch machine learning projects on the serverless architecture, so as to obtain the technical dividends of cloud computing more directly.

This book contains not only basic theoretical knowledge but also a great deal of experience sharing and practical applications of the latest technologies, including but not limited to getting started with GPU instances under the serverless architecture, multi-dimensional cold start optimization solutions, and the multi-mode debugging capabilities of the serverless architecture.

We hope that through this book readers can gain a more comprehensive and intuitive understanding of the serverless architecture and a deeper understanding of machine learning under it. At the same time, we hope this book can help readers implement machine learning projects under the serverless architecture and obtain the technical dividends of cloud computing.

  • **Chapter 1:** Introduces the foundations of the serverless architecture, including its development, advantages, and challenges.
  • **Chapter 2:** Introduces application development under the serverless architecture from the perspectives of the development process, a comparison with the ServerFul development process, and the migration of traditional frameworks.
  • **Chapter 3:** Introduces explorations related to machine learning, including the study of algorithms and models such as support vector machines and neural networks.
  • **Chapter 4:** Introduces common machine learning frameworks and their applications in real projects, so that readers can understand these frameworks and how to deploy them to serverless architectures.
  • **Chapter 5:** Presents hands-on projects in fields where machine learning is widely used, including image recognition, sentiment analysis, and model upgrade and iteration, involving container images, reserved instances, GPU instances, and many other new features of serverless architectures.
  • **Chapters 6 and 7:** Introduce two complete cases combining the serverless architecture and AI, from project background to module design, development, and deployment, explaining the whole process of starting, developing, and maintaining a machine learning project under the serverless architecture.
  • **Chapter 8:** Shares development experience for the serverless architecture and summarizes application tuning methods, including cold start optimization solutions and development considerations.

Chapter 1 Introduction to Serverless Architecture

Through an exploration of the concept of the serverless architecture, an analysis of its advantages and values as well as its challenges and dilemmas, and a discussion of its application scenarios, this chapter introduces readers to the basics of the serverless architecture. After studying this chapter, readers will have a basic understanding of its theoretical foundations.

Concepts of Serverless Architecture

With the development of cloud services, computing resources have become increasingly abstracted. From physical machines to cloud servers to container services, computing resources have become progressively more fine-grained.

In 2012, Iron.io VP Ken Fromm first proposed the concept of serverless in the article "Why The Future of Software and Apps is Serverless", pointing out that "even though cloud computing has gradually emerged, everyone is still revolving around servers. This won't last long, though: cloud applications are moving toward serverless, which will have a major impact on application creation and distribution."

In 2019, UC Berkeley published the paper "Cloud Programming Simplified: A Berkeley View on Serverless Computing", in which the authors sharply assert that new BaaS storage services will be invented to expand the types of applications that can run on serverless computing; that such storage will match the performance of local block storage while offering both temporary and persistent options; that the price of serverless computing will be lower than, or at least no higher than, that of ServerFul computing; and that once serverless computing achieves a technological breakthrough, it will lead to a decline in ServerFul services, become the default computing paradigm of the cloud era, and replace ServerFul computing, which also means the end of the server-client model.

The serverless architecture first entered the public eye in 2012 and by 2019 had become the protagonist of UC Berkeley's sharp assertions about the field of cloud computing, completing the transition from a "new point of view" to a "high-profile architecture". In those seven years, the serverless architecture went from little known to commercialized, with leading cloud vendors adopting it as part of their cloud computing strategy, gradually becoming a well-known new technology paradigm.

Of course, over those seven years, serverless has not only been gradually upgraded and improved as a technical architecture; its concept has also become clearer and its development direction more definite. Regarding the definition of serverless, the article "Serverless Architectures" published on Martin Fowler's website points out that serverless is essentially a combination of BaaS and FaaS.

This straightforward definition lays the foundation for what constitutes a serverless architecture. As shown in Figure 1-1, the article argues that in the serverless architecture, part of the application's server-side logic is still written by the developer, but unlike in the traditional architecture, it runs in stateless compute containers, is event-driven, has a very short life cycle (possibly a single invocation), and is entirely managed by a third party. This model is called Functions as a Service (FaaS).

In addition, the serverless architecture also includes applications that rely on third-party (cloud) applications or services to manage server-side logic and state. These are usually rich client applications (single-page applications or mobile applications) built on top of the cloud service ecosystem, including databases (Parse, Firebase), account systems (Auth0, AWS Cognito), and so on. These services were originally called Backend as a Service (BaaS).
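To make the FaaS side of this definition concrete, the sketch below shows what a minimal event-driven function might look like. It is a generic illustration rather than the API of any specific platform; the `handler(event, context)` signature and the field names in `event` are assumptions, although most FaaS platforms follow a very similar shape.

```python
import json

# A minimal, hypothetical FaaS handler: stateless, event-driven,
# and alive only for the duration of a single invocation.
# The (event, context) signature mirrors common FaaS platforms,
# but the field names here are illustrative assumptions.
def handler(event, context):
    # The platform passes the triggering event (e.g. an HTTP request
    # or a message) as a payload; no local state survives between calls.
    body = json.loads(event) if isinstance(event, (str, bytes)) else event
    name = body.get("name", "world")

    # Any persistent state would live in BaaS services (object storage,
    # databases, auth), not in this container.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```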

Figure 1-1 Serverless architecture composition

The CNCF likewise regards serverless as a combination of FaaS and BaaS, and further refines the definition in the CNCF WG-Serverless Whitepaper v1.0: serverless refers to the concept of building and running applications that do not require server management; it describes a more fine-grained deployment model in which an application is packaged as one or more functional modules, uploaded to a platform, and then executed, scaled, and billed in response to the exact demand of the moment.

Meanwhile, the 2019 UC Berkeley paper "Cloud Programming Simplified: A Berkeley View on Serverless Computing" supplements the definition from the perspective of the characteristics of the serverless architecture: simply put, Serverless = FaaS + BaaS, and it must exhibit elastic scaling and pay-as-you-go billing.

The concept of serverless is also described in the "Cloud Native Development White Paper (2020)" issued by the Cloud Native Industry Alliance of the China Academy of Information and Communications Technology: the core idea of the serverless architecture is to abstract the infrastructure that provides service resources into various services, offered to users as API interfaces that can be called on demand, so as to truly achieve on-demand scaling and usage-based billing.

This architectural system removes the need for the large numbers of continuously online servers required by traditional architectures, reduces the complexity of development, operation, and maintenance, lowers operating costs, and shortens the delivery cycle of business systems, enabling users to focus on business logic development, where more of the value lies.

So far, the definition of the serverless architecture in terms of structure, behavior, and characteristics can be summarized as shown in Figure 1-2.

Figure 1-2 Defining the serverless architecture from different angles

Features of Serverless Architecture

As we all know, everything has two sides. Cloud computing has made great progress, but as its latest product, the serverless architecture still faces challenges that cannot be ignored behind its huge advantages.

Advantages and Values

Fei Lianghong, chief cloud computing technology consultant at Amazon AWS, once said that when most companies develop applications and deploy them on servers, whether on a public cloud or in a private data center, they need to know in advance how many servers, how much storage, and which database capabilities they will need, and they need to deploy and run the applications and their dependencies on that infrastructure.

Assuming you do not want to spend effort on these details, is there a simple architecture that can meet this need? Today, as the serverless architecture gradually "enters the homes of ordinary people", the answer is already obvious.

When launching a project, we generally need to apply for host resources, and we have to spend a lot of time and energy estimating the peak overhead. Even if we provision resources for some services according to the maximum expected consumption, dedicated staff are still needed to scale resources out or in at different times of day to balance business stability against cost savings.

For some services, the requested resources sometimes have to be sized for the maximum overhead, even though there may be long traffic troughs and a great deal of resource waste; this simply has to be done, because applications such as databases are difficult to scale, and "wasting resources is still better than the application becoming unavailable at peak time due to insufficient resources".

As Fei Lianghong said, the serverless architecture solves this problem relatively well. There is no need to plan how many resources to use; instead, resources are requested according to actual needs, and you pay according to the duration of use and the computing resources requested for each invocation. Because the billing granularity is much finer, resource costs are easier to reduce.
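As a rough illustration of how finer billing granularity changes the cost picture, the sketch below compares an always-on server billed by the month with a function billed per invocation by memory size and duration. All prices and workload numbers are made-up assumptions for illustration; real pricing differs by vendor and region.

```python
# Hypothetical cost comparison: always-on server vs. per-invocation billing.
# All numbers below are illustrative assumptions, not real vendor prices.

MONTHLY_SERVER_COST = 100.0          # fixed cost of an always-on cloud host (USD/month)

PRICE_PER_GB_SECOND = 0.0000167      # assumed FaaS compute price (USD per GB-second)
PRICE_PER_MILLION_REQUESTS = 0.20    # assumed FaaS request price (USD per 1M invocations)

def faas_monthly_cost(invocations: int, avg_duration_s: float, memory_gb: float) -> float:
    """Estimate a month's FaaS bill from usage, under the assumed prices above."""
    compute = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    requests = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    return compute + requests

if __name__ == "__main__":
    # A spiky workload: 3 million invocations/month, 200 ms each, 512 MB memory.
    cost = faas_monthly_cost(invocations=3_000_000, avg_duration_s=0.2, memory_gb=0.5)
    print(f"FaaS (pay-as-you-go): ~${cost:.2f}/month")
    print(f"Always-on server:     ~${MONTHLY_SERVER_COST:.2f}/month")
    # With these assumptions the function bill is a few dollars, because idle
    # time between invocations is simply not billed.
```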

Serverless architecture has 6 potential advantages:

  • Unlimited computing resources on demand.
  • Eliminate the upfront commitment of cloud users.
  • Pay for the ability to use computing resources on a short-term basis as needed.
  • Massive cost reduction.
  • Simplify operations and improve utilization with resource virtualization.
  • Improve hardware utilization by multiplexing workloads from different organizations.

Compared with the traditional architecture, the serverless architecture does have the advantages of business focus, elastic scaling, and pay-as-you-go. These advantages are often an important reference for developers in technology selection.

1. Business focus

The so-called business focus means allowing developers to concentrate on their own business logic rather than on the underlying resources.

As we all know, in the era of the monolithic architecture, applications were relatively simple and the resources of physical servers were sufficient to support business deployment. As business complexity soared, functional modules became complex and huge, and the monolithic architecture seriously hindered development and deployment efficiency. As a result, business functions were decoupled, and microservice architectures, in which separate modules can be developed and deployed in parallel, gradually became popular. This more refined management of the business in turn drove improvements in the utilization of underlying resources.

As shown in Figure 1-3, as virtualization technology was continuously improved and widely adopted, it broke down the boundaries between physical resources and reduced the burden of managing infrastructure. Containers and PaaS platforms abstract things further, providing the services an application depends on, its runtime environment, and the underlying computing resources, which improves the overall efficiency of application development, deployment, operation, and maintenance. The serverless architecture abstracts computing even more thoroughly, entrusting the management of every resource in the application architecture stack to the platform and eliminating infrastructure operation and maintenance, so that users can focus on high-value business logic.

Figure 1-3 Evolution from virtual machines to containers to the serverless architecture

2. Elastic scaling

So-called elastic scaling means automatically allocating and destroying resources according to fluctuations in business traffic, so as to balance stability and high performance while improving resource utilization.

As we all know, in the progression from IaaS to PaaS to SaaS, "de-serverization" has become more and more apparent, and with the serverless architecture it has reached a new level. Compared with ServerFul, serverless emphasizes a "Noserver" mindset for business users.

So-called Noserver does not mean that there are no servers or that servers are not needed; it means removing the concern and worry about the running status of servers. Operations such as scaling servers out and in no longer require attention from staff, and are instead handed over to the cloud vendor to manage. As shown in Figure 1-4, the broken line is the traffic trend of a website over one day.

Figure 1-4 Traffic and load under the traditional cloud-host architecture versus the serverless architecture in elastic mode

The analysis of Figure 1-4a is as follows:

  • The technicians need to evaluate the resource usage of the website. The evaluation concludes that the peak traffic of this website is 800 PV/hour, so a corresponding cloud server is purchased.
  • However, at 10:00 that day, the operation and maintenance staff notice that traffic suddenly increases and gradually approaches 800 PV/hour. They purchase a new cloud host online, configure the environment, and finally add the corresponding policy to the master machine, getting through the traffic peak from 10:00 to 15:00.
  • After 15:00, the operation and maintenance staff see that traffic has returned to normal, stop the additional cloud host, and release the extra resources.
  • At 18:00, overload traffic arrives again...

It can be clearly seen from Figure 1-4b that the load capacity always matches the traffic (admittedly, the figure is slightly idealized: the real load capacity would be somewhat higher than the current traffic). In other words, there is no need for technicians to intervene to cope with traffic peaks and troughs as in the traditional cloud-host architecture; the elastic capabilities (both scaling out and scaling in) are provided by the cloud vendor.

From the analysis of Figure 1-4, it is not difficult to see that the elasticity of the serverless architecture comes from the vendor's operation and maintenance technical support to a certain extent.

The serverless architecture advocates "to give more professional things to more professional people, so that developers can focus more on their own business logic", which is also a very intuitive embodiment of the elastic mode.
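To give a feel for what the platform is doing on the developer's behalf, the sketch below shows one simple way an autoscaler could derive the number of function instances from the current request concurrency. This is a conceptual illustration only; real platforms use their own, far more sophisticated scheduling policies, and every name and threshold here is an assumption.

```python
import math

# A toy autoscaling rule: keep enough instances so that each one handles
# at most PER_INSTANCE_CONCURRENCY requests at a time. The point is only
# that the scaling decision moves from the operator to the platform.
PER_INSTANCE_CONCURRENCY = 10
MIN_INSTANCES = 0          # scale to zero when there is no traffic
MAX_INSTANCES = 1000       # an assumed platform-side quota

def desired_instances(concurrent_requests: int) -> int:
    needed = math.ceil(concurrent_requests / PER_INSTANCE_CONCURRENCY)
    return max(MIN_INSTANCES, min(MAX_INSTANCES, needed))

if __name__ == "__main__":
    for load in (0, 7, 80, 800):   # e.g. requests in flight at different hours
        print(f"{load:>4} concurrent requests -> {desired_instances(load)} instances")
```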

3. Pay as you go

So-called pay-as-you-go means that the serverless architecture allows users to pay according to actual resource usage, which maximizes the efficiency of resource usage on the user side and reduces costs.

Under the traditional cloud-host architecture, once a server is purchased and running, it keeps consuming resources and incurring costs. The available resources of each server are limited and usually fixed, while the server's load, and therefore its resource utilization, differs from moment to moment; as a result, the traditional cloud-host architecture inevitably produces a certain amount of resource waste.

Under normal circumstances, resource utilization is relatively high during the day, so less is wasted; at night utilization is relatively low, so more is wasted. According to statistics cited by Forbes, typical servers in commercial and enterprise data centers deliver only 5% to 15% of their maximum processing capacity on average, which supports this analysis of resource utilization and waste under the traditional cloud-host architecture.

The serverless architecture allows users to delegate the management of servers, databases, and applications, and even part of the logic, to the service provider. On the one hand, this reduces the user's maintenance burden; on the other hand, users pay only at the granularity they actually use.

For service providers, this allows idle resources to be scheduled more effectively, which is beneficial both in terms of cost and from the perspective of "green" computing.

Figure 1-5 Comparison of traffic and expenses under the traditional cloud-host architecture and the serverless architecture in elastic mode

As shown in Figure 1-5, the broken line is the traffic curve of a website over one day.

Figure 1-5a is a schematic diagram of traffic and expenses under the traditional cloud-host architecture. Usually, resource usage must be estimated before a business goes live. After evaluating the website's resource usage, the staff purchase a server that can withstand at most 1300 PV per hour.

Over a whole day, the total amount of computing power provided by this server is the shaded area, and the cost incurred corresponds to that entire area. But clearly, the truly effective resource usage and cost is only the area under the traffic curve; the shaded part above the traffic curve represents wasted resources and extra expenditure.

Figure 1-5b is a schematic diagram of expenses in the elastic mode of the serverless architecture. It can be clearly seen that cost is basically proportional to traffic: when traffic is low, resource usage is relatively small and so is the cost; when traffic is high, resource usage and expenses rise with it.

Over the whole day, it can be clearly seen that the serverless architecture does not produce the obvious resource waste and extra expenditure that the traditional cloud-host architecture does.

From the analysis of Figure 1-5, it is not difficult to see that combining the elastic scaling capability of the serverless architecture with the pay-as-you-go model minimizes resource waste and reduces business costs.
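The shaded-area argument in Figure 1-5 can also be expressed numerically. The sketch below approximates a day's traffic with a made-up hourly series and compares the fixed capacity that must be provisioned for the peak against the capacity actually consumed; the traffic numbers and the per-unit cost are assumptions used purely to reproduce the shape of the argument.

```python
# Reproduce the "shaded area" comparison from Figure 1-5 with made-up numbers.
# hourly_pv: an assumed 24-hour traffic curve (page views per hour).
hourly_pv = [
    100, 80, 60, 50, 50, 80,          # night trough
    200, 400, 700, 1000, 1200, 1300,  # morning ramp up to the peak
    1100, 900, 800, 700, 900, 1200,   # afternoon and evening
    1000, 700, 500, 300, 200, 150,    # wind-down
]

COST_PER_PV_CAPACITY = 0.001   # assumed cost per unit of provisioned capacity

peak = max(hourly_pv)                  # traditional host sized for the peak
provisioned = peak * len(hourly_pv)    # fixed capacity x 24 hours (whole rectangle)
consumed = sum(hourly_pv)              # area under the traffic curve

print(f"Provisioned capacity (peak-sized host): {provisioned} PV, "
      f"cost ~{provisioned * COST_PER_PV_CAPACITY:.2f}")
print(f"Actually consumed:                      {consumed} PV, "
      f"cost ~{consumed * COST_PER_PV_CAPACITY:.2f}")
print(f"Utilization of the fixed host: {consumed / provisioned:.0%}")
```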

4. Other advantages

In addition to the aforementioned advantages of business focus, elastic scaling, and pay-as-you-go, the serverless architecture also has other advantages.

  • Shorter business innovation cycle: since the serverless architecture is, to some extent, a model in which "cloud vendors strive to do more so that developers can pay more attention to their own business", developers spend far less time and energy on the OS level, cloud host level, and system environment level that a ServerFul architecture requires, and focus more on their own business logic. The direct effect is faster project launches, a shorter business innovation cycle, and quicker R&D delivery.

  • Higher system security: although the serverless architecture feels like a "black box" to a certain extent, it is precisely for this reason that it usually does not allow logging in to instances, nor does it expose system details to the outside world. At the same time, maintenance at the operating system level and below is handed over to cloud vendors, which makes the serverless architecture more secure to a certain extent: on the one hand, it exposes only the predetermined services and interfaces that need to be exposed, which, compared with cloud hosting, reduces the risk of brute-force attacks; on the other hand, cloud vendors have more professional security and server operation and maintenance teams to help developers ensure overall business security and service stability.

  • More stable business changes: the serverless architecture is a naturally distributed architecture provided by cloud service providers. At the same time, thanks to the Noserver characteristic, developers are freed from concerns about the running status of servers, so changing business code and configuration under the serverless architecture is very simple: changes only need to be made through the tools provided by the cloud vendor, and once the new business logic takes effect stably, developers no longer need to pay attention to it. The serverless architecture therefore has great advantages in smooth business upgrades and changes, agile development, feature iteration, and grayscale releases.

Of course, even though many of the advantages of the serverless architecture have been illustrated above, we still cannot enumerate all of its advantages and values. It is undeniable, however, that the serverless architecture is attracting more and more attention and is being adopted by more teams and individuals, and its value is quickly becoming apparent.

For more content, follow the Serverless WeChat official account (ID: serverlessdevs), which brings together the most comprehensive serverless technical content and regularly hosts serverless events, live streams, and user best practices.

