Why is Serverless the main battlefield of the next decade?


Author | Bu Yu, Head of Alibaba Cloud Serverless
Source | Serverless official account

"Only by surpassing what we have already done can we keep moving forward."

This is Bu Yu's tenth year at Alibaba. Since joining Alibaba Cloud in 2010, he has taken part in the development of the Apsara (Feitian) distributed system, served as the architect of Batch Compute and the R&D manager of Table Store (NoSQL), and been deeply involved in the whole course of Alibaba Cloud's system development and product iteration. In 2016, Bu Yu became the head of R&D for Alibaba Cloud Function Compute, dedicated to building a next-generation elastic and highly available serverless computing platform.

Serverless is the technical problem to be cracked in the next decade. In this serverless wave, Alibaba Cloud has stayed at the forefront; in both technology and products, its offering is the richest in China. "We never dare to take it lightly. Serverless is still at an early stage in China. Only by polishing the technology and the products until the user experience is truly good can this battle be considered won."

We did a short interview with Bu Yu to hear his thoughts on the questions people care about most: the development of Serverless, its technical difficulties, and how it is being put into practice.

Embrace it, or wait and see?

Cloud computing will surely become the infrastructure of society and business. By then, using cloud computing should be as simple as using water, electricity, and gas today: there is no need to understand where the water comes from, how it is filtered, or how the pipes are laid; just turn on the tap and get a glass of water. The concept of Serverless takes cloud computing a step further in exactly this direction. It advocates that people should not have to care about anything beyond the application logic itself, including management, configuration, operations, and maintenance, and should pay only for what they use.

From this point of view, Serverless is the path that truly turns cloud computing into the infrastructure of society and business, and it is closer to the cloud-native approach advocated in the industry. Using cloud computing in a serverless way should therefore be the natural choice.

Serverless has clearly won more developer mindshare abroad than at home, because many overseas companies built their businesses on the Lambda ecosystem from the very beginning. In China, some large companies have started to use serverless tools and products, while a large number of companies are still waiting on the sidelines.

Every new product needs an adaptation period, so after products like Serverless appeared, users have had many concerns about whether to adopt, whether to migrate, and how to migrate. Companies often ask how Function Compute guarantees security, how it guarantees stability, and whether migrating traditional projects to a serverless architecture carries substantial transformation cost and risk. These concerns are normal, but I believe that as Serverless develops, as the definition of FaaS broadens, and as the tool chain becomes more complete, these problems will gradually be solved. In principle, a problem that technology can solve is not a problem.

No scale? Don't build your own Serverless

Serverless brings extreme elasticity, cost savings, and gains in development efficiency, all of which are very attractive. In traditional development, getting a service online takes teamwork: each person develops one part, the code is merged and jointly debugged, and then come resource estimation, test environment setup, production environment setup, testing, release, and ongoing operations. In the serverless era, developers only need to write their own functions and deploy them to the test and production environments; a large part of the subsequent operations work no longer needs to be considered or worried about.

It is no exaggeration to say that a database service an enterprise builds itself on cloud hosts is generally less available than the database service offered by the cloud vendor. The same goes for API gateways, data storage services, and other products the cloud vendors provide: they perform better and are more secure and reliable.

Small businesses had best not build Serverless themselves. The core of Serverless is scaling and paying by actual usage: when today's traffic is small you consume very few resources, and when it is large you must mobilize far more. During Double Eleven, traffic reaches the hundreds of millions. If your company does not hold machine resources at that scale, how can you schedule resources for others to use? Without the ability to schedule resources by demand, there is no Serverless to speak of. Enterprises without that scale of resources are not advised to build their own serverless capability; they can instead practice Serverless on public cloud products.

Today all the major vendors can see that Serverless is the future: even if it is not the end state of cloud computing, it is on the road to that end state. On the one hand, Serverless solves many practical problems and is more "like", indeed closer to, real cloud computing; on the other hand, nobody wants to fall behind in the wave of cloud computing. Serverless has therefore become ground that must be contested.

The competition over serverless capability comes down to three things:

The first is performance, which includes security, stability, elasticity, and so on. If performance is not done well there is no Serverless to speak of, and not even cloud computing, because security, stability, and performance are the core capabilities on which everything else is built.

The second is functionality. To do Serverless well, functionality is indispensable, because Serverless is not just FaaS, and even FaaS is not just running code online; it also includes BaaS, triggers, logs, monitoring, alerting, and much more. Only when the functionality meets developers' needs will they be willing to use it.

Finally, there is experience, and the serverless experience matters enormously. It covers many aspects: ease of use, stability, security, product elasticity, and completeness of the tool chain. Beyond these three points, I think community, ecosystem, and openness are also very important.

Alibaba Cloud is one of the first public cloud vendors in China to launch a Serverless platform; its FaaS product is called Function Compute. In terms of event triggering, language support, and user experience, Function Compute has a number of features worth noting:

  • Event triggering: Alibaba Cloud Function Compute can be triggered by events from other Alibaba Cloud services, such as Object Storage Service (OSS), Log Service (SLS), Message Service (MNS), Table Store (OTS), API Gateway, and CDN. Its distinctive callback mechanism greatly reduces the architectural and coding cost that the asynchronous model imposes on developers (a minimal handler sketch follows this list);

  • Supported languages: Alibaba Cloud Function Compute currently supports mainstream development languages such as Node.js, Java, and Python, and supports Go, C/C++, Ruby, Lua, and others through Custom Runtime;

  • User experience: Alibaba Cloud Function Compute provides a web console and SDKs; users can manage function applications through the console or work through an interactive command line;

  • Service model: functions can be managed by service and by application, and a single function instance can execute multiple requests in parallel, effectively reducing the cost of computing resources.
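To make the programming model concrete, here is a minimal sketch of what a function looks like on the Python runtime. It is only an illustration, not an official example: the entry point is assumed to be configured as index.handler, and the event payload varies by trigger (OSS, API Gateway, timer, and so on).

```python
# index.py - a minimal Function Compute handler sketch (Python runtime).
# The function entry would be configured as "index.handler". The event body
# depends on the trigger, and context carries invocation metadata such as the
# request ID and temporary credentials (not accessed here).
import json

def handler(event, context):
    try:
        payload = json.loads(event)   # most triggers deliver a JSON body
    except (ValueError, TypeError):
        payload = {"raw": str(event)}
    # Business logic would go here; the return value becomes the invocation result.
    return json.dumps({"received": payload})
```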

Thorny problems

The pain points of Serverless are genuinely hard: how to migrate traditional projects to Serverless quickly, how to make the transition smooth, how to "serverless-ize" an application, how to debug effectively under a serverless architecture, how to save more cost, and so on; each one is a real problem. My colleague Xu Xiaobin laid out the challenges facing Serverless in the article "Behind the Noise: The Concepts and Challenges of Serverless":

Implementing Serverless at scale in mainstream scenarios is not easy; there are many challenges. Let me analyze them in turn:

Challenge 1: Making workloads lightweight is hard

To achieve fully automatic elasticity and pay only for the resources actually used, the platform must be able to scale business instances out in seconds or even milliseconds. This challenges the infrastructure and also places very high demands on the business itself, especially on relatively large applications. If distributing and starting an application takes ten minutes, automatic elasticity simply cannot keep up with changes in business traffic...

Challenge 2: Extremely high demands on infrastructure responsiveness

Once serverless applications or function instances can scale out in seconds or even milliseconds, the surrounding infrastructure quickly comes under tremendous pressure. The most common examples are service discovery and log monitoring systems. The instances in a cluster used to change perhaps a few times an hour; now they change several times a second. If the responsiveness of these systems cannot keep up with the pace of instance change, the whole experience is badly degraded.

Challenge 3: The business process life cycle is inconsistent with the container's

The Serverless platform relies on a standardized application life cycle to achieve fully automatic container migration and application self-healing. In a system built on standard containers and Kubernetes, the life cycle the platform can control is that of the container, so the business must keep the life cycle of its processes consistent with that of the container, including startup, shutdown, and the conventions around readiness and liveness probes...
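As a minimal sketch of what that alignment means in practice (an illustration only; the port, probe paths, and shutdown strategy are assumptions, not a prescription from the article), a business process can answer the container's readiness and liveness probes over HTTP and react to SIGTERM, the signal the platform sends before stopping the container:

```python
# Sketch: align a business process with the container life cycle.
# It serves /healthz (liveness) and /readyz (readiness) probes and, on SIGTERM,
# fails readiness first so no new traffic arrives, then stops cleanly.
import signal
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ready = True  # readiness flag; flipped to False while draining

class ProbeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz" or (self.path == "/readyz" and ready):
            self.send_response(200)
        else:
            self.send_response(503)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep probe traffic out of the application logs

server = HTTPServer(("0.0.0.0", 8080), ProbeHandler)
stopping = threading.Event()

def on_sigterm(signum, frame):
    global ready
    ready = False      # readiness probe now fails; the platform stops routing traffic
    stopping.set()     # tell the main thread to drain in-flight work and exit

signal.signal(signal.SIGTERM, on_sigterm)
threading.Thread(target=server.serve_forever, daemon=True).start()

while not stopping.wait(timeout=1.0):
    pass               # ...the real business work would run while we wait...
server.shutdown()      # stop the probe server; the process then exits cleanly
```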

Challenge 4: Observability needs to improve

In the serverful mode, if something goes wrong in production the servers are still there, and users naturally think of logging in to investigate. In the serverless mode, users no longer need to care about servers, which also means the servers are invisible by default. What if the system misbehaves and the platform cannot heal itself? ... As long as the observability built around the serverless model remains insufficient, users will not feel at ease.

Challenge 5: R&D and operations mindsets need to change

Almost every engineer, when deploying an application for the first time in their career, deploys it to a server, or to an IP; this is a deeply ingrained habit. As Serverless gradually lands, engineers need to change this way of thinking, get used to the idea that "the IP may change at any time", and shift to operating their systems from the perspective of service versions and traffic.

To use an analogy, Serverless today has taken shape as a frame, but many of the squares (problems) in that frame have yet to be filled in (solved). This is one reason people still have doubts about using Serverless today; another is that we have not yet seen enough success stories. In fact, Alibaba put Serverless into production during Double Eleven 2019, and beyond that, Alibaba Cloud has led a number of enterprises to adopt Function Compute, saving them a great deal of IT cost.

"Being the serverless that users need"

Function Compute has several very typical application scenarios, such as web applications, AI inference, audio and video processing, image and text processing, real-time file processing, and real-time stream processing. It already serves a large customer base, including Shimo Docs, Mango TV, Sina Weibo, Malong Technologies, and others.

Take Weibo as an example: Function Compute handles billions of Weibo requests a day on average, and its millisecond-level elastic scaling of compute resources keeps latency stable even when a trending event hits, so the user experience is completely unaffected by the volume of traffic. By running its image processing services on Function Compute, Weibo has achieved sustained cost savings: it no longer needs to reserve large amounts of idle machine capacity to absorb traffic surges at business peaks. And because there is no longer complex machine state to maintain, engineers can focus on working with product teams to create business value instead of spending their time managing infrastructure.

It is not only established Internet companies like Sina that have landed Serverless; emerging start-ups are joining the Serverless camp as well.

Lanmo is a high-tech company founded by returnees who studied in the United States, focused on new technologies and platform operations for digital publishing and mobile learning in the mobile Internet era. With the explosive growth in demand for online education, Lanmo has stepped up its integration of high-quality course resources and keeps expanding its business boundaries. While seizing these opportunities, its technical team has also faced unprecedented challenges.

Video processing is one of the hardest problems the Lanmo technical team has encountered. Lanmo handles a large volume of video teaching material every day, involving a series of complex tasks such as video editing, splitting, merging, transcoding, resolution adjustment, and client adaptation. Over several years of practice, the team had built a complete, independently controlled video processing pipeline on FFmpeg and related technologies, which supported the rapid growth of the business. But this year's growth exceeded the engineers' expectations: peak video processing demand dozens of times higher than in previous years overwhelmed the existing architecture and seriously hurt the user experience.

Lanmo's core demands boil down to three things: cost savings, extreme elasticity, and freedom from operations, which are exactly the problems Serverless is best at solving. After evaluating the serverless offerings of the domestic cloud vendors, the Lanmo technical team agreed that Alibaba Cloud Function Compute was the most suitable solution for them in the field of video processing.

Because Function Compute (FC) is fully compatible with the existing code logic and supports all the mainstream development languages, the Lanmo team was able to migrate its code from the original architecture to FC almost seamlessly and at extremely low cost. By connecting an OSS trigger, as soon as a new video source file is uploaded to OSS, a Function Compute instance is automatically pulled up and the video processing life cycle begins.
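The shape of such a function might look like the sketch below. This is an illustration rather than Lanmo's actual code: it assumes the standard OSS trigger event layout and an ffmpeg binary available in the runtime, and it omits downloading the source from OSS and uploading the result (which real code would do, for example with the oss2 SDK).

```python
# -*- coding: utf-8 -*-
# Illustrative OSS-triggered transcoding function (not production code).
import json
import subprocess

def handler(event, context):
    evt = json.loads(event)
    for record in evt.get("events", []):
        bucket = record["oss"]["bucket"]["name"]
        key = record["oss"]["object"]["key"]
        print("new video uploaded: oss://%s/%s" % (bucket, key))

        # Real code would first download the object to local or NAS storage
        # (e.g. with the oss2 SDK), then transcode it and upload the output.
        src = "/tmp/source.mp4"        # hypothetical local copy of the object
        dst = "/tmp/output_720p.mp4"
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-vf", "scale=-2:720", dst],
            check=True,
        )
    return "ok"
```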

Through integration with Serverless Workflow, distributed tasks can be orchestrated in a unified way: a large file is sliced, the slices are processed in parallel, and the results are finally merged, and the compute of tens of thousands of instances can be mobilized within a short time so that video processing tasks finish quickly (the pattern is sketched after this passage).

On the cost side, compared with the traditional approach, the serverless solution based on Function Compute helps Lanmo save about 60% of its IT costs in video processing scenarios.
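As a purely conceptual illustration of the slice, process in parallel, and merge pattern (Serverless Workflow itself is defined declaratively and is not shown here; the slice/transcode/merge helpers below are hypothetical placeholders, and a local thread pool stands in for parallel function invocations):

```python
# Conceptual fan-out/fan-in sketch of "slice -> parallel transcode -> merge".
from concurrent.futures import ThreadPoolExecutor

def slice_video(source, n):
    # Hypothetical: describe n segments of the source file.
    return [{"source": source, "segment": i} for i in range(n)]

def transcode(segment):
    # Hypothetical: process one segment (in reality, one function invocation).
    return "%s.part%d.mp4" % (segment["source"], segment["segment"])

def merge(parts):
    # Hypothetical: concatenate the processed segments into the final output.
    return {"output": "final.mp4", "inputs": parts}

segments = slice_video("lesson-01.mp4", 16)
with ThreadPoolExecutor(max_workers=16) as pool:
    parts = list(pool.map(transcode, segments))   # fan-out: handle slices in parallel
result = merge(parts)                             # fan-in: merge the results
print(result)
```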

The main battlefield of the next decade

The ideal Serverless should offer a more complete product form, more extreme elasticity, a better tool chain, greater cost savings, higher development efficiency, easier and faster migration, and a more convenient, more powerful path to the cloud. Developers should be able to focus on writing business code in one way, without worrying about differences between runtime platforms: write once, run anywhere, and master a single approach that lets them switch between different businesses with no extra learning cost.

From the developer's perspective, the serverless R&D model also challenges the existing R&D system. For the front end, Serverless not only complements front-end engineers' existing capabilities but may also change the positioning of the entire front-end profession. People used to think front-end work was simple: build the UI well and leave the rest to the back end. Once the front end is combined with Serverless, what is asked of front-end engineers is no longer just a page but the delivery of an entire application.

Correspondingly, a back-end engineer's first reaction might be: does this make me redundant? Do I no longer have work to do? Not at all. The evolution of the serverless R&D model pushes them further down the stack, letting them focus on the parts that genuinely need technical depth, for example, how to make data capabilities and service capabilities better and more solid. That is what we hope to see.

Through the combination of tool chain, community, and product capability, Alibaba Cloud is playing a card that is very interesting and very beneficial to the overall development of Serverless. The goal of Alibaba Cloud Serverless is to become "the Serverless everyone needs", which sets it apart from other cloud vendors. Only a vendor that puts user needs first can build serverless products well.

In the future, Serverless will be everywhere: any sufficiently complex technical solution may be delivered as a fully managed, serverless back-end service, not only cloud products but also services from partners and third parties. The capabilities of the cloud and its ecosystem will be exposed through API + Serverless. In fact, for any platform product or organization that exposes its capabilities through APIs, such as DingTalk, Didi, or WeChat, Serverless will be the most important part of its platform strategy.

The Serverless official account publishes the latest Serverless news, gathers the most complete Serverless content, follows Serverless trends, and, above all, cares about the confusion and problems you encounter in practice.
