Alibaba Cloud's Rong Bei: DCDN Helps Application Construction and Best Practices in the Cloud-Native Era

As digital transformation accelerates, scenarios demanding large bandwidth, low latency, and high concurrency keep emerging, and demand for content delivery network (CDN) applications continues to rise. It has been 11 years since the first Asia-Pacific CDN Summit in 2012, and China's CDN industry is now in a decade-long transformation period. Strengthening the core competitiveness of CDN and building higher-quality CDN services will be the key to driving the industry forward.

The 2023 Asia-Pacific Content Distribution Conference and CDN Summit opened in Beijing on June 29. The Asia-Pacific CDN Industry Alliance joined hands with leading enterprises in the CDN field, including China Mobile, Volcano Engine, Tencent Cloud, Wangsu Technology, Alibaba Cloud, Huawei Cloud, ZTE, Baishan Cloud, and Tianyi Cloud, along with experts, scholars, and industry leaders, to discuss how to build higher-quality CDN services and to jointly seize the enormous economic value of the computing power demand and traffic growth that new artificial intelligence (AI) technology will bring over the next decade.


On June 29, Rong Bei, head of Alibaba Cloud CDN products, delivered a keynote speech titled "DCDN Helps Application Construction and Best Practices in the Cloud-Native Era" at the Asia-Pacific Content Distribution Conference and CDN Summit.

In his speech, Rong Bei pointed out that in the Internet age people are more connected than ever: information travels from one end of the earth to the other within milliseconds. Information transmission has shifted from a core competency in the past to a basic capability today, and its optimization has fully entered deep water. Different network operators, network devices, and network protocols make today's network environment increasingly complex, so work ranging from protocol optimization to algorithm and routing optimization all serves one purpose: making network transmission more efficient.

In the past, edge computing was regarded as a future trend; now it has become reality, and the state has issued policies to promote its development and implementation. The arrival of the cloud-native era has pressed the accelerator on edge computing. As user access gradually sinks to the edge, computing sinks with it, and data security at the edge becomes part of the picture. In the future the edge will carry more than just traffic: the share of computing power will keep growing, and the cloud and the edge will no longer be two independent entrances but an interconnected, mutually trusted whole.

Alibaba Cloud's whole-site acceleration (DCDN) continued to invest in expanding its edge infrastructure last year, growing 12.5% over its 2022 scale. It now has 2,300 nodes in China and 900 overseas, with full coverage domestically and coverage of key regions such as Southeast Asia and Europe abroad. As the customer base grows, the diversity of business scenarios becomes more apparent.

Whole-site acceleration is not just a product for dynamic and static content acceleration; it also provides edge computing and edge security capabilities to meet different customer demands. Over the past year it has been continuously upgraded along three dimensions: transport capacity, computing power, and defense capability. Transport capacity, the ability to accelerate network transmission across the whole site, is the most basic of the three. It lets API services and dynamic or static content enter the network at the edge node nearest the client. With this nearest on-ramp, edge nodes can offload much of the transport layer and reduce the number of round trips in network interactions. At the same time, a batch of nodes was selected for independent management across node construction, operation and maintenance, and control.

In addition, the routing decision system between nodes has been upgraded: internal links can use a 0-RTT mechanism and take the fastest path back to the data center. The entire network architecture of whole-site acceleration has been upgraded as a whole, networking cloud and edge into a controllable architecture and solving the earlier reliability problem of cloud-edge communication that could only go over the public Internet.
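The speech does not reveal Alibaba's actual routing algorithm, but the idea of taking the fastest path back to the data center over an RTT-weighted node graph can be sketched with a plain Dijkstra search. The node names, topology, and RTT figures below are invented for illustration:

```python
import heapq

def fastest_path(graph, src, dst):
    """Dijkstra over a node graph weighted by measured RTT (ms).

    graph: {node: [(neighbor, rtt_ms), ...]}
    Returns (total_rtt, [node, ...]) or (inf, []) if unreachable.
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == dst:
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for nbr, rtt in graph.get(node, []):
            nd = d + rtt
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []

# Example: an edge node reaches the origin via one of two relay nodes
topology = {
    "edge-hz": [("relay-sh", 8.0), ("relay-bj", 28.0)],
    "relay-sh": [("origin", 6.0)],
    "relay-bj": [("origin", 4.0)],
}
rtt, path = fastest_path(topology, "edge-hz", "origin")
# edge-hz -> relay-sh -> origin at 14.0 ms total
```

A production system would refresh the RTT weights continuously from probe data, but the shortest-path decision itself stays this simple.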

Rong Bei presented a set of on-device access data provided by a customer. After the customer switched from central data-center access to edge whole-site acceleration, both the average TCP connection time and the proportion of connections completed within 30 milliseconds improved significantly. The customer's data center is located in the Jiangsu-Zhejiang region, so within that region the gap between center and edge is not large; outside it, the difference in the data is very pronounced.


While leading the development of transmission-protocol standards, the Alibaba Cloud team is also putting them into practice with the industry. Multipath QUIC transmits data simultaneously over different network paths such as 4G, 5G, or Wi-Fi, favoring whichever path is currently faster. It supports dual-channel transmission on both uplink and downlink, and selects different scheduling algorithms and mechanisms for different scenarios to meet different performance requirements. In XR scenarios, for example, 3D model downloads need very large bandwidth, while 3D live streaming is highly latency-sensitive and also demands considerable bandwidth, so multiple parallel transmissions can be used to optimize the result.

In terms of start-up speed and playback smoothness, transmission over Multipath QUIC outperforms single-path video transmission, bringing a performance improvement visible to the naked eye for end users.
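The exact schedulers Alibaba uses are not described, but one common Multipath QUIC strategy, a minimum-RTT scheduler, can be sketched as follows. The `Path` fields and RTT/window numbers are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Path:
    name: str
    srtt_ms: float   # smoothed RTT estimate for this network path
    cwnd: int        # how many packets the path may have in flight
    inflight: int = 0

def min_rtt_schedule(paths, num_packets):
    """Assign each packet to the lowest-RTT path that still has
    congestion-window space, spilling over to slower paths once full."""
    assignment = []
    for _ in range(num_packets):
        available = [p for p in paths if p.inflight < p.cwnd]
        if not available:
            break                      # all paths saturated; wait for ACKs
        best = min(available, key=lambda p: p.srtt_ms)
        best.inflight += 1
        assignment.append(best.name)
    return assignment

wifi = Path("wifi", srtt_ms=12.0, cwnd=4)
cellular = Path("5g", srtt_ms=30.0, cwnd=3)
plan = min_rtt_schedule([wifi, cellular], 6)
# first 4 packets ride Wi-Fi; the overflow goes to 5G
```

Latency-sensitive scenarios like 3D live streaming would favor this kind of scheduler, while bulk 3D model downloads might instead stripe packets across all paths to maximize aggregate bandwidth.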


Rong Bei then explained the computing power of whole-site acceleration. Its edge computing product, ER, currently serves hundreds of customers and is widely used by top customers such as Mobile Taobao and Youku. During Double 11 in 2022, ER withstood the test of tens of millions of QPS under high concurrency. In 2023, ER was recognized by the China Academy of Information and Communications Technology as one of the first products to pass edge function-level service certification.

When lightweight functions are not enough to implement certain computing logic, ER supports uploading container images to edge nodes through APIs and completing the computation in edge containers. Whole-site acceleration can schedule the entire pool of edge computing power, and supports edge container specifications such as 4-core 8 GB and 8-core 16 GB. ER has built a scheduling system for edge computing power that dispatches computing requests from the client to the nearest node. It senses computing demand and client-side load in real time and scales out or in promptly, without manual intervention.

In the future, ER will officially launch an edge computing service with image upload to support more complex edge computing scenarios. Existing ER scenarios will support more programming languages, covering mainstream languages and frameworks, and will strengthen interaction with the cloud center, including cloud products such as databases and storage. Alibaba Cloud will continue to enrich the types of edge computing power, manage and schedule different computing resources in a unified way, and keep improving edge computing capabilities.
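The two behaviors described, dispatching a request to the nearest node and scaling containers with demand, can be sketched minimally. Node names, coordinates, and the QPS-per-replica target below are invented for illustration, not ER's actual policy:

```python
import math

# Hypothetical edge nodes with rough geographic positions
NODES = {
    "hangzhou": (30.27, 120.16),
    "beijing": (39.90, 116.40),
}

def nearest_node(client_pos):
    """Dispatch a request to the geographically closest edge node
    (flat-plane distance as a crude stand-in for network proximity)."""
    return min(NODES, key=lambda n: math.hypot(NODES[n][0] - client_pos[0],
                                               NODES[n][1] - client_pos[1]))

def target_replicas(qps, qps_per_replica=1000, lo=1, hi=32):
    """Scale edge containers so each replica serves ~qps_per_replica,
    clamped between a floor and a node capacity ceiling."""
    return max(lo, min(hi, math.ceil(qps / qps_per_replica)))

node = nearest_node((31.23, 121.47))   # a client near Shanghai
replicas = target_replicas(4500)       # current demand observed on that node
```

A real scheduler would weigh measured latency and node load rather than raw distance, but the dispatch-then-scale loop has this shape.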


In his speech, Rong Bei shared two cases of edge computing in edge applications. In the first, a well-known financial company implemented AI cloud-edge collaboration on edge computing. The company's business logic depends on inference, and some inference consumes too much computing power to complete on the device. Moving all inference to the data center would not only increase bandwidth pressure and computing load, but also make clients wait so long that requests would time out and fail. After the customer used ER's image-upload capability, part of the inference logic was deployed at the edge; attributes of the edge region were used to form regional inference features, which were then applied to business decisions. With this architecture running, the enterprise's client requests are processed more promptly and accurately, and the business success rate rose by more than 5%. Customers do not need to think about how much computing power is being used: ER scales computing power elastically, spinning up more containers during peak periods and releasing them at off-peak times, so computing power is billed on a pay-as-you-go basis.
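One common shape for this kind of cloud-edge inference collaboration, not necessarily the customer's actual design, is to answer confident cases with a lightweight edge model and escalate only uncertain ones to the central cluster. Everything below, including the toy models, is invented for illustration:

```python
def route_inference(features, edge_model, cloud_infer, threshold=0.9):
    """Run the lightweight edge model first; only fall back to the
    central cluster when the edge prediction is not confident enough."""
    label, confidence = edge_model(features)
    if confidence >= threshold:
        return label, "edge"
    return cloud_infer(features), "cloud"

# Toy stand-ins for the two models
edge_model = lambda f: ("pass", 0.95) if f["score"] > 50 else ("unsure", 0.40)
cloud_infer = lambda f: "reject" if f["score"] < 10 else "pass"

easy = route_inference({"score": 80}, edge_model, cloud_infer)  # stays at edge
hard = route_inference({"score": 5}, edge_model, cloud_infer)   # goes to cloud
```

Because most requests resolve at the edge, both the client's waiting time and the bandwidth back to the data center shrink, which matches the effect described above.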


In the second case, a SaaS enterprise deploys a high-concurrency API gateway service at the edge. Alibaba's edge automatically scales the edge gateway out or in according to access load. The client is closer to the edge, so responses are more timely, and performance is better than placing the gateway directly in the center. With most requests handled at the edge, bandwidth pressure on the center drops accordingly. The customer also keeps both the edge gateway link and the central gateway link; if either link fails, the system automatically switches to the other, improving overall availability.
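The dual-link failover described can be sketched in a few lines. The gateway hostnames and the fake transport are invented for illustration; a real client would also handle timeouts and retry budgets:

```python
def request_with_failover(primary, fallback, send):
    """Try the edge gateway link first; on failure, switch to the
    central gateway link automatically."""
    try:
        return send(primary), primary
    except ConnectionError:
        return send(fallback), fallback

def fake_send(endpoint):
    """Stand-in transport: pretend the edge link is currently down."""
    if endpoint == "edge-gw.example.com":
        raise ConnectionError("edge link down")
    return 200

status, used = request_with_failover("edge-gw.example.com",
                                     "central-gw.example.com", fake_send)
```

Keeping both links warm means a failure on either side degrades latency rather than availability, which is the point of the customer's design.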


After access traffic and computing applications sink to the edge, security must follow. Whole-site acceleration's edge security provides complete, native edge protection: basic protection and allowlist/blocklist capabilities at the edge, layer 3/4/7 protection, intelligent CC protection, and global protection capacity up to the Tbps level. About 30% of whole-site acceleration customers have already enabled edge security, and as traffic and computing sink further, that proportion will certainly grow. Going forward, Alibaba will continue to build security capabilities suited to global distribution, pushing security forward to the edge to reduce the risk of central outages caused by attacks and to safeguard transport and computing power.


Rong Bei shared two edge security cases. In the first, a large financial enterprise chose whole-site acceleration to provide edge security for its entire business. The enterprise's customers are distributed worldwide, its business needs to support multiple protocols, and malicious users and attack sources must be blocked frequently at the edge. Whole-site acceleration provides standard product access, global node coverage, and massive IP bans at the edge, allowing the customer's business to run smoothly around the world.
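At its core, an edge IP ban is a membership check against a list of addresses and CIDR ranges pushed to every node. A minimal sketch using Python's standard `ipaddress` module, with a made-up blocklist drawn from documentation address ranges:

```python
import ipaddress

# Hypothetical ban list; real deployments push far larger lists to each node
BLOCKLIST = [ipaddress.ip_network(c) for c in ("203.0.113.0/24",
                                               "198.51.100.7/32")]

def is_banned(client_ip):
    """Check an incoming IP against banned addresses and CIDR ranges."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKLIST)
```

At the scale described, nodes would use a trie or radix tree instead of a linear scan, but the decision each node makes per request is exactly this one.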


In the second case, an e-commerce customer built a new security boundary on edge computing and edge security, constructing protection rules against scalpers. Before an order is placed, the e-commerce company judges from user information whether the purchase is scalper behavior; if a scalper rule is hit, extra verification is added or service is refused outright. The company evaluates business data such as account and mobile-phone information through ER, and at the same time invokes edge WAF capabilities to return challenges such as sliders and verification codes to the user. This mechanism makes up for a shortcoming of traditional WAFs, which can only judge client behavior from request characteristics. After adopting this framework, the effective interception rate on business characteristics rose to more than 95%.
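The allow/challenge/reject decision described can be sketched as a simple risk score over account signals. The signals, weights, and thresholds below are all invented for illustration; the customer's real rules are not disclosed:

```python
def scalper_action(user):
    """Score simple account signals; challenge medium risk, reject high risk."""
    score = 0
    if user.get("account_age_days", 0) < 1:
        score += 2                      # brand-new account
    if user.get("orders_last_hour", 0) > 5:
        score += 2                      # burst ordering
    if not user.get("phone_verified", False):
        score += 1                      # unverified phone number
    if score >= 4:
        return "reject"                 # refuse service outright
    if score >= 2:
        return "challenge"              # slider / CAPTCHA via edge WAF
    return "allow"
```

The "challenge" outcome is where the edge WAF comes in: rather than blocking a borderline user, the node returns a slider or verification-code challenge and only rejects on failure.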

Rong Bei concluded that whole-site acceleration provides users with high-performance transport capacity (whole-site acceleration), computing power (edge computing), and defense capability (edge security). Alibaba Cloud will continue to serve more customers, providing stable and easy-to-use products.


Origin my.oschina.net/u/4713941/blog/10088124