With annualized revenue above 40 billion US dollars and a 45% market share, can AWS still innovate?

At the end of 2020, re:Invent, the annual global conference of Amazon Web Services (AWS), the pioneer and market leader in cloud computing, opened online. Because of the global pandemic, the 2020 edition of AWS re:Invent moved from an in-person event to a virtual one, grew from its original one week to three, and was free to attend worldwide. That is a first for AWS, and a first for the global cloud computing industry. If AWS insists on reinventing everything, this time it has at least reinvented the conference itself.

As the leader of the global cloud computing industry, AWS now ranks among the world's top five enterprise IT companies alongside Microsoft, Dell, IBM, and Cisco. According to the "Global Public Cloud IaaS and PaaS Market Share Report 2019" released in August 2020 by the research firm Gartner, AWS holds a 45% market share, more than the second- through fifth-ranked vendors combined (34.3%). In the third quarter of 2020, AWS reached annualized revenue of 46 billion US dollars, up 29% year on year, equivalent to adding roughly 10 billion US dollars of revenue a year.

After 14 years of development, is AWS past its prime? Gartner's key IT metrics report "2020: Industry Measures" shows that cloud computing is still in its early stages: cloud spending accounts for only 4% of overall IT spending, while traditional IT still accounts for 96%. AWS, which has always opened up markets through continuous innovation, went from releasing more than 80 significant services and features in 2011 to 2,345 in 2019, and in recent years has released more than 400 additional services and features each year over the previous year's total. So can AWS, which has enjoyed the early dividends of the cloud computing market, keep innovating? The answer is yes.

Reinventing computing

Amazon EC2 (Elastic Compute Cloud) is one of AWS's earliest cloud services. Compute, together with networking and storage, forms the three technical pillars of enterprise IT, and EC2 is one of the most widely used AWS services. At AWS re:Invent 2020, the EC2 family added instances with up to 400 Gbps of network bandwidth, the P4d instance, AWS's most powerful for machine-learning training, SAP instances with up to 24 TB of memory, the D3en instance with up to 336 TB of local storage, the Inf1 machine-learning inference instance, and the graphics-intensive G4ad instance with the best price-performance ratio, which will launch soon.

Beyond going higher, faster, and stronger, EC2 is also embracing the broadest range of computing platforms and chips. At this re:Invent, AWS launched new Amazon EC2 Mac instances, giving the 28 million developers in the Apple developer community access to elastic compute for building software. Developers can also consolidate cross-platform development of Apple, Windows, and Android applications on AWS, improving productivity and shortening time to market. As with other Amazon EC2 instances, developers can easily combine EC2 Mac instances with other AWS services and features.

In his keynote, Andy emphasized that besides making cloud development easy for Apple's tens of millions of developers, AWS is the only cloud provider that supports both Intel and ARM processors. An important reason AWS can do this is the Nitro virtualization architecture launched in 2017. Nitro uses custom ASICs from Annapurna Labs, the chip company AWS acquired in 2015, to run the EC2 hypervisor. Put simply, Nitro offloads the virtualization software that used to be tightly bound to the host onto dedicated hardware, which lets EC2 support a variety of chips and platforms while preserving the elasticity and scalability of cloud services. Building on Annapurna Labs' designs, AWS has since developed more of its own chips to significantly improve performance and reduce cost. AWS can now reinvent compute instances around customer needs every few months instead of every few years.

In the computing field, beyond virtual machine and bare-metal instances, there are also containers and serverless computing. Containers, a newer form of virtualization, mainly serve cloud-native applications and microservices, while serverless computing decouples applications from the underlying hardware: the system takes over resource scheduling so developers can focus purely on software logic, which suits event-driven workloads such as the Internet of Things. Andy said that two-thirds of all containers currently run on AWS, in part because other vendors offer only one container hosting service while AWS offers three: Amazon EKS for open-source Kubernetes, Amazon ECS with deep AWS integration, and AWS Fargate, which removes server and cluster management entirely. Since enterprises still need to run containers on premises, AWS released ECS Anywhere and EKS Anywhere, which deliver the same ECS and EKS experience in local data centers. Meanwhile AWS Lambda, the serverless computing product launched a few years ago, changed its billing unit from 100 milliseconds to 1 millisecond, which can save customers as much as 70% in fees.
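The effect of the finer billing unit is simple rounding arithmetic. A minimal sketch with a hypothetical 15 ms invocation (the 70% figure above is the article's cited estimate; actual savings depend on how short each function runs):

```python
import math

def billed_ms(duration_ms: float, granularity_ms: int) -> int:
    """Round an invocation's running time up to the billing granularity."""
    return math.ceil(duration_ms / granularity_ms) * granularity_ms

# A hypothetical 15 ms Lambda invocation under the old and new granularity.
old = billed_ms(15, 100)  # rounded up to 100 ms
new = billed_ms(15, 1)    # billed as 15 ms
saving = 1 - new / old    # 85% less billed time for this invocation
```

For functions that routinely finish well under 100 ms, most of the old billed time was rounding, which is where savings on the order of 70% come from.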

Then there is the headline release of AWS Proton. Zhang Xia, chief cloud computing enterprise strategy consultant for AWS Greater China, said its role is to simplify the development and deployment of container and serverless applications, calling its fine-grained development and deployment management a landmark step. Container and serverless applications are composed of many small pieces of code, each usually developed and operated by a different team with its own infrastructure to update and maintain. As containers and serverless applications multiply, coordinating infrastructure configuration, code deployment, and operations monitoring across platform, development, and operations teams becomes increasingly complex and slows application development. AWS Proton lets a central platform team build a stack: a file that holds all the information about a microservice's deployment except the application code itself. Proton therefore greatly simplifies developing and deploying container and serverless applications.

Reinventing data management

For enterprise IT, data and database management is a core task, and whether databases can be reinvented is a real test for vendors like AWS that grew out of the Internet industry. Databases have long been dominated by giants such as Oracle and Microsoft, but that began to change when AWS launched its managed database service Amazon Aurora. Today Aurora has 100,000 customers, many of whom are helping AWS push database services further.

The newly released Amazon Aurora Serverless v2 is a new serverless version of Amazon Aurora. It scales in real time, expanding in under a second to support hundreds of thousands of transactions. Developers no longer need to provision capacity for peak load; scaling on demand can cut costs by up to 90%. Gu Fan, general manager of cloud service product management for AWS Greater China, said no other data service on the market reaches this magnitude, and that Amazon Aurora Serverless v2 advances database services as a whole.
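Why on-demand scaling can cut costs so sharply comes down to how spiky the workload is. A minimal sketch with hypothetical capacity numbers (invented units, not Aurora's actual pricing):

```python
# Hourly capacity demand over one day, in hypothetical capacity units:
# quiet overnight, a morning spike, a steady afternoon, an evening peak.
demand = [1] * 8 + [8] * 2 + [2] * 12 + [10] * 2

# Provisioning for the peak means paying for maximum capacity all 24 hours.
peak_provisioned = max(demand) * len(demand)

# Serverless-style scaling pays only for the capacity actually consumed.
on_demand = sum(demand)

saving = 1 - on_demand / peak_provisioned  # ~72% for this sample workload
```

The spikier and more idle-heavy the workload, the closer the saving approaches the "up to 90%" the text cites.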

Andy emphasized: "How many old-guard database vendors would build a serverless database like this? None of them would, because it would take away a lot of revenue from their core database business. AWS wants to build a business that outlasts all of us, so we take the long view together with our customers. What does that mean? Really listening to what customers care about and improving their efficiency and effectiveness. It may reduce our revenue in the short term, but we are still willing to do it because these are long-term relationships. This is why customers trust AWS; they have trusted us with their databases for years now."

Over the past few years, 350,000 databases have been moved to AWS, and AWS provides dedicated tooling to help customers migrate. Because application code is often wired to traditional databases, customers asked AWS to migrate the corresponding applications along with the databases. Hence this year's release of Babelfish for Aurora PostgreSQL, which lets customers run SQL Server applications directly on Amazon Aurora PostgreSQL with almost no code changes. Andy emphasized that this frees customers from the punitive licensing terms common among old-guard database vendors. In addition, AWS will open-source Babelfish for PostgreSQL in 2021 under the Apache 2.0 license, with all work and planning done on GitHub to keep the project's progress transparent.

Beyond relational databases, non-relational databases and data lakes are also a focus for AWS customers. AWS has launched a variety of services in this area: the well-known Amazon DynamoDB is a typical non-relational database service, Amazon Athena queries data in the S3 data lake, and Amazon Redshift is a cloud data warehouse. Andy emphasized that AWS will soon bring performance optimizations of up to tenfold to these data services. To meet customer needs for extracting, processing, and jointly analyzing data across databases, data warehouses, and data lakes, especially in today's hottest machine-learning scenarios, AWS launched AWS Glue Elastic Views, which helps developers build applications that use data from multiple data stores and uses materialized views to automatically combine and replicate data across storage, data warehouses, and databases.

AWS Glue Elastic Views can create virtual tables (materialized views) from multiple different data sources, simply and efficiently connecting data silos into a unified view of the data; when a data source changes, the view is synchronized within seconds. Andy called AWS Glue Elastic Views a "game changer" that lets developers' data flow freely across all their data stores, unlocking the power of massive data.
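The materialized-view idea can be illustrated with a toy model: plain Python dictionaries standing in for two separate data stores. This sketches the concept only, not the Glue Elastic Views API, and all names here are invented:

```python
# Two hypothetical source stores: an orders table and a users table.
orders = {101: {"user": "ada", "total": 30}}
users = {"ada": {"region": "eu"}}

def materialize() -> dict:
    """Derive a combined view by joining each order with its user's data."""
    return {oid: {**o, **users[o["user"]]} for oid, o in orders.items()}

view = materialize()  # view[101] holds user, total, and region together

# When a source store changes, the view is re-derived to stay in sync;
# in the managed service this propagation happens automatically.
users["ada"]["region"] = "us"
view = materialize()
```

The service's value is doing this derivation and change propagation continuously, so applications query one consistent view instead of stitching stores together themselves.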

For machine-learning data services, the conference introduced Data Wrangler, a new capability of Amazon SageMaker that speeds up data preparation for machine learning. To prepare data, you simply point Data Wrangler at the appropriate AWS or third-party data store. Data Wrangler ships with more than 300 built-in data transformations, automatically identifies data types, and recommends appropriate transformations. Users can combine multiple transformations in the console, preview them in SageMaker Studio, and then apply them to the entire dataset.
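The detect-then-recommend flow described above can be sketched in miniature. This is a naive illustration with invented rules, not Data Wrangler's actual logic:

```python
def infer_type(values: list) -> str:
    """Guess a column's type: numeric if every value parses as a float."""
    try:
        for v in values:
            float(v)
        return "numeric"
    except ValueError:
        return "categorical"

def recommend_transform(values: list) -> str:
    """Recommend a preparation step based on the inferred column type."""
    return {"numeric": "standard-scale",
            "categorical": "one-hot-encode"}[infer_type(values)]

# Two hypothetical columns from a raw dataset.
ages = ["23", "41", "35.5"]
colors = ["red", "blue", "red"]
```

A real tool applies hundreds of such rules, but the shape is the same: inspect the raw values, classify the column, and propose a transformation the user can preview before applying it to the full dataset.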

Reinventing the hybrid cloud

As the originator of the public cloud, AWS's view of hybrid cloud, and the products that follow from it, carries significant weight in the enterprise IT market. That is why the announcement of AWS Outposts in 2018 caused such a sensation: the industry had believed AWS would never ship hybrid cloud products for on-premises deployment, and the launch of Outposts proved once again that there are no absolutes.

In the hybrid cloud field, Azure Stack, Google Anthos, and Oracle Cloud at Customer are all strong competitors. Microsoft and Oracle built their businesses on on-premises enterprise IT, while Google's hybrid offering centers on Kubernetes, Istio, and Knative container and microservice technologies. What exactly is hybrid cloud? The industry has debated this for years, with each vendor holding its own view. Andy argued that some vendors define hybrid cloud as a company's on-premises infrastructure plus cloud precisely in order to promote that on-premises infrastructure. AWS has always believed that enterprises will eventually stop running their own data centers, although that will be a long evolution.

So what counts as on-premises? AWS believes "local" should mean more than local data centers: the IT needs of restaurants, warehouses, and even farmland are also "local." Hybrid infrastructure consists of the cloud plus various edge nodes, with the local data center being just one kind of edge node. What customers really want is to manage on-premises environments and the cloud with one set of hardware and tools, the same way they use the cloud. AWS therefore believes the cloud should be pushed out to the edge nodes.

Early on, AWS launched its virtual private cloud service Amazon VPC and its network service AWS Direct Connect to bridge the cloud and the local data center. Later, AWS partnered with VMware on VMware Cloud on AWS, letting customers use the same VMware software and tools on AWS, with 40% savings in IT infrastructure costs, 43% savings in operating costs, and an expected five-year return on investment of 479%. For programs and data that must stay in a company's own data center, AWS released Outposts two years ago: the same server hardware as AWS cloud data centers, loaded with AWS services, fully managed by AWS, and operated with the same APIs, consoles, tools, and features as the AWS cloud.

Next, AWS is pushing the definition of the "edge" further. Two smaller Outposts form factors, 1U and 2U, were released at this conference. The 1U Outposts is the size of a pizza box, one-fortieth the volume of the classic Outposts, yet offers the same functionality; these small Outposts fit places with limited space for IT equipment, such as restaurants, hospitals, retail stores, and factories. For locations with limited network connectivity and harsh environments, such as remote regions, mountainous areas, military bases, ships, and rescue vehicles, AWS offers the Snow family of devices: the smallest, Snowcone, provides 8 TB of storage and 2 vCPUs, fits in a backpack, and can be carried back to headquarters for processing after collecting data in the field. At the 5G edge, AWS has partnered with carriers on the Wavelength service, which pushes AWS services to the edge of the 5G network and cuts mobile application latency to under 10 milliseconds, meeting the demands of applications such as smart manufacturing, autonomous driving, and gaming. AWS has also launched Local Zones near major cities, reducing access latency for local end users to the millisecond level.

Beyond the updates above, this re:Invent also brought a large number of releases in storage, machine learning, data analytics, and other fields. AWS CEO Andy Jassy said "reinvent" more than 50 times in a three-hour keynote, showing the industry AWS's determination and confidence to keep reinventing itself. As for the online conference itself, the three-week event features 5 keynotes, 18 executive sessions, and more than 500 breakout sessions; its length and richness have reinvented the cloud computing industry conference once again.

To sum up: since releasing its first cloud service in 2006, the 14-year-old AWS is not past its prime; if anything, it gives the industry the sense of steadily hitting its stride. Standing at the turn of 2020 and 2021, re:Invent 2020 shows the world the endless innovation of cloud computing, and a cloud market that has only just begun. Amid the global pandemic, digital transformation is accelerating; companies and individuals alike are using digital technologies such as cloud computing to reshape themselves, and AWS is a companion on that journey. (Text/Ningchuan)

Source: blog.csdn.net/achuan2015/article/details/110917490