Annual revenue exceeding $1 billion? OpenAI's "money-making trick"


Text | Hao Xin, Liu Yuqi; Editor | Liu Yuqi
 

"I saw him for less than three minutes, and I was thinking, ah, 19-year-old Bill Gates is probably like this!"

In the eyes of YC founder Paul Graham, 28-year-old Sam Altman is a reincarnation of the 19-year-old Bill Gates: a prestigious school, a maverick streak, a firm belief that the world can be changed, altruism entangled with extreme ambition. But Altman is more radical, and his ambition spills beyond even the boundaries Silicon Valley can accommodate.

By any count, Graham was Altman's first "Bole", the talent scout who spotted him: in 2014, Graham picked Altman to serve as CEO of the startup incubator YC. As is well known, after founding OpenAI, Altman found his second Bole: Satya Nadella, the current CEO of Microsoft.

The facts have shown that Altman really is a thoroughbred. In the five years after he took over as CEO of the YC incubator, he opened up a string of new businesses, lifting the total market value of YC companies to roughly $150 billion, with an investment network covering more than 4,000 founders and more than 1,900 companies. Now OpenAI, long caught between extremes of praise and doubt, has handed in its first report card.

On August 30, The Information reported that OpenAI expects to take in more than $1 billion in revenue over the next 12 months from selling artificial intelligence software and the computing power behind it.

As soon as the news came out, there was an uproar.

After all, as recently as three months ago, OpenAI was said to be walking a "line between life and death". According to a report by the Indian media platform Analytics India Magazine, running its AI service ChatGPT costs OpenAI about $700,000 a day. The company is burning cash, and if it does not speed up its own commercialization, it could well have to file for bankruptcy by the end of 2024.

This speculation was not groundless. Public data shows that in 2022 OpenAI's revenue was roughly $36 million, while its spending that year reached $544 million. In other words, in a single year it posted a net loss of around $500 million.

That OpenAI, long described as a "gold-swallowing beast", suddenly started generating revenue at scale left everyone wondering: what money-making trick has Altman pulled off?

More importantly, OpenAI is a barometer of the commercial potential of large language models. Tracing where its revenue comes from also pries open the long-awaited "Pandora's box" of commercializing general-purpose large models.

For the industry as a whole, this cuts two ways. On one hand it boosts confidence; on the other, once OpenAI's business model proves out and sets a new template, another wave of "hundred-model commercialization wars" can be expected soon, pushing AGI quickly into its second stage.

01 OpenAI’s business landscape

From the moment OpenAI launched ChatGPT, it focused on commercialization with a hunter's ruthless gaze.

On November 30 last year, ChatGPT, built on GPT-3.5, was born. Just over two months later, OpenAI switched on monetization and launched the ChatGPT Plus subscription plan.

Since May this year, OpenAI has pushed further and further down the road of commercialization, frequently making "big moves":

  • On May 15, the ChatGPT iOS application was launched.
  • On May 31, GPT-4 third-party plug-in functions (plugins) were fully opened.
  • On June 21, it was revealed that it planned to launch a large model store similar to the “App store”.
  • On June 23, it was revealed that it planned to launch a ChatGPT version of “Personal Work Assistant”.
  • On August 29, OpenAI released the enterprise version of ChatGPT, benchmarked squarely against Bing Chat.

(Source: OpenAI official website)

According to OpenAI's official website, its products currently fall into two broad categories. One is API-based offerings: callable GPT models, the DALL·E text-to-image model, the Whisper speech-recognition model, plus the Chat (dialogue), Embeddings (vectorization), Analysis, and Fine-tuning capabilities provided to developers. The other is built around the ChatGPT chatbot, split into a personal edition and an enterprise edition.

Drawing on OpenAI's official website and a compilation of public information, Light Cone Intelligence found that OpenAI currently rests on two main revenue pillars.

The first is the charging model based on API calls, which OpenAI has leaned on most heavily since ChatGPT's birth. Under this model, users get access to nearly all of the multimodal capabilities OpenAI has developed, spanning the underlying large language model, model deployment, model development and the rest of the pipeline, and pricing is friendly: only a few cents per call. OpenAI does not say officially whether these users are individuals or enterprises, but according to foreign media reports, beyond a large base of individual users, well-known companies such as Jasper, Slack, Salesforce, and Morgan Stanley were all early customers.
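To make the per-call, per-token charging concrete, below is a minimal sketch of such an API call, assuming the openai Python SDK of that period (around version 0.27); the API key is a placeholder, and billing is based on the prompt and completion tokens the API reports back in the response.

```python
# Minimal sketch of a pay-per-token API call, using the openai Python SDK (~0.27).
# The key below is a hypothetical placeholder; real usage sets it via an environment variable.
import openai

openai.api_key = "sk-..."  # hypothetical key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "In one sentence, why does per-token pricing suit small developers?"}],
)

# Billing is based on the token counts the API reports back for this call.
usage = response["usage"]
print(response["choices"][0]["message"]["content"])
print(f"prompt_tokens={usage['prompt_tokens']}, completion_tokens={usage['completion_tokens']}")
```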

It is worth noting that under this charging model, OpenAI also supplies its biggest backer, Microsoft, with a range of capabilities including code generation, GPT-4, text-to-image, and ChatGPT, which are folded into Microsoft's cloud services, search, Office software and many other products. How much OpenAI earns from this is unknown, but taking the Azure cloud business as an example, the prices Microsoft quotes for these OpenAI capabilities are in line with OpenAI's own. Meanwhile, all of OpenAI's technology still runs free of charge on Microsoft's Azure cloud infrastructure.

The second is the subscription model built around the ChatGPT product. In its early days, OpenAI obtained a large amount of training data free of charge, and on the back of that, within 9 months it broke the user-growth records of TikTok and Instagram, becoming the fastest application ever to reach 100 million users.

After all, "making a wedding dress" for Microsoft and giving users "early adopters" is not the ultimate goal of OpenAI. If you want to make a profit, you need to find a way to increase your payment rate. In June, the number of ChatGPT users exceeded the peak and then declined. OpenAI began to shift its business ideas from the C-side to the B-side in an attempt to "steal" the business of the sponsor.

OpenAI says many large enterprises are interested in its new enterprise-grade product. Since ChatGPT's launch, teams at more than 80% of Fortune 500 companies have adopted it, and large enterprises including Block, Canva, Estée Lauder, and PricewaterhouseCoopers trialed the beta of ChatGPT Enterprise in advance. OpenAI also plans to launch a business edition of ChatGPT for smaller organizations, with more customization options.

In Light Cone Intelligence's view, of everything since ChatGPT launched, the most anticipated product is undoubtedly the enterprise edition. After being criticized over data privacy and security, OpenAI has adjusted the product accordingly.

ChatGPT Enterprise is currently powered by OpenAI's most advanced language model, GPT-4. Enterprise users get priority access to GPT-4 with the usage cap removed, and execution is twice as fast as ordinary GPT-4. The enterprise edition also accepts longer inputs, with the context window expanded to 32,000 tokens, roughly 25,000 words.

OpenAI promises that customer prompts and all other data will not be used for model training, that users control how long data is retained, and that any deleted conversations are purged from ChatGPT's systems within 30 days.

On deployment, the enterprise edition adds a brand-new admin console for managing users in bulk, including single sign-on, domain verification, and a dashboard of usage statistics, suited to large-scale rollouts. It also bundles a fuller tool chain, such as vectorization tools and advanced data-analysis tools.

Seen this way, OpenAI is trying to move from its earlier low-priced, per-token API model to a diversified mix of higher-priced, stickier B2B subscription fees and fees for customized solutions.

02 Will it lose more the more it grows?

Rising revenue does not mean OpenAI has truly started making money. Against OpenAI's investment costs, $1 billion is barely a sprinkle. And once commercialization is fully under way, as the user base grows and GPT-4 research keeps demanding more computing power, costs will climb with scale, and OpenAI, like many technology companies, may find it hard to escape the curse of "the more revenue grows, the more it loses".

OpenAI's high costs are plain for all to see. According to Light Cone Intelligence, OpenAI's current spending falls mainly into the following buckets:

  1. Talent: OpenAI has 375 permanent employees in San Francisco, most of them big names in machine learning, and their salaries alone come to about $200 million a year. According to a survey by a foreign salary website, the median pay for an OpenAI software engineer is $920,000.
  2. Training: Reportedly, a single training run of GPT-3 cost $4.6 million, with a corresponding cloud-resource bill approaching nine figures (hundreds of millions of dollars). According to the semiconductor research firm SemiAnalysis, if OpenAI's cloud computing costs about $1 per A100-hour, a single training run costs roughly $63 million, and that excludes experiments, failed runs, data collection, RLHF, labor and other costs (a back-of-envelope version of this arithmetic follows the list).
  3. Inference and operations: Citing Forbes, for large language models deployed "at any reasonable scale", operating (inference) costs far exceed training costs; in fact, ChatGPT's weekly inference bill outweighs its training cost.
  4. Investments: According to The Information, early in the year OpenAI had invested in at least 16 companies through a $100 million venture fund backed by Microsoft and other investors, and its accelerator Converge had invested in 10 companies. Light Cone Intelligence also found that in the first half of this year OpenAI made three public investments in its own corporate name, joining established companies such as Microsoft, Google, and Nvidia that had gone down this road before it.
  5. Acquisitions: On August 17, OpenAI announced the acquisition of a game company called Global Illumination, reportedly its first public acquisition since its founding in 2015.
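As promised in item 2, here is a back-of-envelope sketch of the training-cost arithmetic. It simply reverses the cited assumption: at about $1 per A100-hour, a $63 million training run implies tens of millions of A100-hours. Every number here comes from the estimates quoted above, not from disclosed OpenAI data; the 90-day figure is only an illustrative assumption.

```python
# Back-of-envelope check of the training-cost figures cited in item 2 above.
# All inputs are the article's cited estimates, not disclosed OpenAI data.
A100_HOURLY_COST = 1.0            # USD per A100-hour, the SemiAnalysis assumption cited above
SINGLE_RUN_COST = 63_000_000      # USD, the cited estimate for one training run

# At $1 per A100-hour, that spend implies this many A100-hours of compute:
implied_a100_hours = SINGLE_RUN_COST / A100_HOURLY_COST
print(f"Implied compute: ~{implied_a100_hours / 1e6:.0f} million A100-hours")

# Equivalently, roughly how many A100s running around the clock for an assumed 90 days:
implied_gpu_count = implied_a100_hours / (24 * 90)
print(f"~{implied_gpu_count:,.0f} A100s running continuously for 90 days")
```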


According to public information, OpenAI has taken in more than $15 billion in investment since its founding, money that goes to filling the hole dug by the high cost of training and developing large models.

On the road to AGI, OpenAI genuinely needs money, but burning cash is a bottomless pit, and bleeding money does not necessarily buy growth. That is exactly why OpenAI wants to speed up commercialization.

But revenue is not the same as profit. Founder Securities has worked through ChatGPT's key metrics using public data, and its analysis points out that the broad logic of OpenAI's path to profitability is to raise the paying ratio on GPT-4 while cutting the cost of GPT-3.5, which is OpenAI's main source of cost.

With GPT-3.5 costs compressed, breakeven may be achievable if the ratio of daily to monthly active users reaches 35% and the monthly paying rate exceeds 12%. With costs reduced on both the GPT-3.5 and GPT-4 models, losses could be reversed if the monthly paying rate rises by 0.5 percentage points a month.

As of July 12, 2023, daily visits to the ChatGPT website had held roughly flat at just over 50 million. As of June 19, 2023, the ChatGPT iOS app averaged 946,000 daily active users in the United States over its first 30 days.

According to data.ai, between May 21 and June 19, 2023, the ChatGPT iOS app averaged about 946,400 daily active users in the United States, with a cumulative total of roughly 41,300 paying users; the paying rate against daily actives (paying users ÷ daily active users) is therefore about 4.36% (4.13 ÷ 94.64, figures in tens of thousands). Questmobile data puts the Baidu app's ratio of daily to monthly active users at about 37%. Assuming the same 37% ratio for ChatGPT, monthly active users come to about 2.5578 million (94.64 ÷ 37%), and the monthly paying rate (paying users ÷ monthly active users) is roughly 1.61% (4.13 ÷ 255.78).
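These ratios are easier to follow as a short calculation. The sketch below only re-derives the percentages above from the data.ai and Questmobile figures already quoted; the 37% ratio is borrowed from the Baidu app as a proxy assumption, exactly as in the text.

```python
# Re-deriving the paying-rate estimates from the figures quoted above.
# Units are "wan" (10,000 users), matching the bracketed arithmetic in the text.
daily_active_users = 94.64   # average US DAU of the ChatGPT iOS app, May 21 - Jun 19, 2023
paying_users = 4.13          # cumulative paying users over the same window
dau_mau_ratio = 0.37         # Baidu app's DAU/MAU ratio, borrowed here as a proxy assumption

daily_paying_rate = paying_users / daily_active_users        # ~4.36%
monthly_active_users = daily_active_users / dau_mau_ratio    # ~255.78 wan, i.e. ~2.56 million
monthly_paying_rate = paying_users / monthly_active_users    # ~1.61%

print(f"daily paying rate    ~ {daily_paying_rate:.2%}")
print(f"monthly active users ~ {monthly_active_users:.2f} wan")
print(f"monthly paying rate  ~ {monthly_paying_rate:.2%}")
```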

These numbers show that OpenAI's commercialization is still a difficult, long-term undertaking. Only by lifting the paying ratio to a certain level can it reach breakeven; revenue alone does not tell the whole story.

03 OpenAI and Microsoft "grab food"

It is encouraging that the business model is starting to run, but running it surfaces problems of its own. A once "hidden worry" has been pushed to the foreground: the delicate relationship with Microsoft.

The premise to be clear about is that however sweet the honeymoon between OpenAI and Microsoft has been, they are two independent organizations. Their cooperation is more unusual than a typical acquisition or investment, and Nadella even went so far as to shut down some businesses to build an AI supercomputing center for OpenAI. But once OpenAI commercializes independently, the two are carving up the same cake, and friction becomes unavoidable.

Unlike rivals who compete head-on, OpenAI and Microsoft are bound together more closely, which makes things more complicated.

In 2019, OpenAI moved from a pure non-profit to a hybrid structure, one arm of which, OpenAI LP, is responsible for commercializing the products developed by the OpenAI research lab. The same year, Microsoft invested $1 billion in the partnership, with Microsoft Azure's AI supercomputing serving as the infrastructure for training GPT models. The partnership deepened further when Microsoft invested another $10 billion in OpenAI in early 2023.

(OpenAI LP, a for-profit entity established by OpenAI)

According to a person familiar with the terms of the partnership, once OpenAI has repaid its first investors, Microsoft will receive 75 percent of profits until its principal investment is recouped, and 49 percent thereafter until a theoretical cap is hit. People familiar with the matter have also told foreign media that, starting around 2025, the profit cap will rise by 20% each year rather than remaining a fixed hard ceiling on investors' returns. Investors familiar with the deal say Microsoft would effectively own more than a third of the company.

Since March this year, Light Cone Intelligence has heard from many companies that there are two roads to using ChatGPT's capabilities: the first is to call OpenAI's API directly and pay OpenAI by the number of tokens consumed; the second is to consume OpenAI's services on Azure, on top of public-cloud computing power. Compared with the former, Azure's service is more secure and comes with more complete supporting facilities at the same price, which is exactly the "difference" that Microsoft's cloud salespeople keep emphasizing.
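As a rough sketch of the two roads, the snippet below shows how the same openai Python SDK (~0.27) can point either at OpenAI directly or at an Azure OpenAI deployment; every key, endpoint, and deployment name is a hypothetical placeholder, not a real credential or resource.

```python
# Sketch of the two routes described above, with the openai Python SDK (~0.27).
# Every key, endpoint, and deployment name here is a hypothetical placeholder.
import openai

# Route 1: call OpenAI directly and pay OpenAI per token consumed.
openai.api_key = "sk-..."  # hypothetical OpenAI key
resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)

# Route 2: consume the same capability through Azure OpenAI Service,
# billed on the Azure subscription alongside the rest of the cloud spend.
openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # hypothetical Azure endpoint
openai.api_version = "2023-05-15"
openai.api_key = "<azure-key>"  # hypothetical Azure key
resp_azure = openai.ChatCompletion.create(
    engine="my-gpt35-deployment",  # Azure addresses a named deployment rather than a model
    messages=[{"role": "user", "content": "Hello"}],
)
```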

Even so, the survey found that a large number of companies still choose to call the API directly. One reason is that it is simple and convenient, and the pay-as-you-go model is more cost-effective, which matters most to individual developers and small and medium-sized enterprises. Interestingly, the decision logic is completely different for the two routes: a developer can decide to call an API on their own, whereas buying cloud services has to be approved up the chain and weighed against the broader business.

As noted above, outsiders have seen the two as complementary: Microsoft supplies OpenAI with money, resources, and technical support, and OpenAI helps Microsoft return to the front rank of technology giants. But as OpenAI's commercialization has progressed through 2023, that picture has been changing.

In June of this year, according to "The Information," an internal document from Microsoft directed Azure salespeople to tell customers that Microsoft can provide more services than OpenAI; And so on, to defend. This is the first time the subtle rift between the two sides has been shown to the outside world.

In brutal market competition, a clash of interests between the two is inevitable. A sales pitch by itself proves nothing, but the strategy behind it does say something.

For example, in March this year OpenAI signed contracts with companies such as Snap and Instacart first; only a week later did Microsoft's cloud service release a preview of its ChatGPT feature, and OpenAI's paying customers gained access to GPT-4 before Microsoft's cloud service did.

The technology sits in OpenAI's hands and has to be shared with Microsoft, but there are no specific rules on when, at what point, and to what extent it is shared. OpenAI wants to use this time lag to grab some flagship customers.

Put bluntly, how OpenAI manages the relationship with Microsoft, how the two sides draw boundaries, set standards, and find ways of cooperating and a point of balance, could, if handled carelessly, become a crisis for both.

Still, in the business world there are no permanent friends or enemies, only permanent interests. With operators as shrewd as Altman and Nadella, people would rather watch them conjure another miracle than see them fall into the old cliché of tearing each other down.

Welcome to follow Light Cone Intelligence and get more cutting-edge knowledge of science and technology!


Origin blog.csdn.net/GZZN2019/article/details/132607635