Wang Chuan: Thoughts on ChatGPT (1)

Original link: https://chuan.us/archives/886?utm_source=rss&utm_medium=rss&utm_campaign=%25e7%258e%258b%25e5%25b7%259d%25ef%25bc%259a%25e5%2585%25b3%25e4%25ba%258e-chatgpt-%25e7%259a%2584%25e9%259a%258f%25e6%2583%25b3-%25ef%25bc%2588%25e4%25b8%2580%25ef%25bc%2589

This article is a continuation of "Wang Chuan: Thoughts on GPT-3 (1)".

1/ Over the past three months, most people have heard news of one kind or another about ChatGPT. In case you are not familiar with it: ChatGPT is an AI-based tool for human-machine dialogue in natural language, and it can help you:

Write poems, novels, and plays in earnest; help students with homework and papers;

Write a polished business plan; write all kinds of promotional articles;

Produce high-quality code and user-interface designs directly from natural-language instructions;

Pass the medical licensing and bar exams;

Translate books of tens of thousands of words from English into other languages; in places the translation quality is said to have surpassed that of human translators.

Derived applications include uploading a PDF document and having the tool automatically extract information, provide a summary, quickly answer all sorts of questions about the document, and so on.

2/ This new tool is evolving so fast that whatever functions and cost estimates an article describes today will, with high probability, have been surpassed within a few months.

3/ Synthesizing information from various sources, the technical improvement from GPT-3 to ChatGPT lies mainly not in an increase in training parameters but in special training for dialogue, using a technique called Reinforcement Learning from Human Feedback (RLHF). In short, each time the model generates text, human feedback is used as the performance measure to optimize the model. The first challenge of RLHF in practice is that human feedback is relatively expensive; the second is that different people may give different feedback on the same output, leaving the language model at a loss. Using social media properly can also be seen as applying RLHF to yourself: the more followers you have, the lower your cost of obtaining feedback, so the cost of training yourself is lower and your progress is faster. (Of course, one should take care to promptly block unfriendly feedback and harassment, which carry no value.)
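
The RLHF loop described above can be sketched as a toy reinforcement loop. This is a deliberately simplified illustration, not OpenAI's actual training code: the canned replies, reward values, and update rule are all invented for the example.

```python
import random

# Toy sketch of the RLHF loop: a "policy" chooses among canned replies,
# simulated human feedback scores each reply, and the policy's weights
# are nudged toward higher-scored replies.

replies = ["curt answer", "helpful detailed answer", "off-topic rambling"]
weights = [1.0, 1.0, 1.0]          # the policy's preference for each reply

def human_feedback(reply):
    # Stand-in for a human rater, who prefers the helpful reply.
    return {"curt answer": 0.2,
            "helpful detailed answer": 1.0,
            "off-topic rambling": 0.0}[reply]

def sample_reply():
    return random.choices(replies, weights=weights, k=1)[0]

random.seed(0)
LEARNING_RATE = 0.5
for step in range(200):
    reply = sample_reply()
    reward = human_feedback(reply)        # the expensive step in practice
    weights[replies.index(reply)] += LEARNING_RATE * reward

best = max(range(len(replies)), key=lambda i: weights[i])
print(replies[best])   # the policy converges on the rewarded behavior
```

The two practical challenges mentioned above show up directly here: every call to `human_feedback` costs money in real systems, and if two raters disagreed on the scores, the weights would be pulled in conflicting directions.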

4/ The computing cost of AI splits into two parts: training cost and inference cost. Training cost is like the cost of educating a child from birth to university graduation at 22; inference cost is like paying the graduate a salary to do work for you. Microsoft/OpenAI have been deliberately vague about the specific training and inference costs of models like ChatGPT, presumably not wanting competitors to learn too many substantive details. From public information, it is known that Microsoft has committed at least 10,000 GPUs to OpenAI's compute, with the hardware worth well over a billion US dollars.

5/ On February 9, an analysis titled "The Inference Cost of Search Disruption – LLM Cost Analysis", with Dylan Patel as first author, appeared online. Apportioning costs over 13 million daily active users, it estimated the cost of OpenAI's inference service: hardware costs of about $694,000 per day (requiring 28,936 GPUs), or roughly 0.36 cents per query. The article argues that if Google offered ordinary users a comparable large-language-model search service, its inference costs would rise by $36 billion per year, which is why it is currently unwilling to do so.
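
The per-day and per-query figures quoted above can be reconciled with a quick back-of-envelope check. The 15-queries-per-user figure below is an assumption needed to make the numbers line up; it is not stated in this article.

```python
# Back-of-envelope check of the quoted SemiAnalysis-style figures.
daily_active_users = 13_000_000
queries_per_user = 15                 # assumed, not stated in the article
hardware_cost_per_day = 694_000       # USD, quoted above

queries_per_day = daily_active_users * queries_per_user
cost_per_query = hardware_cost_per_day / queries_per_day
print(f"{cost_per_query * 100:.2f} cents per query")   # ≈ 0.36 cents
```

The arithmetic confirms the internal consistency of the estimate: about 195 million queries per day at roughly a third of a cent each.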

6/ On the other hand, OpenAI announced new ChatGPT API pricing on March 1: only $0.002 per thousand tokens processed. The reverse inference is that the current marginal cost must be slightly below this price. Estimating the actual cost figures is complex, but one thing is certain: the marginal unit cost will fall every month. Any estimate, by the time you read it, is already outdated and higher than the actual cost.
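
What does $0.002 per thousand tokens imply for the book-translation use case from point 1/? A rough calculation, where the book length and the tokens-per-word ratio are assumptions for illustration:

```python
# Implied cost of processing an entire book at the quoted API price.
price_per_1k_tokens = 0.002    # USD, the price quoted above
words_in_book = 100_000        # assumed book length
tokens_per_word = 1.3          # rough average for English text (assumption)

tokens = words_in_book * tokens_per_word
cost = tokens / 1000 * price_per_1k_tokens
print(f"${cost:.2f}")          # about $0.26 for one pass over the book
```

A fraction of a dollar to process a whole book is what makes "translation quality surpassing human translators in places" economically disruptive rather than merely a curiosity.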

7/ ChatGPT now offers ordinary users a $20-per-month paid service called ChatGPT Plus. But OpenAI's main revenue should come from the ChatGPT API service provided to developers and enterprises. Early enterprise customers include Snapchat, Instacart, and Shopify. The founder of the famous hedge fund Citadel announced a few days ago that he is negotiating with OpenAI to purchase an enterprise-wide software license for the ChatGPT API. Slack, an instant-messaging tool with more than 10 million daily active users, also announced that it has integrated ChatGPT into its own software.

8/ On Twitter, an author named debarghya_das published a very rough model of OpenAI's revenue and profit on March 2. By his calculation, each A100 GPU-hour brings in at least 2.1 times its cost, and 10,000 GPUs would bring in more than $200 million a year. At this stage, OpenAI's main strategy should be to keep cutting prices to attract more developers into its ecosystem and amortize the cost of computing power; profit is relatively secondary.
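
The two figures in that rough model can be reproduced as follows. The $1.10-per-GPU-hour cost is an assumption chosen to make the numbers line up; the original thread may have used different inputs.

```python
# Reconstructing the rough Twitter revenue model quoted above.
gpus = 10_000
hours_per_year = 24 * 365
cost_per_gpu_hour = 1.10       # USD, assumed A100 hourly cost
revenue_multiple = 2.1         # revenue per GPU-hour vs. its cost, per the tweet

annual_revenue = gpus * hours_per_year * cost_per_gpu_hour * revenue_multiple
print(f"${annual_revenue / 1e6:.0f}M per year")   # exceeds the $200M figure
```

Under these assumptions the fleet grosses about $202M a year, consistent with the "more than $200 million" claim.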

9/ A reference data point: AWS's cloud-computing revenue in 2022 was about $80 billion (versus $8 billion in 2015). Some organizations estimate that the global cloud-computing market will reach $1.7 trillion by 2029. As the leader in the AI market, OpenAI's revenue from the ChatGPT API may exceed one billion, five billion dollars, or even more within five years; the ceiling is not yet visible. Another comparison: Microsoft acquired LinkedIn for $26 billion in 2016, and after the acquisition, LinkedIn's annual revenue reached $14 billion in 2022. OpenAI's popularity with users and its potential for product development are obviously far greater than LinkedIn's, and with Microsoft's sales channels, the imaginable revenue and profit potential is naturally much larger. There are two key points here. First, GitHub, the largest developer community with 100 million active users (far more than all competitors), is owned by Microsoft. Second, Microsoft's cloud service Azure holds about 23% of the market, second only to AWS. Developing AI applications on GitHub, calling the ChatGPT API, and then deploying to Azure will become the path of least resistance for most third-party developers.

10/ After OpenAI has accumulated several quarters of revenue data, it may choose to go public sometime in 2024. The current revenue figures are not that important. As long as they show strong market demand and growth, and as long as investment banks are willing to paint a big picture and outline a roadmap to more than $5 billion in revenue, there is a high probability that enough hot money will be willing to foot the bill at 20 times revenue, which means a market value of at least $100 billion.

11/ If OpenAI lists successfully, the returns for the original venture-capital institutions from 2015 may reach 50 to 100 times or more. They will surely use this track record to brag everywhere and raise ever-larger funds from institutional investors, in the billions and tens of billions of dollars, dedicated to investing in artificial intelligence and related applications.

12/ Microsoft is one of OpenAI's shareholders and has a complicated profit-sharing agreement with it, but on the matter of ChatGPT the two can basically be regarded as one. Microsoft will firmly bind this technology to the other tools in its ecosystem, such as Bing, the Edge browser, Office, GitHub, and Azure, to build user habits and dependencies and expand the ecosystem's influence. Microsoft has first-mover, scale-cost, and channel advantages. For most users and software service providers, falling into Microsoft's arms will be the evolutionary direction of least resistance.

13/ Besides Microsoft, Nvidia and TSMC are direct beneficiaries of this wave. Nvidia held more than 80% of the GPU market in 2021 and has a strong CUDA software-development ecosystem. TSMC's share of the global wafer-foundry market exceeded 50% in 2021.

14/ For this generation of the ChatGPT language model, Microsoft invested at least 10,000 GPUs in OpenAI's compute, so without an initial investment of hundreds of millions of dollars, outsiders cannot compete directly with ChatGPT on computing power. And that does not include the cost of acquiring large amounts of training data, or of the software engineers who tune the algorithms and models. ChatGPT itself is also constantly improving. If startups want to enter an arms race with Microsoft, they cannot skimp on GPU investment, and Nvidia's salespeople are sure to give them plenty of encouragement. For AI venture funds, a considerable share of the money will inevitably go to buying Nvidia GPUs, just as a considerable share of web2 venture money went to Google and Facebook for advertising, and a considerable share of web3 venture money was burned as gas on Ethereum.

15/ The essence of artificial intelligence is vast amounts of matrix-multiplication computation, a required topic in any university linear-algebra course. To escalate the arms race, you need to build your own dedicated AI chip with special optimizations in order to beat existing GPUs on computing cost. Google has built its own TPU, and Tesla has Dojo. But surpassing Nvidia, with its $20-billion-plus annual revenue, on cost per unit of computing power requires enormous production scale; this is a game ordinary small companies cannot play. And whoever builds their own dedicated chips will, with high probability, end up in TSMC's arms in the end. The wafer fab TSMC is building in Arizona is sure to be very busy.
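
The matrix multiplication at the heart of all this is, in its barest form, just the operation below. A toy pure-Python sketch with made-up numbers; real AI workloads run billions of far larger multiplications per second on GPUs:

```python
# One layer of a neural network is essentially y = W @ x:
# a weight matrix W multiplied by an input vector x.

def matmul(W, x):
    # Multiply an m-by-n matrix W by a length-n vector x.
    return [sum(W[i][j] * x[j] for j in range(len(x)))
            for i in range(len(W))]

W = [[1, 2],    # 2x2 weight matrix (toy values)
     [3, 4]]
x = [5, 6]      # input vector

print(matmul(W, x))   # [17, 39]
```

Dedicated AI chips like the TPU win by hard-wiring exactly this inner loop of multiply-accumulates into silicon, trading generality for cost per operation.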

16/ The main revenue source for ChatGPT is corporate users with deep pockets. Microsoft's Azure brought in $41 billion of cloud-computing revenue in the second half of 2022, and its cloud market share is about twice Google's. Many ChatGPT functions can greatly improve the efficiency of enterprise users. OpenAI is already working with the consulting firm Bain to help Coca-Cola use ChatGPT technology to improve marketing and operations, and this trend is only just beginning. On one hand, this increases Microsoft's cloud revenue; on the other, it amortizes computing costs, widens Microsoft's advantage over Google in cost per unit of computing power, and further expands its cloud market share. Microsoft's other major segment, "Productivity and Business Processes", had revenue of $33 billion in the second half of 2022, including the Office suite, ERP, LinkedIn, and other services. This segment will also benefit greatly from ChatGPT technology, increasing revenue and helping to share computing costs.

17/ ChatGPT may directly overturn the "search plus click advertising" business model. More than 80% of Google's revenue comes from search advertising. If Google meets Microsoft's challenge head-on with the same approach, the extra computing costs will sharply reduce its profits. Microsoft, by contrast, has a very low share of the search market and can grab search share without worrying about profit at all. Moreover, as ChatGPT's capabilities keep improving, users' online habits may gradually drift away from the search-centric model, eventually marginalizing the search business altogether. Google will be stuck in the awkward position of "damned if it does, damned if it doesn't" (as the Chinese idiom puts it, stick your head out and the knife falls; pull it back and the knife falls anyway).

18/ A successful OpenAI listing will draw in a flood of hot money, creating more demand in the AI field in the short term. Some companies' revenues will grow extremely fast, pushing up valuations and lifting all boats across the industry; for a while, there will be an illusion that "everyone is getting rich, and if I don't join now it will be too late." But most investors in this field will eventually lose money, because: first, the competitive advantage of the companies they invest in is narrow and short-lived, easily overtaken and eliminated by sudden new competitors; second, the invested companies' profits are themselves propped up by the hot money that venture capital brings, and once the inflow of venture funding slows or shrinks, revenues and profits shrink quickly; third, the company's P/E is often already high at the time of investment, pricing in the most optimistic growth for the next four or five years, and once growth falls short of expectations, the market value easily drops sharply.

19/ Investing in high technology does not mean making a fortune; the probability of losing money is actually over 90%. Fortunes are only possible when high technology allows a very small number of companies to achieve complete monopoly. Talking about high technology without discussing whether and how it can confer power and monopoly is "not even wrong". See the author's earlier article:

 Wang Chuan: Looking at Investment from the Evolutionary Mechanism of Power and Monopoly (1)

20/ A very small number of enterprises whose core competitiveness lies outside the AI field, and which cannot be replaced by AI within five to ten years, but which can use AI to greatly improve efficiency, increase revenue, cut costs, and strengthen their core competitiveness and monopoly, will also be among the winners of this AI wave. When the Internet emerged in the late 1990s, traditional consumer companies such as Procter & Gamble, Walmart, and Coca-Cola were such examples. Who the analogous winners are this time remains to be observed.

21/ But the significance of ChatGPT goes far beyond that. Many people in highly skilled professions (programmers, lawyers, game designers, translators, and so on) report efficiency gains of 30% to 80% from using this tool. And that is just the prelude: the tool itself is improving at an accelerating pace, and the efficiency gains across industries will keep combining and compounding.

22/ For comparison: Watt launched a commercially viable steam engine in 1776, and the first steam-powered train appeared in Britain in 1804, 28 years later. According to Wharton Business School professor Ethan Mollick, small factories in the United States in the early nineteenth century generally saw efficiency rise about 25% after adopting steam engines. In 1776, Americans could not possibly have imagined the revolutionary impact that railroads spanning the American continent would have on economic models and social structures nearly a hundred years later. Stanford University was founded in 1885, with its initial funding coming mainly from a railroad tycoon's fortune.

23/ ChatGPT is already having an immediate effect across industries, with efficiency gains larger than the steam engine's, which means AI's impact on human society is destined to far exceed the steam engine's impact in the nineteenth century. And we will not need to wait 28 years for the "train" of the AI era to emerge; it will surpass everyone's wildest imagination.

(to be continued)

———

About the author: Wang Chuan is an investor currently living in Silicon Valley, California. WeChat ID 9935070, Twitter "Svwang1", Sina Weibo "Silicon Valley Wang Chuan", website chuan.us. All articles express the author's personal opinions, are for reference only, and do not constitute investment advice on the assets mentioned. Investment involves risk; enter the market with caution.

How can you brainstorm with ease on the road of investment and entrepreneurship? You are welcome to join Wang Chuan's club, a high-end paid community gathering elites from all walks of life around the world who think independently and bring unique perspectives. For details, please click the link below:

  RTFM – Instructions for Use of Investment Clubs (Second Edition)

Those who are interested in applying for membership, please contact Wang Chuan (WeChat ID: 9935070 or private message “svwang1” on Twitter) directly.
