In 1965, Gordon Moore, then an engineer at Fairchild Semiconductor in the United States, was invited to write an article entitled "Cramming More Components onto Integrated Circuits," an economic prediction of how integrated circuits would develop, published in Electronics magazine.
The prediction was later refined, went on to guide the development of the global integrated circuit industry for nearly six decades, and had a profound impact on the entire information technology industry.
Predicting an uncertain future with definite methods is among humanity's oldest pursuits, but it is also an undertaking full of risk and challenge. When companies that once held leading technology yet went bankrupt in the torrent of the times review their lessons, one item is almost never missing: a misjudgment of market trends.
The just-concluded 2022 was turbulent enough, and 2023 is destined to hold even more uncertainty. We have never been more eager to grasp the trajectory of science and technology accurately and to hedge against unknown risks through qualitative prediction.
On January 11, Alibaba's DAMO Academy released its top ten technology trends for 2023: multimodal pre-trained large models, chiplets, storage-computing integration (compute-in-memory), cloud-native security, cloud computing architectures that fuse software and hardware, predictable networks built on end-network convergence, dual-engine intelligent decision-making, computational optical imaging, large-scale urban digital twins, and generative AI.
Among these, chiplets, storage-computing integration, and software-hardware integrated cloud computing architectures offer a reference direction for the computing-power industry in the post-Moore era.
Chiplet interconnection standards are gradually being unified, and the chip R&D process is being restructured
DAMO Academy listed the "paradigm reset" triggered by chiplets, a breakthrough at the foundations of chip design, as one of its top ten technology trends for 2023.
The report pointed out that in the post-Moore era, chiplets may be the most realistic technical path out of the current predicament: they reduce dependence on leading-edge process nodes while achieving performance close to them, and they restructure the chip R&D process, reshaping the chip industry landscape across the board, from manufacturing to packaging and testing, from EDA tools to IP design.
In the chip world, the chiplet, variously translated into Chinese as "core particle" or "small chip," is no longer a new and unfamiliar term. It can be understood as "deconstruction, reconstruction, and reuse" at the silicon level.
Today's mainstream single-die system-on-chip (SoC) uses lithography to fabricate, on the same die, multiple computing units responsible for different kinds of computing tasks. The chiplet approach instead decomposes the traditional SoC into separate dies, fabricates each with the most suitable process, and then assembles them with 2.5D or 3D advanced packaging, so the whole system no longer needs to be monolithically manufactured on one die with an advanced process.
It is not hard to see that chiplets change the chip manufacturing flow. Die-to-die interconnection, along with the electromagnetic interference, signal integrity, heat dissipation, and many other complex physical problems introduced by 2.5D and 3D advanced packaging, must now be considered at design time, which in turn places new requirements on EDA tools.
Precisely because chiplet technology enables the reuse of IP blocks and lets each die adopt the process best suited to it, it has become a system-level design approach through which many chip makers balance economy and performance in new products.
AMD shipped Zen 2 cores built on chiplet technology in the Ryzen 3000 series released in 2019, which consumers embraced for its cost-effectiveness, and the Instinct MI300, AMD's latest data-center APU unveiled at CES 2023, likewise adopts a chiplet design with 13 dies. Intel also uses the technology: its Ponte Vecchio GPU integrates 47 chiplets. And at the 2021 Yunqi Conference, the Yitian 710 released by Alibaba's chip unit T-Head integrated 60 billion transistors using 2.5D packaging and achieved a significant gain in energy efficiency.
The most striking example is the M1 Ultra, created by "bonding" two M1 Max chips together at Apple's event in March 2022. Apple claimed it surpassed Intel's flagship CPU, the i9-12900K, and Nvidia's performance-ceiling GPU, the RTX 3090, showing the industry directly how chiplets can cut costs and raise efficiency.
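A back-of-the-envelope yield calculation makes the cost logic concrete. The sketch below is purely illustrative: the defect density and die areas are assumed numbers, not figures for any product named above, and packaging cost and interconnect overhead are ignored.

```python
import math

# Classic Poisson yield model: probability that a die of a given area
# contains zero fabrication defects. D0 is an assumed defect density.
D0 = 0.1  # defects per cm^2 (illustrative assumption)

def die_yield(area_cm2: float) -> float:
    return math.exp(-area_cm2 * D0)

# Monolithic design: one 8 cm^2 SoC. A single defect scraps the whole die,
# so the silicon consumed per *good* chip is area / yield.
mono_silicon = 8.0 / die_yield(8.0)            # ~17.8 cm^2 per good SoC

# Chiplet design: four 2 cm^2 dies, each tested before assembly
# ("known good die"), so a defect only scraps one small die.
chiplet_silicon = 4 * (2.0 / die_yield(2.0))   # ~9.8 cm^2 per good set

print(f"monolithic: {mono_silicon:.1f} cm^2 of wafer per good unit")
print(f"chiplets:   {chiplet_silicon:.1f} cm^2 of wafer per good unit")
```

Under these assumed numbers, the same total silicon costs nearly half as much when defects can be screened out die by die; on top of that, each die can use an older, cheaper process wherever leading-edge performance is not needed.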
Chiplets have already moved from the laboratory into industry, but until recently each company defined its own interfaces and protocols, a fragmented scene reminiscent of the Spring and Autumn and Warring States periods. The lack of a unified standard made it difficult to unleash chiplets' true potential.
In March 2022, Intel, AMD, TSMC, and other chip companies jointly founded an industry consortium for chiplet interconnection and drew up the UCIe 1.0 (Universal Chiplet Interconnect Express) standard. It is expected to solve the problem that has long constrained chiplet development, namely high-quality die-to-die interconnection, marking the real arrival of the chiplet era.
Wang Hongbo, founder of the semiconductor equipment company Huafeng Technology, once framed the significance of the UCIe standard this way: the chiplet era is, in effect, shrinking the ecosystem-building logic of the PC era down into the chip. As a combination of dies, a chiplet design relies on the UCIe standard to conveniently combine advanced chip designs from different companies within one package, and it is in this way that an ecosystem is established and the whole industry pushed forward.
Qin Xi, a member of DAMO Academy's 2023 Top Ten Technology Trends project team, told Leifeng.com: "Homogeneous and heterogeneous chiplets will coexist for a long time, and the mixing of different process nodes is one of the phenomena that every manufacturer, and indeed the whole industry, should pay attention to. Going forward, enterprises will need to choose chiplet technology according to their own needs."
Storage-computing integration is triggering a change in computing architecture, and will be commercialized at scale in vertical fields
Breaking through the separation of storage and computing in the von Neumann architecture has been a research topic in academia and industry for many years.
In the 1990s, academia proposed the concept of "storage-computing integration": by fusing computing units with storage units, data could be stored and computed on in the same place, improving computing efficiency as one way past the limits of the von Neumann architecture.
However, because application scenarios were lacking at the time and the economics of Moore's Law still held, architectural innovation mattered little to the industry, and research on storage-computing integration advanced slowly for decades.
It was not until the rise of artificial intelligence and the faltering of Moore's Law in recent years that storage-computing integration truly entered the public eye.
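The motivation is easy to quantify. The sketch below uses order-of-magnitude per-operation energy figures of the kind often quoted in the computer-architecture literature; the exact values are assumptions for illustration, not measurements.

```python
# Energy budget for one multiply-accumulate (MAC) whose operand must be
# fetched from off-chip DRAM, vs. computing where the data already sits.
# All per-operation energies below are rough assumed literature values.
E_MAC_PJ = 4.0          # ~pJ for a 32-bit multiply (assumed)
E_SRAM_READ_PJ = 5.0    # ~pJ for a small on-chip SRAM read (assumed)
E_DRAM_READ_PJ = 640.0  # ~pJ for a 32-bit off-chip DRAM read (assumed)

von_neumann = E_DRAM_READ_PJ + E_MAC_PJ  # fetch operand, then compute
in_memory = E_SRAM_READ_PJ + E_MAC_PJ    # operand already lives by the ALU

print(f"fetch-from-DRAM MAC:        {von_neumann:6.1f} pJ")
print(f"compute-in/near-memory MAC: {in_memory:6.1f} pJ")
print(f"share spent just moving data: {E_DRAM_READ_PJ / von_neumann:.0%}")
```

On these assumed numbers, roughly 99% of the energy of a DRAM-fed operation goes to moving the data rather than computing on it, which is exactly the waste that AI workloads made impossible to ignore.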
Take the past year: in January, Samsung published in the top journal Nature the world's first research on in-memory computing based on MRAM (magnetoresistive random-access memory); in February, SK Hynix published its latest results on DRAM in-memory computing with a GDDR interface; and TSMC co-published six papers on in-memory-computing memory IP at ISSCC.
Beyond that, start-ups such as Mythic, Syntiant, Zhicun Technology, and Flash Semiconductor have poured into the field, and financing keeps coming. DAMO Academy has developed the world's first DRAM-based 3D-bonded stacked storage-computing chip, and the domestic start-up Zhicun Technology officially mass-produced the world's first in-memory computing SoC in March 2022.
Storage-computing integration has developed into three mainstream branches: near-memory computing, which uses 2.5D or 3D advanced packaging to package compute logic dies together with memory; in-memory computing, which keeps traditional storage media but builds independent computing units into the memory so that computation can be completed there directly; and new memory devices, based on emerging non-volatile memory technologies, that let computation and storage happen simultaneously in more radically new ways.
Of these, near-memory computing is relatively mature and already widely used in various CPUs and GPUs, in-memory computing is drawing comparatively heavy investment, and new memory devices are still in the exploratory stage.
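To make the in-memory idea less abstract, here is a minimal numerical sketch of the analog crossbar scheme behind many flash- and NVM-based designs: weights are stored as cell conductances, so a matrix-vector multiply is performed by the physics of the array itself (each column current is the sum of voltage times conductance). The matrix size, cell precision, and noise level below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Logical weights of one neural-network layer, and one input vector.
weights = rng.normal(size=(64, 64))
x = rng.normal(size=64)

# "Programming": weights are stored as cell conductances with finite
# precision, here quantized to 16 levels (~4-bit cells, an assumption).
levels = 16
w_max = np.abs(weights).max()
conductance = np.round(weights / w_max * (levels / 2)) * (2 * w_max / levels)

# "Reading": each column sums current in a single analog step, which *is*
# the multiply-accumulate. Analog readout adds noise a digital MAC lacks.
read_noise = rng.normal(scale=0.01, size=64)
y_analog = conductance @ x + read_noise

# Reference digital result, to see the accuracy cost of computing in memory.
y_digital = weights @ x
err = np.linalg.norm(y_analog - y_digital) / np.linalg.norm(y_digital)
print(f"relative error vs. digital MAC: {err:.3%}")
```

The sketch exposes the real trade-off: the multiply-accumulate costs essentially no data movement, but precision is bounded by cell levels and read noise, which is why low-power, modest-precision workloads are the natural first market.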
DAMO Academy's prediction: driven by the twin wheels of capital and industry, in-memory computing based on mature memories such as SRAM and NOR Flash will see large-scale commercial use in vertical fields. Low-power, small-compute scenarios are expected to be the first to get upgraded products and ecosystems, while large-compute general-purpose scenarios may enter the initial stage of productization.
To be clear, storage-computing integration is still an emerging technology. Its testing standards, mass-production approaches, and computing paradigms differ completely from existing ones, and it still has a long way to go before becoming mainstream.
"In-memory computing, which subverts the von Neumann architecture, will be put into practice, but it may only be a transitional technology. For the industry, while paying attention to technologies that can be industrialized and applied at scale today, we must also watch what may become the mainstream technology trend of the next decade," Qin Xi said.
Software-hardware integrated cloud computing architectures comprehensively accelerate cloud applications
Over the past decade or so, cloud computing has gone through two waves of innovation: distributed architecture and resource pooling.
In the first stage, distributed architectures replaced mainframes to meet enterprises' computing-power needs of the time;
in the second stage, resource pooling let computing, storage, and network resources scale on demand, breaking through bottlenecks of scale and stability and enabling ultra-large-scale cloud computing services.
However, as data-intensive computing scenarios spread, users demand ever lower latency and higher bandwidth, while the traditional CPU-centric architecture must handle logic control on top of computing tasks, producing large compute and network-transmission delays and capping bandwidth. Accelerated computing chips such as the CIPU, IPU, and DPU have emerged in response, and cloud computing has entered its third stage.
"The DPU was born of the imbalance between the growth of bandwidth and the growth of computing performance. Five to ten years ago, CPU performance grew about 30% a year; in the last three years it has grown less than 3% a year. Network bandwidth, meanwhile, still grows about 35% a year. The ratio of processing-performance growth to bandwidth growth has gone from roughly 1:1 to roughly 1:10 today," commented Yan Guihai, CEO of Zhongke Yushu.
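Compounding the rates in the quote shows how quickly that gap opens. A small arithmetic sketch: the 3% and 35% annual rates come from the quote above, while the ten-year horizon is an assumption for illustration.

```python
# Compound the quoted annual growth rates to see the CPU/bandwidth gap.
CPU_GROWTH = 1.03  # ~3% per year, per the quote
BW_GROWTH = 1.35   # ~35% per year, per the quote

cpu, bw = 1.0, 1.0
for year in range(1, 11):
    cpu *= CPU_GROWTH
    bw *= BW_GROWTH
    print(f"year {year:2d}: CPU x{cpu:4.2f}  bandwidth x{bw:6.2f}  gap {bw / cpu:5.1f}x")

# After 10 years, bandwidth has grown ~20x while CPU performance has grown
# only ~1.3x: each CPU cycle must shepherd ~15x more bits, which is exactly
# the packet-processing work a DPU offloads from the host CPU.
```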
Seen another way, this is the result of CPU performance gains hitting a ceiling as Moore's Law slows, unable to keep up with the explosive growth in demand for computing power. The software-hardware integrated cloud computing architecture is an industry innovation triggered by that technological convergence.
According to DAMO Academy's 2023 Top Ten Technology Trends report, the CIPU-centric cloud computing architecture has achieved three main engineering breakthroughs:
First, comprehensive hardware acceleration from restructuring the underlying hardware. Second, an innovative eRDMA implementation of full-link hardware acceleration, which supports large-scale networking and lets users benefit from the acceleration without modifying workload code, making high-performance computing on the cloud a broadly accessible service. Third, flexible pairing of CIPUs and servers, whether one-to-many or many-to-one, efficiently matching the differing east-west traffic ratios of different computing scenarios on the cloud.
In recent years, traditional chip giants, cloud service providers, and start-ups alike have rushed into this space. Nvidia launched the BlueField-2 DPU as early as 2020; Amazon has long run systems on its self-developed DPU; Google Cloud is cooperating with Intel on IPUs; and Baidu, ByteDance, and Tencent Cloud have all joined the ranks of self-developed DPUs. In June 2022, Alibaba Cloud released its own CIPU, and for a time DPU start-ups became a hot commodity in investors' eyes.
Viewed across the whole industry chain, cloud computing's deep evolution toward a new architecture centered on the CIPU/IPU/DPU is an unstoppable technological trend.
"Continuous innovation in the foundational technology of cloud computing architecture is pushing basic computing power on the cloud far beyond that of offline servers. As long as enterprises move to the cloud, they can tap these ever-expanding resource and system dividends at low cost through cloud computing resources and cloud services," said Jiang Linquan, head of the Alibaba Cloud Research Institute and of the Alibaba Cloud Shenlong computing platform.
Five consecutive years of technology trend reports, with several predictions already verified by the market
Chiplets, storage-computing integration, and software-hardware integrated cloud computing architectures are DAMO Academy's predictions for the post-Moore era. The underlying logic behind them rests not only on "the failing of Moore's Law" but, more broadly, on the ebb and flow and the recurring patterns of the great industrial revolutions.
Qin Xi observed that each of history's three industrial revolutions has had a clearly marked first and second half, each lasting roughly 50 years: in the first half, a wave of new technologies emerges; in the second, those technologies fuse with traditional industries and solidify.
"Cloud computing, for example, is a synthesis of the technologies produced in the first half of the whole information revolution, and the second half may be kicked off by quantum computing. It is very hard to judge whether a technology can be applied in the short term, and getting more than half of such predictions right is rare," Qin Xi said.
This year marks the fifth consecutive year that DAMO Academy has published its top ten technology trends, and many earlier predictions have been accurately verified by the market and widely accepted; its judgment on large models, in particular, has been affirmed by the industry.
One cannot help looking forward to seeing in which fields the world depicted by the 2023 top ten technology trends will become reality.
Source: https://www.leiphone.com/category/chips/w81WAkoYg6gqgA5r.html