AI accelerator chips have been the subject of considerable hype, but how big is this market really, and which companies are actually selling AI chips today? Two new reports from ABI Research provide a detailed analysis of the state of the AI chipset market. In them, Lian Jye Su, principal analyst at ABI Research, discusses the companies and technologies entering this potentially lucrative market.
Artificial intelligence in the cloud
The first report, "Cloud AI Chipsets: Market Landscape and Vendor Positioning," highlights the rapid growth of cloud AI inference and training services, on the strength of which ABI Research predicts the AI chipset market will grow from $4.2 billion in 2019 to $10 billion in 2024. The current leaders in this space, Nvidia and Intel, are being challenged by companies such as Cambricon Technologies, Graphcore, Habana Labs, and Qualcomm.
According to Su, Nvidia remains the clear leader in this market, largely due to Nvidia's mature developer ecosystem and first-mover advantage.
"As AI models, libraries, and toolkits continue to change and update, Nvidia is a great choice because of its ability to offer general-purpose AI chipsets. Of course, these advantages will diminish as the market matures, but Nvidia will remain in a strong position for at least the foreseeable future."
Today's cloud AI chipset market can be divided into three segments: public cloud service providers such as AWS, Microsoft, Google, Alibaba, Baidu, and Tencent; enterprise data centers, or private clouds; and hybrid clouds that combine the two (VMware, Rackspace, NetApp, HPE, Dell).
The report also identifies another emerging market segment: the telecom cloud, meaning cloud infrastructure deployed by telcos for their core network, IT, and edge computing workloads.
Su said this new market segment presents a huge opportunity for AI chipset manufacturers.
"We've seen network infrastructure vendors like Huawei, as well as Nokia, launch ASICs that are optimized for telecom network functions. It's a huge market, and one that Nvidia has been trying to get into lately."
Total annual sales of AI chipsets from 2017 to 2024 (Source: ABI Research)
Although Su does not believe other vendors will displace Nvidia's dominant position in cloud AI training anytime soon, cloud AI inference is not dominated by any one vendor, partly because inference workloads differ in nature from vertical to vertical. ASICs are expected to see strong growth in that segment starting in 2020, he said.
For now, the trend of moving AI inference to edge devices means that devices like smartphones, self-driving cars, and robots rely less on the cloud, but that doesn't mean inference workloads, which some cloud service providers see as larger than training workloads, will decrease, Su said.
"Some AI will never go to the edge, such as chatbots and conversational AI, fraud monitoring, and cybersecurity systems. These systems will evolve from rule-based systems to deep-learning-based AI systems, which will actually increase inference workloads by enough to offset the ones moving to the edge."
In addition, Google's TPU handles both training and inference in the cloud and is seen as a strong challenger to CPU and GPU technologies (dominated by Intel and Nvidia, respectively). As the report states, Google's success with TPUs provides a blueprint for other cloud service providers (CSPs) developing their own AI accelerator ASICs; Huawei, AWS, and Baidu are already on the move.
If cloud service providers are using their own chipsets, is there room for other chipset providers in this segment?
"This is extremely challenging for CSPs who are just starting out with their own chipsets, and we even predict that the CSP market will decline by 15 to 18 percent by 2024." The opportunity, he said, lies more in the private data center space. Banking institutions, healthcare organizations, R&D labs, and academia will still need to run AI, and they will consider chipsets that are better optimized for AI workloads, which gives an advantage to newcomers like Cerebras, Graphcore, Habana Labs, and Wave Computing.
Other beneficiaries of these trends are IP core licensing vendors such as ARM, Cadence, and VeriSilicon, which will help companies that are just beginning their own chipset development with chip design.
Artificial intelligence at the edge
ABI's second report, "Edge AI Chipsets: Technology Outlook and Use Cases," states that the market for edge AI inference chipsets was $1.9 billion in 2018, while the market for edge training was just $1.4 million.
What applications are trained at the edge today? They include, Su explains, gateways (historical databases or device hubs) and on-premises servers (in a private cloud, but physically located close to where the AI data is generated). Chipsets designed for training on on-premises servers include Nvidia's DGX systems, Huawei's gateways and servers built around the Ascend 910 chipset, and system-level products for on-premises data centers from Cerebras Systems, Graphcore, and Habana Labs.
"The 'edge training' market is still small because the cloud is still the top choice for AI training," Su said.
Total annual sales revenue of AI chipsets for inference and training, 2017-2024 (Source: ABI Research)
Edge AI inference is the main driver of the edge AI market's 31% CAGR between 2019 and 2024. Su mentioned three major markets (smartphones/wearables, automotive, smart home/white goods) as well as three niches.
The first niche is robotics, which, because of its reliance on multiple types of neural networks, often requires heterogeneous computing architectures: SLAM (simultaneous localization and mapping) for navigation, conversational AI for human-machine interfaces, and machine vision for object detection, all of which use CPUs, GPUs, and ASICs to varying degrees. Nvidia, Intel, and Qualcomm are currently competing fiercely in this field.
The second niche market is smart industrial applications in manufacturing, smart buildings, and oil and gas. We see FPGA vendors excel in this space because of legacy devices, but also because of the flexibility and adaptability of FPGA architectures.
The final niche is the "very edge," where ultra-low-power AI chipsets are embedded in sensors and other small endpoints on wide-area networks. Given the focus on ultra-low power consumption, this space is dominated by FPGA vendors, RISC-V designs, and ASIC vendors.
So who is leading the field of edge AI inference so far?
"Unexpectedly, or perhaps expectedly, smartphone AI ASIC manufacturers such as Apple, HiSilicon, Qualcomm, Samsung, and MediaTek lead this field, because smartphone shipment volumes are so large. Among startups, I think Hailo, Horizon Robotics, and Rockchip seem to be gaining quite a bit of momentum with end-device manufacturers."
Su also said that software will be critical to the commercial implementation and deployment of edge AI chipsets, and that Nvidia is upgrading compilation tools and building a community of developers, in contrast to Intel and Xilinx's strategy of partnering with startups or acquiring software-based acceleration solutions.
"Chipset manufacturers should consider making toolkits and libraries available to the developer community through developer training programs, competitions, forums, and conferences, as this can attract developers to build relevant applications with them; none of this comes easily to startups."
The report concludes that, in addition to providing the right software and support for the developer community, vendors should offer a clear development roadmap, support the rest of the technology value chain, and ensure their chips address large-scale use cases at competitive prices.
Artificial intelligence has become an important driving force in today's chip industry. Looking back at AI's development within the semiconductor industry, a clear evolutionary path runs from the cloud to the device.
Initially, AI was deployed mainly as a cloud service. This generation of AI is built on big data and neural networks, so it needs enormous computing power during training and further compute support when deployed in the cloud. GPU-accelerated AI, represented by Nvidia, therefore became the focus of attention in cloud AI, while dedicated cloud AI chip companies such as Graphcore and Habana emerged to compete with the GPU. After 2018, as models and chip designs were optimized, AI gradually moved down from the cloud to capable smart devices such as mobile phones. On-device applications built on AI algorithms, such as super-resolution, beautification, and face recognition, have gained mainstream acceptance, and the corresponding chips (IP) have become an indispensable part of the mobile phone SoC.
With the growth of big data and improvements in computing power, artificial intelligence has seen a new wave of explosive growth over the past two years. Realizing AI relies on three elements: algorithms are the core, hardware and data are the foundation, and chips are the most important part of the hardware. AI computation comprises two processes: training and inference.
Why do we need AI chips? As neural network applications have proliferated, traditional CPUs can no longer keep up with the geometric growth in computation. Deep learning, a branch of machine learning, is the mainstream approach in current AI research. Put simply, it uses mathematical methods to approximate the human brain's neural networks and trains machines on large amounts of data to imitate the brain's learning process; in essence, it turns traditional algorithmic problems into data and computation problems. The requirements for the underlying chip have therefore changed fundamentally: AI chips are designed not to execute instruction streams but to train on and compute over large amounts of data.
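As a minimal illustration of the two processes distinguished above, the NumPy sketch below trains a one-neuron model by gradient descent (the compute-heavy training phase) and then runs a single forward pass with the frozen weights (inference). The toy dataset and model are assumptions for illustration only, not anything from the reports; the point is that both phases reduce to the matrix arithmetic that AI chips accelerate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = 3x + 1 from noisy samples.
X = rng.uniform(-1, 1, size=(256, 1))
y = 3 * X + 1 + rng.normal(0, 0.05, size=(256, 1))

# A one-neuron "network": weight w and bias b.
w, b = np.zeros((1, 1)), np.zeros(1)

# Training: many forward passes plus gradient updates over the
# whole dataset -- the data-hungry phase usually done in the cloud.
lr = 0.5
for _ in range(200):
    pred = X @ w + b                  # forward pass (a matrix multiply)
    err = pred - y
    w -= lr * (X.T @ err) / len(X)    # gradient step on the weight
    b -= lr * err.mean(axis=0)        # gradient step on the bias

# Inference: one forward pass with frozen weights -- far cheaper
# per query, which is why it can move to edge devices.
print(w.item(), b.item())             # close to 3 and 1
print((np.array([[0.5]]) @ w + b).item())  # close to 2.5
```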
With the rapid development of deep learning, supported by the massive data and efficient computing power of the Internet and cloud computing era, AI technologies such as computer vision, speech, and natural language understanding have made breakthrough progress, unlocking AI scenarios in many industries, generating huge business value, and driving the growth of the AI industry. As these technologies grow more complex, demand for AI computing power will keep rising.
Market size
At present, China's artificial intelligence chip industry is still in its infancy. Data show that from 2016 to 2019, the market size of China's smart chips grew from 1.9 billion yuan to 5.61 billion yuan, a compound annual growth rate of 43.5%, and the market is expected to grow further to 8.63 billion yuan in 2021. The future market prospects of the AI chip industry are considerable.
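The 43.5% figure can be sanity-checked from the two endpoint values; a quick sketch (the helper name `cagr` is purely illustrative):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1 / years) - 1

# China smart-chip market: 1.9 billion yuan in 2016 -> 5.61 billion in 2019.
growth = cagr(1.9, 5.61, 2019 - 2016)
print(f"{growth:.1%}")  # 43.5%, matching the figure cited above
```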
Data source: China Business Industry Research Institute
Cloud high-performance chips mainly serve the centralized computing needs of data centers in AI computing and are used chiefly in intelligent server products and cloud AI computing scenarios. According to IDC research data, China's smart server market was about $1.95 billion in 2019, with a compound annual growth rate of 27.09% from 2018 to 2023, and the market is expected to reach $3.18 billion by 2021.
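Conversely, a quoted CAGR can be rolled forward to check a projection; a quick sketch using the figures above (the helper name `project` is purely illustrative):

```python
def project(value, rate, years):
    """Grow `value` at compound annual `rate` for `years` years."""
    return value * (1 + rate) ** years

# China smart-server market: $1.95B in 2019 growing at a 27.09% CAGR.
est_2021 = project(1.95, 0.2709, 2021 - 2019)
print(f"${est_2021:.2f}B")  # $3.15B, close to the $3.18B cited above
```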
Data source: China Business Industry Research Institute
AI applications such as smart cities and smart business require small-scale processing and analysis at the edge, and deep-learning-based edge devices such as intelligent NVRs and XVRs need the dedicated computing power of AI chips. In China's NVR market, the penetration rate of smart chips is significantly higher than in end-perception devices. Data show that the NVR product market was about 5.29 billion yuan in 2019, of which intelligent NVR products accounted for a penetration rate of about 23.5%; by 2021, the NVR market is expected to reach about 9.38 billion yuan.
Data source: China Business Industry Research Institute
Future development prospects
1. Policy support promotes industrial development
In recent years, the state has attached great importance to the development of artificial intelligence and the chip industry; both have become national strategic emerging industries, and supporting policies have been introduced. In 2020, the State Council issued Several Policies to Promote the High-Quality Development of the Integrated Circuit and Software Industry, which states that the integrated circuit and software industries are the core of the information industry and a key force leading the new round of scientific and technological revolution and industrial change, and which clarifies preferential tax policies for enterprises and projects the state encourages in order to promote industrial development.
2. Technological progress promotes industrial upgrading
With continuous improvements in chip computing power, rapid progress in AI compute products spanning chips, cloud, edge, and terminal devices provides high-performance, high-density computing support for the large-scale deployment of AI solutions. The rapid progress of AI technology, in turn, is the prerequisite for industry applications: it has unlocked AI scenarios in many industries, driven the transformation and efficiency gains of traditional industries, and generated huge commercial value.
3. Steady growth in market demand continues to drive the AI industry
As AI technology has developed in the smart-city field, "understanding" has gradually replaced "seeing, seeing far, and seeing clearly" as the main factor driving the industry forward. Through the combined effect of vigorous state construction in 2016 and 2017 and active innovation by enterprises, smart cities entered a stage of rapid deployment in 2018, and the technology began to truly empower the industry. Computer vision, speech, and natural language understanding have become the main forces advancing smart cities, and as the market continues to broaden, new application scenarios based on AI technology will keep emerging.
At present, China's AI chip industry is in the infancy of its life cycle, mainly because the domestic industry started late. The overall market is on the eve of a rapid-growth stage: application scenarios served by traditional chips are gradually being taken over by dedicated AI chips, and demand for AI chips will grow with cloud and edge computing, smartphones, and Internet of Things products. During this period, many domestic enterprises have released their own dedicated AI chips.
Although domestic AI chips are gradually replacing traditional chips, integrators and chip companies are still exploring new models of cooperation so as to better grasp the needs of new customers.