Large tech companies have been jumping into the semiconductor industry. Amazon, Google, Tesla, and others have developed chips for use in their own operations instead of buying all of their semiconductors from Nvidia, Intel, and the like. Custom chips tailored to a company's specific requirements can perform better and cost less than off-the-shelf chips bought on the open market.
In the case of AI server chips, companies are undoubtedly looking to save money by developing alternatives to Nvidia's chips; its A100 GPUs can sell for $20,000 to $25,000 each on eBay. The costs add up quickly: OpenAI, for example, will need more than 30,000 of Nvidia's A100 GPUs to commercialize ChatGPT, an April 18 article at TheVerge.com reported.
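To put those figures in perspective, here is a rough back-of-the-envelope calculation, a simple illustration using the GPU count and eBay prices cited above rather than a figure from the article itself:

```python
# Rough, illustrative math using the figures cited above
# (30,000 A100 GPUs at $20,000-$25,000 each); not a number from the article.
gpus_needed = 30_000
price_low, price_high = 20_000, 25_000  # dollars per A100 GPU

low_total = gpus_needed * price_low    # $600,000,000
high_total = gpus_needed * price_high  # $750,000,000
print(f"Estimated GPU bill: ${low_total/1e6:,.0f}M to ${high_total/1e6:,.0f}M")
```

In other words, the GPUs alone would represent a hardware bill in the range of $600 million to $750 million, which is why an in-house alternative is so appealing.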
Here’s a look at the progress some tech companies are making in designing their own chips:
(1) Amazon chips in.
Earlier this week, Amazon said it will invest up to $4 billion in Anthropic, an artificial intelligence (AI) firm whose chatbot is called "Claude 2." Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, and it will use AWS-designed semiconductors to train its AI models on vast amounts of data.
Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models. The two companies will also collaborate on the development of future Trainium and Inferentia technology. The two chips are considered a less expensive, more accessible alternative to Nvidia chips used for the same purposes.
Amazon jump-started its chip development efforts in 2015 when it purchased Annapurna Labs, an Israeli startup. Since then, it has produced Graviton and Nitro, chips used in its servers. Now Amazon has an AI package to offer clients: in addition to Anthropic, it can offer its Trainium and Inferentia chips; Titan, a large language model; and Bedrock, a service that helps developers enhance software using generative AI. Some believe that having its own AI chips, which Microsoft does not have, will become a differentiator for Amazon, an excellent August 21 CNBC article on Amazon's efforts reported.
(2) Google has AI chips too.
Google has custom-developed Tensor Processing Units (TPUs), chips designed to accelerate machine learning tasks like image recognition, natural language processing, and predictive analysis. Only Google Cloud customers can access the chips.
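For readers curious what "accessing" a TPU looks like in practice, below is a minimal sketch assuming a Google Cloud TPU VM with the JAX library installed; the calls shown are standard JAX functions, but the environment setup is an assumption and will vary:

```python
# Minimal sketch of running a computation on a Cloud TPU with JAX.
# Assumes a Google Cloud TPU VM with jax[tpu] installed; on other machines
# the same code simply falls back to whatever CPU/GPU devices are present.
import jax
import jax.numpy as jnp

print(jax.devices())  # lists the accelerator cores JAX can see (TPU cores on a TPU VM)

@jax.jit              # compiles the function through XLA for the attached accelerator
def matmul(a, b):
    return jnp.dot(a, b)

a = jnp.ones((1024, 1024))
b = jnp.ones((1024, 1024))
print(matmul(a, b).shape)  # (1024, 1024), computed on the accelerator
```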
Google has also developed Tensor chips for its Pixel phones in conjunction with Samsung. Google is working to design its first fully custom chipset, the Tensor G5, by 2025 without Samsung's aid, a July 7 Tom's Guide article reported. TSMC would handle production of the chip.