Meta Unveils Four New Chips to Power Its AI and Recommendation Systems

Meta has unveiled four new chips designed to handle tasks such as training and running AI models and recommendations across its social media and other services.

The new chips are part of Meta’s Meta Training and Inference Accelerator (MTIA) family and are designed for use in data centers. Meta has been designing its own silicon for a few years now, mainly as a way to reduce the cost of powering its AI and recommendation systems. The company says it needs custom chips to keep up with demand for AI-driven services.

Google, Amazon and Microsoft have also been building their own AI chips as a way to avoid relying on components from other companies and to optimize their machine learning data centers. A recent article about the global shortage of AI chips underscores this point, explaining that “tech companies are in a huge race to find computing power to keep up with the growing demands of artificial intelligence models.” The bottom line of all this is that whoever has the best AI infrastructure may end up owning the future of AI.

What the chips do

MTIA chips are designed to handle two main workloads. Training is the computationally intensive process of fitting an AI model to a dataset. Inference is the process of using an already-trained model to make predictions in real time. Meta’s custom chips are optimized for inference, which is not surprising, since the company’s main products revolve around recommendation algorithms.
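To make the training-versus-inference distinction concrete, here is a minimal, purely illustrative sketch (not Meta’s actual code, and nothing like the scale of a real recommendation model): training is an expensive loop that repeatedly adjusts parameters, while inference is a single cheap pass using the frozen parameters.

```python
def train(xs, ys, lr=0.05, steps=2000):
    """Training: a compute-heavy loop that fits parameters w, b to data."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def infer(w, b, x):
    """Inference: one fast forward pass with frozen parameters."""
    return w * x + b

# Fit y = 2x on a tiny dataset, then predict for an unseen input.
w, b = train([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
print(infer(w, b, 4.0))  # close to 8.0
```

The asymmetry in cost is the point: training runs the loop thousands of times over the whole dataset, while serving a prediction is one arithmetic pass, which is why a company can justify separate silicon tuned for inference.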

Every time you like or comment on a post or scroll past a video, an AI model makes predictions about what you might want to see next. Analysts often say recommendations are among the strongest AI use cases in the world. To see how they work across social media, check out this recent story about AI recommendation algorithms. Optimizing those workloads can be the difference between a fast and slow application.
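A common way to sketch the kind of prediction described above is embedding-based scoring: represent the user and each candidate post as vectors, score each post by a dot product, and show the highest-scoring items first. The names, embeddings, and three-dimensional vectors below are hypothetical; production ranking systems are vastly larger and more complex.

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def rank_posts(user_embedding, candidate_posts):
    """Return candidate post IDs ordered from best to worst match."""
    scored = [(dot(user_embedding, emb), post_id)
              for post_id, emb in candidate_posts]
    return [post_id for _, post_id in sorted(scored, reverse=True)]

# Hypothetical embeddings, e.g. learned from likes, comments, and watch time.
user = [0.9, 0.1, 0.3]
posts = [
    ("cooking_video", [0.8, 0.2, 0.1]),
    ("sports_clip",   [0.1, 0.9, 0.2]),
    ("travel_photo",  [0.4, 0.3, 0.8]),
]
print(rank_posts(user, posts))  # cooking_video ranks first for this user
```

Every scroll triggers this kind of scoring over many candidates at once, which is why inference throughput, not just model quality, determines whether the feed feels fast or slow.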

Why it matters

In a way, the details of the chips are secondary to the most important trend: AI is no longer just about software, it’s about computing power. To build advanced AI models, you need custom-built chips, large amounts of power and large data centers. Companies that can get a handle on that infrastructure get a huge advantage over everyone else.

Meta’s foray into custom chips is a sign that the next phase of the AI wars may be waged not just in AI research but in semiconductor design. Some analysts think that if companies can build their own advanced hardware stacks, they will be able to significantly reduce their costs and accelerate the deployment of AI in all kinds of applications, from recommendations to voice assistants to the embedded digital world of the metaverse.

At this point, Meta’s announcement of four new chips may seem like minor details in the grand story of AI. But ask the people who work on these things, and they’ll tell you otherwise: Sometimes the key to unlocking AI isn’t in the algorithms, it’s etched into the silicon itself.
