New startup mATX competes with Nvidia in AI chips

Nvidia has a new competitor. mATX specializes in developing hardware tailored for LLMs. The company's chips make it possible to train GPT-4 and run ChatGPT on the budget of a small startup.

The new startup mATX is taking on Nvidia. It specializes in developing LLM-optimized hardware to increase computing power.

mATX chips make it possible to train GPT-4 and run ChatGPT even on a small startup's budget. Unlike general-purpose designs that treat all models equally, mATX dedicates every transistor to maximizing performance on large models.

mATX chips are designed to use resources efficiently when handling large volumes of training data and production inference. They reach peak performance on workloads such as large transformer-based models with 20B+ parameters and inference serving thousands of concurrent users.
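To put those workload figures in perspective, here is a rough back-of-the-envelope sketch in Python. The numbers (user count, tokens per second, fp16 weights) are illustrative assumptions, not mATX specifications; the sketch only uses the standard approximations that weights occupy roughly params × bytes-per-parameter of memory and that generating one token costs roughly 2 × params FLOPs.

```python
# Back-of-the-envelope sizing for serving a large dense transformer.
# All figures are illustrative assumptions, not mATX specifications.

def transformer_serving_estimate(params_billion: float = 20.0,
                                 concurrent_users: int = 2000,
                                 tokens_per_sec_per_user: float = 10.0,
                                 bytes_per_param: int = 2) -> dict:
    """Rough memory and compute demand for serving a dense transformer."""
    params = params_billion * 1e9
    weight_memory_gb = params * bytes_per_param / 1e9      # fp16/bf16 weights
    flops_per_token = 2 * params                           # ~2 FLOPs per parameter per token
    total_tokens_per_sec = concurrent_users * tokens_per_sec_per_user
    required_tflops = flops_per_token * total_tokens_per_sec / 1e12
    return {
        "weight_memory_gb": round(weight_memory_gb, 1),
        "required_tflops": round(required_tflops, 1),
    }


if __name__ == "__main__":
    # Example: a 20B-parameter model serving 2,000 users at 10 tokens/s each.
    print(transformer_serving_estimate())
    # -> {'weight_memory_gb': 40.0, 'required_tflops': 800.0}
```

Even under these modest assumptions, a single 20B-parameter model needs tens of gigabytes just for weights and hundreds of teraFLOPs of sustained throughput, which is the kind of workload mATX says its chips are built for.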

The startup's team consists of specialists who previously worked on ML chips, ML compilers, and LLMs at companies such as Google and Amazon. The CEO focused on efficiency for Google's PaLM, where he developed and deployed what the company describes as the world's fastest LLM inference software, while the CTO worked on the architecture of one of Google's TPU ML chips.

The company has already raised $25 million from the CEOs of Auradine and Replit, and it has received additional funding from leading AI and LLM researchers.
