Microsoft introduces Phi-3 Mini, a miniature AI model
Microsoft has launched Phi-3 Mini, a new miniature AI model designed for small businesses. It competes with GPT-3.5 and has impressive specifications: 3.8 billion parameters and a training corpus of 3.3 trillion tokens.

Microsoft has launched its new mini AI model, Phi-3 Mini, and it is already available for use.
About Phi-3 Mini
Despite its compact size, Phi-3 Mini includes 3.8 billion parameters and was trained on a data set of 3.3 trillion tokens. The model builds on its predecessors: Phi-1 focused on programming, Phi-2 began to master logical reasoning, and Phi-3 improves on both coding and reasoning skills.
The developers of Phi-3 also used a training method inspired by the way children learn complex topics through stories: part of the training data consisted of children's books and fairy-tale-style texts generated by other AI models.
Microsoft says mini models like Phi-3 Mini are ideal for small businesses. In the near future, the company plans to expand the range by releasing Phi-3 Small with 7 billion parameters and Phi-3 Medium with 14 billion parameters.
Market for small AI models
Unlike their larger counterparts, miniature AI models are cheaper to run and perform more efficiently on personal devices such as smartphones and laptops.
Microsoft's direct competitors also have their own versions of smaller AI models, most of which are designed to perform simpler tasks, such as analyzing documents or assisting with coding.
Google's Gemma 2B and 7B models are aimed at basic chatbot and text-translation tasks. Anthropic's Claude 3 Haiku can quickly process and summarize large research documents, including graphs. And the recently introduced Llama 3 8B from Meta can be used for some chatbots and for programming assistance.