Microsoft Introduces Phi-3: A Family of Lightweight Language Models

Microsoft has unveiled three new AI models in the Phi-3 family. The smallest, Phi-3 Mini, is already available on several platforms and is light enough to run on a smartphone.

The Phi-3 Mini model has 3.8 billion parameters and was trained on a collection of children’s stories generated by a large language model. It is now available on Azure, Hugging Face, and Ollama. Microsoft also plans to release two larger models: Phi-3 Small (7 billion parameters) and Phi-3 Medium (14 billion parameters).
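As a concrete illustration of the Hugging Face availability mentioned above, the sketch below loads the model with the transformers library and generates a short reply. The repo id microsoft/Phi-3-mini-4k-instruct and the generation settings are assumptions added for illustration, not details from the article.

```python
# Minimal sketch: loading Phi-3 Mini from Hugging Face with the transformers library.
# The repo id "microsoft/Phi-3-mini-4k-instruct" and the generation settings below
# are assumptions for illustration; the article only states that the model is
# available on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place weights on GPU/CPU automatically (requires accelerate)
    trust_remote_code=True,  # older transformers versions need the model's custom code
)

# Build a chat-style prompt and generate a short reply.
messages = [{"role": "user", "content": "Summarize what a small language model is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=120)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```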

Comparison of Phi-3 Family Models with Other Models in Various Benchmarks

Microsoft released the Phi-2 small language model in December 2023, and it performed well against larger models such as Llama 2. Microsoft now says Phi-3 outperforms its predecessor and can deliver responses comparable to those of models ten times its size.

Phi-3 Mini Trained on Simple Children’s Books

Developers trained Phi-3 Mini on simple books, inspired by the way children learn from bedtime stories that convey big ideas with simple words and sentence structures. However, there are not enough existing children’s books to train an AI model, so they gave a large language model a list of simple words and asked it to generate new stories for training Phi-3 Mini.
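To make that approach more concrete, here is a minimal sketch of this kind of synthetic-data loop, under stated assumptions: the word list, the prompt wording, and the choice of gpt2 as the generator are stand-ins added for illustration; the article does not describe Microsoft’s actual pipeline, and the model it refers to is far larger than the one used here.

```python
# Minimal sketch of the synthetic-data idea described above: prompt an existing
# language model with a few simple words and ask it to write a short story.
# The word list, prompt wording, and generator model are illustrative assumptions;
# "gpt2" is only a small stand-in for the much larger model the article refers to.
import random
from transformers import pipeline, set_seed

# A tiny stand-in for a vocabulary of simple words.
simple_words = ["moon", "boat", "dog", "river", "kite", "apple", "song", "star"]

generator = pipeline("text-generation", model="gpt2")
set_seed(0)  # make the sampled output repeatable

def make_story(num_words: int = 4, max_new_tokens: int = 150) -> str:
    """Ask the generator to write a simple story using randomly chosen words."""
    words = random.sample(simple_words, num_words)
    prompt = (
        "Write a short children's story with very simple sentences that uses "
        f"all of these words: {', '.join(words)}. Story:"
    )
    result = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    return result[0]["generated_text"]

if __name__ == "__main__":
    print(make_story())
```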

Small AI Models for Edge AI

Small AI models are well suited to running AI cheaply on personal devices such as phones and laptops. Earlier this year, Microsoft also said it was building a team focused specifically on lighter-weight AI models. In addition to the Phi family, the company has developed Orca-Math, a model that specializes in solving math problems.