News

Microsoft is doubling down on small models: on Tuesday it launched Phi-3 Mini, the newest entry in its lightweight Phi family and the first of three small models the company says it will release in the coming months. Phi-3 Mini measures 3.8 billion parameters.
Microsoft on Tuesday began publicly sharing Phi-3, an update to its small language model family that it says can handle many tasks previously thought to require far larger models.
Microsoft announced that its Phi-3.5-Mini-Instruct model, the latest update to its Phi-3 model family, is now available. The Phi family is Microsoft's line of compact models, small enough to run locally on resource-constrained devices.
Recognizing that bigger isn't always better, Microsoft has released Phi-3, its newest family of small open AI models, which the company says "are the most capable and cost-effective small language models" available.
Phi-3.5-MoE, a 42-billion-parameter Mixture-of-Experts model, supports tasks such as the extraction of insights from unstructured data. More broadly, Microsoft will launch the AI21 Jamba 1.5 Large and Jamba 1.5 models on Azure AI.
Notably, Phi-4 is the first Phi-series model to launch following the departure of Sébastien Bubeck, previously one of Microsoft's vice presidents of AI and a key figure in the company's small language model research, who left Microsoft to join OpenAI.
Microsoft has launched a series of small AI reasoning models aimed at math, science, and coding applications. Among them, the Phi-4-mini reasoning model has 3.8 billion parameters and was trained on around a million synthetic math problems.
Microsoft Corp. has developed a small language model that can solve certain math problems better than models several times its size. The company revealed the model, Phi-4, on Thursday.
Phi-4 is Microsoft's latest small language model, coming in at 14 billion parameters, and it will compete with other small models such as GPT-4o mini, Gemini 2.0 Flash, and Claude 3.5.
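A recurring point across these items is that the Phi models are small enough to run locally. As a rough, non-authoritative sketch, the Python snippet below loads a Phi-3 Mini instruct checkpoint with the Hugging Face transformers library; the model id "microsoft/Phi-3-mini-4k-instruct" refers to the publicly listed repository, and the prompt and generation settings are illustrative assumptions rather than anything prescribed in the coverage above.

# Minimal sketch: running a Phi-3 Mini instruct checkpoint locally with
# Hugging Face transformers. Prompt and generation settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # publicly listed Hugging Face repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # use float16/bfloat16 when the hardware supports it
    device_map="auto",       # place weights on a GPU if one is available
    trust_remote_code=True,  # Phi-3 checkpoints ship custom model code
)

# Build a chat-formatted prompt using the model's own chat template.
messages = [
    {"role": "user", "content": "Explain in two sentences why small language models are useful."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

The same pattern applies to the other Phi checkpoints by swapping in a different model id; on a machine without a GPU the code still runs on CPU, only more slowly.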