
Microsoft Unveils Phi-3 Family, Game-Changer for SLMs

Microsoft’s Phi-3 family of small language models (SLMs) represents a significant advancement in artificial intelligence (AI). These cutting-edge models deliver better performance at lower cost than current SLMs of comparable size.

Traditionally, AI research has focused heavily on large language models (LLMs). Because they excel at complex tasks such as analyzing large volumes of scientific literature, these models are well suited to applications like targeted drug discovery. However, their enormous scale has drawbacks, including high processing costs and limited suitability for on-device deployment.

Microsoft recognizes that a more diverse range of AI models is needed. According to Sonali Yadav, Microsoft’s Principal Product Manager for Generative AI, “We’re moving towards a future where customers can choose the most appropriate model for their specific needs, not just rely on a single category.” Phi-3 is an essential stride in this direction.

Microsoft Phi-3: Compact Size, Massive Capacity

The Microsoft Phi-3 series includes multiple models; Phi-3-mini was the first to be made available. Despite its small size of 3.8 billion parameters, Phi-3-mini beats models twice its size on language, coding, and math benchmarks. Microsoft’s creative training strategy is responsible for this outstanding result.
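
For readers who want to try the model themselves, here is a minimal sketch of loading Phi-3-mini and generating a short reply. It assumes the Hugging Face transformers library and the publicly hosted microsoft/Phi-3-mini-4k-instruct checkpoint; adjust the settings for your own hardware.

```python
# Minimal sketch: query Phi-3-mini with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # publicly released Phi-3-mini checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")  # move to GPU if available

# Build a chat-style prompt and generate a short completion.
messages = [{"role": "user", "content": "Summarize why small language models matter."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=120)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```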

Whereas typical models are trained on large volumes of raw web data, Microsoft’s Phi-3 models are trained on specially selected datasets. Sebastien Bubeck, the vice president of AI at Microsoft, is leading this strategy, which prioritizes quality over quantity.

Bedtime stories provided an unexpected source of inspiration for this strategy. According to Bubeck, the team asked why not focus on high-quality data instead of simply using raw web data. The result was the ‘TinyStories’ dataset, which consists of millions of short stories created by prompting a large model with only the words a four-year-old child would recognize. Amazingly, a tiny model trained on TinyStories could generate coherent, grammatically correct stories.
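
The TinyStories idea can be illustrated with a rough sketch: prompt a large model using only a child-level vocabulary and collect the short stories it returns. The generate_story call below is a hypothetical placeholder for any large-model API, not Microsoft’s actual pipeline, and the vocabulary list is illustrative.

```python
# Illustrative sketch of TinyStories-style data generation: constrain the prompt
# to a child-level vocabulary and collect the resulting short stories.
import random

CHILD_VOCAB = ["dog", "ball", "happy", "run", "friend", "sun", "play", "little"]

def generate_story(prompt: str) -> str:
    """Hypothetical stand-in for a call to a large language model (not a real API)."""
    raise NotImplementedError

def build_dataset(num_stories: int) -> list[str]:
    stories = []
    for _ in range(num_stories):
        # Seed each prompt with a few simple words to keep the vocabulary constrained.
        seed_words = random.sample(CHILD_VOCAB, 3)
        prompt = (
            "Write a short bedtime story that only uses words a four-year-old "
            f"would understand, and that includes the words: {', '.join(seed_words)}."
        )
        stories.append(generate_story(prompt))
    return stories
```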

Building on this achievement, the Microsoft team created ‘CodeTextbook,’ a dataset built from high-quality web content vetted for educational value. To ensure quality, the data went through several rounds of prompting, generation, and filtering by larger AI models as well as human reviewers.

“A significant amount of effort goes into creating these synthetic datasets,” says Bubeck. “We curate the data carefully and don’t include everything that is produced.” This careful approach to data selection is largely responsible for the Phi-3 models’ higher performance.
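
The generate-and-filter loop described above can be sketched roughly as follows. The functions draft_with_large_model and score_quality are hypothetical stand-ins for the larger models and human review used in practice; the threshold and number of rounds are arbitrary.

```python
# Illustrative generate-then-filter loop for curating a textbook-style dataset.
QUALITY_THRESHOLD = 0.8  # arbitrary cutoff for this sketch

def draft_with_large_model(source_text: str) -> str:
    """Hypothetical: rewrite raw web text into textbook-style prose."""
    raise NotImplementedError

def score_quality(candidate: str) -> float:
    """Hypothetical: return an educational-quality score between 0 and 1."""
    raise NotImplementedError

def curate(raw_documents: list[str], rounds: int = 3) -> list[str]:
    kept = list(raw_documents)
    for _ in range(rounds):
        # Each round: regenerate candidates, then keep only those that pass the filter.
        candidates = [draft_with_large_model(doc) for doc in kept]
        kept = [c for c in candidates if score_quality(c) >= QUALITY_THRESHOLD]
    return kept
```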

Benefits of the Small Model Size

On-Device Deployment: Because of their smaller size, Phi-3 models can run directly on devices, enabling low-latency AI experiences without an internet connection. This opens the door to smart sensors, cameras, farming equipment, and similar hardware (a minimal local-inference sketch follows this list).

Enhanced Privacy: By keeping data locally on the device rather than sending it to the cloud, on-device deployment also helps protect data privacy.
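
As a rough illustration of on-device use, the sketch below runs a quantized GGUF build of Phi-3-mini locally with the llama-cpp-python library. The model file name, thread count, and chat formatting are assumptions to adapt to your own setup.

```python
# Minimal local-inference sketch using llama-cpp-python and a quantized Phi-3-mini build.
from llama_cpp import Llama

llm = Llama(
    model_path="Phi-3-mini-4k-instruct-q4.gguf",  # assumed local GGUF file name
    n_ctx=4096,    # context window
    n_threads=4,   # CPU threads; tune for the target device
)

# Phi-3 instruct-style chat formatting, generated entirely on the local machine.
result = llm(
    "<|user|>\nWhat can a small language model do on a farm sensor?<|end|>\n<|assistant|>\n",
    max_tokens=128,
    stop=["<|end|>"],
)
print(result["choices"][0]["text"])
```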

Even with Phi-3’s many advantages, Microsoft remains dedicated to ethical AI development. According to a blog post, “Mitigating risks associated with generative AI models is a core principle for Microsoft.” To safeguard Phi-3, Microsoft has applied a multi-layered strategy in line with its established protocols for all AI models.

The release of Microsoft Phi-3 marks substantial progress for Microsoft and the AI community at large. By offering a range of models, from robust LLMs to small, efficient SLMs, Microsoft enables users to select the best solution for their requirements. This flexibility, together with Microsoft’s dedication to ethical AI development, opens the door to a more significant and inclusive future for AI.
