Microsoft releases Phi-3, a new artificial intelligence model designed for cellphones that was trained using children's stories.

Microsoft has unveiled Phi-3, a lightweight yet capable AI model. Children's bedtime stories inspired the approach used to train it.

Apr 24, 2024 - 15:36

Microsoft has been a leader in the race for artificial intelligence. The IT giant led by Satya Nadella has been pouring resources into AI, as evidenced by its billion-dollar investments in OpenAI and the launch of its own chatbot tools. The company has now unveiled a new AI model that developers may find useful: Phi-3, a model that is lighter than existing LLMs while reportedly delivering comparable performance. What's more, the idea behind Phi-3's training originated with bedtime stories for kids. Yes, you read that correctly.

In a blog post, Microsoft described the Phi-3 family as "the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across a variety of language, reasoning, coding, and math benchmarks."

So far, the company has introduced Phi-3 Mini, the smallest model in a series that will also include Phi-3 Small and Phi-3 Medium, each progressively larger while retaining efficiency. Smaller AI models can handle tasks such as document summarization, coding assistance, and powering simple chatbots more efficiently, and they are easier to deploy across a variety of platforms.

Phi-3 Mini, which runs on 3.8 billion parameters, is intended to strike a balance between cost and capability, making it suitable for laptops and smartphones with limited processing power. Microsoft says that by using fewer parameters, it hopes to lower the cost of AI and make it more accessible, particularly for companies that want to integrate AI without heavy computing requirements or that work with smaller datasets.

In addition, Microsoft stated in its blog post that Phi-3 Mini is easily integrated and widely compatible, as it is optimized for Nvidia's GPUs and accessible on platforms such as Hugging Face, Azure, and Ollama.
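For developers curious what that integration looks like in practice, here is a minimal sketch of loading Phi-3 Mini through the Hugging Face transformers library. The model identifier microsoft/Phi-3-mini-4k-instruct, the prompt, and the generation settings are assumptions for illustration, not details from Microsoft's announcement.

```python
# Minimal sketch: loading and prompting Phi-3 Mini via Hugging Face transformers.
# Assumes `pip install transformers torch accelerate` and an assumed model ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face model identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick an appropriate precision automatically
    device_map="auto",    # place the model on a GPU if one is available
)

prompt = "Summarize in one sentence: Phi-3 Mini is a small language model from Microsoft."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```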

Microsoft also explained how children's literature inspired Phi-3's training. In a separate blog post, the company noted that large language models typically learn the nuances of language, and how to respond intelligently, by ingesting enormous volumes of internet data. But Microsoft researchers, led by Vice President of Generative AI Research Sebastien Bubeck, put forward a different idea: rather than relying solely on raw web data, they sought out exceptionally high-quality data.

The blog post also recounted how Ronen Eldan of Microsoft, while reading bedtime stories to his daughter, wondered how she was able to make connections between the words in the book. Partly motivated by that bedtime routine, Microsoft researchers decided to handpick a dedicated dataset for Phi-3's training.

They gave a large language model a list of 3,000 carefully chosen words, with a balance of nouns, verbs, and adjectives, and asked it to write children's stories. Each tale was built around one noun, one verb, and one adjective from the list. Repeating this process millions of times over several days produced millions of short children's stories, which were then used to train Phi-3.
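As a rough illustration of that data-generation loop, the sketch below samples one noun, one verb, and one adjective and prompts a language model for a story. The word lists, the prompt wording, and the ask_llm() callable are hypothetical placeholders, not Microsoft's actual pipeline.

```python
import random

# Hypothetical sketch of the story-generation loop described above.
# The tiny word lists stand in for the curated ~3,000-word vocabulary,
# and ask_llm is a placeholder for a call to a large language model.
NOUNS = ["dog", "river", "lamp"]
VERBS = ["jumps", "whispers", "builds"]
ADJECTIVES = ["brave", "shiny", "quiet"]

def make_story(ask_llm, noun, verb, adjective):
    prompt = (
        "Write a short, simple children's story that uses "
        f"the noun '{noun}', the verb '{verb}', and the adjective '{adjective}'."
    )
    return ask_llm(prompt)

def build_dataset(ask_llm, num_stories):
    stories = []
    for _ in range(num_stories):
        # Each story is seeded with one word of each type, drawn at random.
        stories.append(
            make_story(
                ask_llm,
                random.choice(NOUNS),
                random.choice(VERBS),
                random.choice(ADJECTIVES),
            )
        )
    return stories
```

Run at scale, a loop of this shape would yield the millions of short, varied stories the article describes.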

Eric Boyd, corporate vice president of Microsoft Azure AI Platform, told The Verge, "We took a list of more than 3,000 words and asked an LLM to make 'children's books' to teach Phi because there aren't enough children's books out there."
