Microsoft has introduced Phi-3 Mini, the latest addition to its lineup of lightweight AI models. With 3.8 billion parameters, Phi-3 Mini offers a streamlined alternative to large language models (LLMs) such as GPT-4. It is engineered to deliver high-quality AI capabilities on local devices, bypassing the extensive computing power typically required to run LLMs. Here are the key highlights:
Phi-3 Models: Phi-3 is a family of open AI models developed by Microsoft. These models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across various language, reasoning, coding, and math benchmarks.
Phi-3 Mini: Phi-3 mini, a 3.8B-parameter language model, is now available on Microsoft Azure AI Studio, Hugging Face, and Ollama. It comes in two context-length variants: 4K and 128K tokens. Notably, it is the first model in its class to support a context window of up to 128K tokens without compromising quality. Phi-3 mini is instruction-tuned, making it ready to use out of the box. It is optimized for ONNX Runtime with support for Windows DirectML and cross-platform compatibility across GPU, CPU, and mobile hardware. Additionally, it is available as an NVIDIA NIM microservice with a standard API interface that can be deployed anywhere.
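Because Phi-3 mini is instruction-tuned, prompts are expected in its chat format. As a rough illustration, the sketch below renders a list of chat messages using the special tokens shown on the Phi-3 mini model card (`<|user|>`, `<|assistant|>`, `<|end|>`); the helper function name is hypothetical, and in practice you would let the tokenizer's `apply_chat_template` method in the transformers library handle this for you.

```python
# Sketch: rendering a chat prompt in the Phi-3 instruct style.
# The special tokens below follow the template shown on the Phi-3 mini
# model card; verify against tokenizer.apply_chat_template() before
# relying on this in production.

def format_phi3_prompt(messages):
    """Render a list of {"role", "content"} dicts as a single prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # cue the model to generate its reply
    return "".join(parts)

prompt = format_phi3_prompt([
    {"role": "user", "content": "What is ONNX Runtime?"},
])
print(prompt)
```

The resulting string can be passed to any of the deployment targets mentioned above (ONNX Runtime, Ollama, or an NVIDIA NIM endpoint) that accepts raw text prompts.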
Performance: Phi-3 models significantly outperform language models of the same and larger sizes on key benchmarks. Phi-3 mini performs better than models twice its size, and Phi-3-small and Phi-3-medium outperform much larger models, including GPT-3.5T. These numbers are produced with the same pipeline to ensure comparability. However, it's essential to note that Phi-3 models may not perform as well on factual knowledge benchmarks due to their smaller size.
Safety-First Design: Microsoft continues to offer the best models across the quality-cost curve, emphasizing safety-first model design.
In the coming weeks, additional models will be added to the Phi-3 family, providing customers with even more flexibility across the quality-cost curve. Phi-3-small (7B) and Phi-3-medium (14B) will be available in the Azure AI model catalog and other model gardens shortly. Phi-3 Mini represents a groundbreaking advancement in AI scalability, bringing top-tier technology to smartphones and local devices without relying on cloud computing.