Announcing the availability of the o3-mini reasoning model in Microsoft Azure OpenAI Service


We are pleased to announce that OpenAI’s new o3-mini model is now available in Microsoft Azure OpenAI Service. Building on the foundation of o1-mini, o3-mini delivers a new level of efficiency and cost-effectiveness, with enhanced reasoning and new features such as reasoning effort control and tools, while providing comparable or better responsiveness.

o3-mini’s advanced capabilities, combined with its efficiency gains, make it a powerful tool for developers and enterprises looking to optimize their AI applications.

With faster performance and lower latency, o3-mini is designed to handle complex reasoning workloads while maintaining efficiency.

New features of o3-mini

As the successor to OpenAI o1-mini, o3-mini introduces several key features that enhance AI reasoning and customization:

  • Reasoning effort parameter: Lets users adjust the model’s cognitive load with low, medium, and high reasoning levels, providing greater control over response quality and latency (see the Python sketch after this list). 
  • Structured outputs: The model now supports JSON Schema constraints, making it easier to generate well-defined, structured outputs for automated workflows.
  • Functions and Tools support: Like previous models, o3-mini seamlessly integrates with functions and external tools, making it ideal for AI-powered automation. 
  • Developer messages: The “role”: “developer” attribute replaces the system message in previous models, offering more flexible and structured instruction handling.
  • System message compatibility: Azure OpenAI Service maps the legacy system message to developer message to ensure seamless backward compatibility.
  • Continued strength in coding, math, and scientific reasoning: o3-mini further enhances its capabilities in these critical areas, ensuring high performance. 
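
To make the reasoning effort parameter and developer messages concrete, here is a minimal sketch using the openai Python SDK against an Azure OpenAI deployment. The deployment name, environment variable names, and API version shown are assumptions; substitute the values for your own resource.

```python
import os

from openai import AzureOpenAI

# Minimal sketch: assumes an Azure OpenAI resource with a deployment named "o3-mini"
# and credentials exposed through environment variables (names are illustrative).
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-12-01-preview",  # assumption: use the latest version that supports o3-mini
)

response = client.chat.completions.create(
    model="o3-mini",            # Azure deployment name (assumed)
    reasoning_effort="medium",  # "low", "medium", or "high"
    messages=[
        # The developer role replaces the classic system message for o-series models;
        # Azure OpenAI maps a legacy system message to a developer message for you.
        {"role": "developer", "content": "You are a concise assistant for banking analysts."},
        {"role": "user", "content": "List three red flags for wire-transfer fraud."},
    ],
)

print(response.choices[0].message.content)
```

Lower effort favors latency and cost; higher effort lets the model spend more tokens reasoning before it answers.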

With these improvements in speed, control, and cost-efficiency, o3-mini is optimized for enterprise AI solutions, enabling businesses to scale their AI applications efficiently while maintaining precision and reliability. 

From o1-mini to o3-mini: What’s changed? 

o3-mini is the latest reasoning model in the series, with notable differences from o1-mini, which was released last September. While both models share strengths in reasoning, o3-mini adds new capabilities like structured outputs and function and tool calling, resulting in a production-ready model with significant improvements in cost efficiency, as the sketches below illustrate. 
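
As a sketch of the structured outputs capability, the example below constrains the model’s reply with a JSON Schema through the response_format parameter. The client setup mirrors the earlier sketch, and the schema, field names, and sample transaction text are illustrative only.

```python
import os

from openai import AzureOpenAI

# Same client setup as the earlier sketch (endpoint, key, and API version are assumptions).
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-12-01-preview",
)

# Constrain the answer to a JSON Schema so downstream automation can parse it directly.
response = client.chat.completions.create(
    model="o3-mini",            # Azure deployment name (assumed)
    reasoning_effort="low",
    messages=[
        {"role": "developer", "content": "Extract the requested fields from the transaction description."},
        {"role": "user", "content": "Wire of $4,250 from account 1182 to an unverified overseas beneficiary."},
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "transaction_flags",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "amount_usd": {"type": "number"},
                    "account_id": {"type": "string"},
                    "suspicious": {"type": "boolean"},
                    "reason": {"type": "string"},
                },
                "required": ["amount_usd", "account_id", "suspicious", "reason"],
                "additionalProperties": False,
            },
        },
    },
)

print(response.choices[0].message.content)  # a JSON string conforming to the schema
```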

Feature comparison: o3-mini versus o1-mini

Feature                     o1-mini   o3-mini
Reasoning Effort Control    No        Yes (low, medium, high)
Developer Messages          No        Yes
Structured Outputs          No        Yes
Functions/Tools Support     No        Yes
Vision Support              No        No
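
To round out the comparison, here is a sketch of the functions and tools support. The lookup_account_history function, its schema, and the fraud-review scenario are hypothetical, included only to show how a tool is declared and how a tool call is read from the response.

```python
import os

from openai import AzureOpenAI

# Same client setup as the earlier sketches (endpoint, key, and API version are assumptions).
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-12-01-preview",
)

# A hypothetical tool definition for illustration; the name and schema are not part of any real API.
tools = [
    {
        "type": "function",
        "function": {
            "name": "lookup_account_history",
            "description": "Return the most recent transactions for an account.",
            "parameters": {
                "type": "object",
                "properties": {
                    "account_id": {"type": "string"},
                    "limit": {"type": "integer"},
                },
                "required": ["account_id"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="o3-mini",            # Azure deployment name (assumed)
    reasoning_effort="high",
    messages=[
        {"role": "developer", "content": "Use the available tools before deciding whether a transaction looks fraudulent."},
        {"role": "user", "content": "Is a $4,250 overseas wire from account 1182 consistent with its history?"},
    ],
    tools=tools,
)

# If the model chose to call the tool, its name and JSON arguments are returned here
# for your application to execute and feed back in a follow-up request.
for tool_call in response.choices[0].message.tool_calls or []:
    print(tool_call.function.name, tool_call.function.arguments)
```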

A demo accompanying this announcement shows o3-mini in action, helping with banking fraud detection.

Join us on this journey

We invite you to explore the capabilities of o3-mini and see how it can transform your AI applications. With Azure OpenAI Service, you get access to the latest AI innovations, enterprise-grade security, and global compliance, while your data remains private and secure.

Learn more about OpenAI o3-mini in GitHub Copilot and GitHub Models here.

Get started today! Sign up in Azure AI Foundry to access o3-mini and other advanced AI models.


