
Microsoft’s GRIN-MoE AI model takes on coding and math, beating competitors in key benchmarks


September 19, 2024: Microsoft's GRIN-MoE AI: Efficient, Powerful Coding & Math - Microsoft's GRIN-MoE (GRadient-INformed Mixture-of-Experts) AI model excels at coding and math, outperforming competitors on key benchmarks such as MMLU and HumanEval. The model's sparse computation via SparseMixer-v2 enables high efficiency, activating only 6.6 billion of its 16×3.8 billion parameters during inference. GRIN-MoE's scalability and performance make it well suited for enterprises with limited computational resources, though the model still has limitations in multilingual and conversational tasks.
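The efficiency claim above rests on sparse expert routing: for each token, only a handful of expert sub-networks run, so most of the model's parameters stay idle during inference. The sketch below illustrates this general idea with a generic top-k routed mixture-of-experts layer in PyTorch; it is not Microsoft's SparseMixer-v2 router, and the class name and parameters (TopKMoELayer, num_experts, top_k, layer sizes) are illustrative assumptions rather than GRIN-MoE's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Generic sparse MoE layer: only top_k of num_experts experts run per token."""
    def __init__(self, d_model=512, d_hidden=1024, num_experts=16, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # gating network scores each expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                        # x: (num_tokens, d_model)
        logits = self.router(x)                  # (num_tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalize over the selected experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e     # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Only top_k / num_experts of the expert parameters are touched per token, which is
# how an MoE model can hold far more total parameters than it activates at inference.
layer = TopKMoELayer()
tokens = torch.randn(8, 512)
print(layer(tokens).shape)  # torch.Size([8, 512])
```

With top_k=2 of 16 experts, roughly an eighth of the expert parameters do work for any given token, which is the same mechanism that lets GRIN-MoE keep its active parameter count at 6.6 billion.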
