Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
The collaboration builds upon ongoing work, with Microsoft supporting the ICAEW’s latest GenAI Accelerator module.
OpenAI and Booking.com today announced the launch of the SME AI Accelerator, a Europe-wide initiative designed to help small ...
OpenAI has teamed up with Booking.com to launch a new scheme designed to help 20,000 small and medium-sized businesses across Europe adopt and use AI tools in their company operations.
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
University will be launching the Advancing AI for the Public Good Initiative, which will include a free online program.
Built with TSMC's 3nm process, Microsoft's new Maia 200 AI accelerator will reportedly 'dramatically improve the economics of ...
Microsoft unveils Maia 200 AI inference chip using TSMC 3nm, claiming higher FP4/FP8 performance and 30% better $/perf vs rivals—read more now.
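Since Microsoft's framing centers on token-generation economics, it may help to see what a claim like "30% better $/perf" means in concrete terms. The sketch below is a rough illustration only: the hourly cost and throughput figures are hypothetical placeholders, not published Maia 200 numbers, and the claim is read here as a 30% reduction in cost per unit of performance.

```python
# Rough illustration (all figures hypothetical): how a "30% better $/perf"
# claim would translate into cost per million generated tokens.

def cost_per_million_tokens(hourly_cost_usd: float, tokens_per_second: float) -> float:
    """Dollars to generate one million tokens at a sustained throughput."""
    tokens_per_hour = tokens_per_second * 3600
    return hourly_cost_usd / tokens_per_hour * 1_000_000

# Hypothetical baseline accelerator: $4.00/hour, 10,000 tokens/s sustained.
baseline = cost_per_million_tokens(hourly_cost_usd=4.00, tokens_per_second=10_000)

# Reading "30% better $/perf" as 30% lower cost per unit of performance
# (equivalently, ~43% more throughput for the same spend).
improved = baseline * (1 - 0.30)

print(f"baseline: ${baseline:.3f} per 1M tokens")
print(f"improved: ${improved:.3f} per 1M tokens")
```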
The Maia 200 AI chip is described as an inference powerhouse, meaning it is built to help AI models apply their knowledge to ...
With over 100 billion transistors, Maia 200 offers "powerhouse" AI inferencing possibilities, Microsoft says.
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...