Microsoft recently announced that it will add NPU support to its machine learning framework DirectML early next year. The upgrade is aimed at better supporting chips with built-in NPUs, such as Intel Core Ultra. On that basis, developers will be able to run AI models on a chip's integrated NPU through APIs such as the cross-platform inference engine ONNX Runtime, improving model execution performance.

With the announcement, Microsoft aims to enhance the performance of AI models running on devices with integrated NPUs like Intel Core Ultra. Developers will be able to take advantage of the NPUs built into chipsets through cross-platform inference engines like ONNX Runtime and other APIs.

The addition of NPU support to DirectML marks a significant step forward in the company’s commitment to making AI more accessible and efficient for developers. By leveraging the power of NPUs, developers can now build and deploy AI models with improved performance, opening up new possibilities in a range of applications.

This move follows Microsoft’s continued efforts to integrate AI capabilities into its various products and services. As the demand for AI continues to grow, the company remains dedicated to providing developers with the tools and resources they need to harness the technology’s full potential.

Overall, the inclusion of NPU support in DirectML will enable developers to create advanced AI models with greater efficiency, further driving the adoption of AI across various industries and applications.

【来源】https://www.ithome.com/0/740/014.htm
