Microsoft yesterday announced that the Open Neural Network Exchange (ONNX) format is production ready. ONNX brings interoperability to the AI framework ecosystem, providing a definition of an extensible computation graph model as well as definitions of built-in operators and standard data types. ONNX enables models to be trained in one framework and transferred to another for inference. ONNX models are currently supported in Caffe2, Cognitive Toolkit, and PyTorch.
The ONNX format is the basis of an open ecosystem that makes AI more accessible and valuable to all: developers can choose the right framework for their task, framework authors can focus on innovative enhancements, and hardware vendors can streamline optimizations.
After Microsoft and Facebook announced this format a few months ago, numerous hardware partners, including Qualcomm, Huawei, and Intel, announced support for ONNX on their hardware platforms, making it easier for users to run models across different hardware.
Microsoft yesterday also announced Microsoft Cognitive Toolkit support for ONNX. Developers can now import ONNX models into Cognitive Toolkit or export models into ONNX format.
Learn more about it here.