Amazon Web Services announces support for Microsoft's open AI model
Earlier this year, Microsoft announced its collaboration with Facebook to introduce the Open Neural Network Exchange (ONNX) format. ONNX brings interoperability to the AI framework ecosystem by providing a definition of an extensible computation graph model, as well as definitions of built-in operators and standard data types. ONNX enables models to be trained in one framework and transferred to another for inference. ONNX models are currently supported in Caffe2, Cognitive Toolkit, and PyTorch. Last month, Microsoft announced that several industry leaders, including Intel, IBM, Qualcomm, ARM, and AMD, had pledged their support for the ONNX format.
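To illustrate the interchange, the sketch below exports a PyTorch model to an ONNX file that another ONNX-aware framework could then load for inference. The model choice, file name, and input shape are placeholders chosen for the example, not part of the announcement.

```python
import torch
import torchvision

# Example model trained in PyTorch (placeholder choice for this sketch).
model = torchvision.models.resnet18(pretrained=True)
model.eval()

# ONNX export traces the model with a dummy input of the expected shape.
dummy_input = torch.randn(1, 3, 224, 224)

# Writes an ONNX graph that other ONNX-compatible frameworks can import.
torch.onnx.export(model, dummy_input, "resnet18.onnx")
```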
Yesterday, Amazon Web Services, the world's largest cloud provider, announced its support for ONNX. It has released ONNX-MXNet, an open-source Python package for importing ONNX deep learning models into Apache MXNet.
With ONNX format support in MXNet, developers can build and train models in other frameworks, such as PyTorch, Microsoft Cognitive Toolkit, or Caffe2, and import those models into MXNet to run inference on MXNet's highly optimized and scalable engine.
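A minimal sketch of that workflow is shown below, assuming the ONNX-MXNet package exposes an import_model helper that returns an MXNet symbol and its parameters; the file name, input name, and input shape are placeholders for whatever the exported model actually uses.

```python
import mxnet as mx
import onnx_mxnet

# Import an ONNX model (exported from another framework) as an MXNet symbol
# plus its trained parameters. 'model.onnx' is a placeholder path.
sym, params = onnx_mxnet.import_model('model.onnx')

# Bind the symbol into an MXNet Module for inference on CPU.
# 'input_0' and the (1, 3, 224, 224) shape are assumptions about the model's input.
mod = mx.mod.Module(symbol=sym, data_names=['input_0'],
                    label_names=None, context=mx.cpu())
mod.bind(for_training=False, data_shapes=[('input_0', (1, 3, 224, 224))])
mod.set_params(arg_params=params, aux_params=params, allow_missing=True)

# Run a forward pass on dummy input data and read the output.
batch = mx.io.DataBatch([mx.nd.ones((1, 3, 224, 224))])
mod.forward(batch)
output = mod.get_outputs()[0].asnumpy()
```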
In addition, Amazon announced that it will work with Microsoft and Facebook to further develop the ONNX format.
Learn more about this announcement here.