Microsoft today announced the release of the Computational Network Toolkit (CNTK) 1.5 with significant language enhancements, an expanded toolbox of features, and improved readers for text and speech. Since its launch, one of CNTK's key advantages has been its ability to scale efficiently across multiple GPUs and machines. With this update, Microsoft introduces a new technique, known as Block Momentum, that takes training scalability to a new level of performance.
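At a high level, Block Momentum lets parallel workers each train on a disjoint block of data, then folds the averaged model change back into the global model with a momentum term carried across blocks. The sketch below illustrates that aggregation step only; it is a minimal, hypothetical illustration of the idea, not CNTK's actual API, and all function and parameter names (`block_momentum_update`, `block_momentum`, `block_lr`) are invented for this example.

```python
# Hypothetical sketch of one block-momentum aggregation step (not CNTK's API).
def block_momentum_update(global_w, worker_ws, prev_delta,
                          block_momentum=0.9, block_lr=1.0):
    """Fold one data block's parallel training results into the global model.

    global_w:   current global parameters (list of floats)
    worker_ws:  per-worker parameters after local training on the block
    prev_delta: smoothed update carried over from the previous block
    """
    n = len(worker_ws)
    # Average the workers' models, then measure the block's raw update.
    avg = [sum(w[i] for w in worker_ws) / n for i in range(len(global_w))]
    raw = [a - g for a, g in zip(avg, global_w)]
    # Momentum smoothing across blocks damps the noise of per-block averaging.
    delta = [block_momentum * d + block_lr * r for d, r in zip(prev_delta, raw)]
    new_w = [g + d for g, d in zip(global_w, delta)]
    return new_w, delta

# Two workers trained on one block; no momentum carried in yet.
w, d = block_momentum_update([0.0], [[1.0], [3.0]], [0.0], block_momentum=0.5)
```

Because the momentum term reuses information from earlier blocks, each worker can process its block independently between synchronization points, which is what makes the scheme communication-efficient across many GPUs.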
The update also includes a revamped I/O architecture with more flexible readers for text and speech, making it easier to feed popular data formats into the toolkit for deep-learning training. Microsoft has also included a growing library of standard components, such as Sequence-to-Sequence with Attention and the state-of-the-art Deep Residual Nets for image recognition.
Download the CNTK toolkit from GitHub.