You'll soon be able to use Windows Copilot Library's powerful APIs to develop AI apps

The Phi Silica model will also arrive this summer




Key notes

  • Microsoft made a wave of AI-centered announcements at its annual Build conference.
  • Windows Copilot Library will soon bring several powerful APIs for local AI development.
  • Studio Effects, Live Captions Translations, OCR, and more are some of the APIs announced.

It’s been a busy week for Microsoft. The Redmond tech giant made a string of AI-centered announcements at its annual Build conference on Monday, including Copilot+ PCs and the new Phi Silica small language model. Now, the Windows Copilot Library is gaining several powerful APIs for local AI development. 

Microsoft says that more than 40 on-device AI models and algorithms, including DiskANN, are built natively into the Windows Copilot Library. It also offers ready-to-use AI APIs such as Studio Effects, Live Captions Translations, OCR, Recall with User Activity, and more. 

Why it matters: these APIs will let developers easily add AI experiences to their apps. WhatsApp, for example, uses the Windows Studio Effects API so that users can apply AI effects when using the camera inside the app.

“Vector Embeddings, Retrieval Augmented Generation (RAG), Text Summarization along with other APIs will be coming later to Windows Copilot Library. Developers will be able to access these APIs as part of the release,” Microsoft promises in the blog post.

The Recall feature, for example, lets you improve the user’s experience by adding helpful information through the User Activity API, so users can easily continue where they left off in your app.
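The User Activity plumbing behind that hand-off appears to be the long-standing Windows.ApplicationModel.UserActivities WinRT API. As a rough illustration only, a C++/WinRT sketch of that pattern might look like the following; the activity ID, activation URI, and display text are made-up placeholders, not anything from Microsoft’s announcement.

// Minimal sketch, assuming the existing Windows.ApplicationModel.UserActivities
// WinRT API; the ID, URI, and strings below are hypothetical placeholders.
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.ApplicationModel.UserActivities.h>

using namespace winrt;
using namespace winrt::Windows::Foundation;
using namespace winrt::Windows::ApplicationModel::UserActivities;

fire_and_forget RecordActivityAsync()
{
    // Get the app's default activity channel and create (or update)
    // an activity keyed by an app-defined ID.
    UserActivityChannel channel = UserActivityChannel::GetDefault();
    UserActivity activity = co_await channel.GetOrCreateUserActivityAsync(L"report-draft-42");

    // A deep link Windows can use to send the user back to this exact spot.
    activity.ActivationUri(Uri(L"contoso-notes:open?doc=report-draft-42"));
    activity.VisualElements().DisplayText(L"Quarterly report draft");
    co_await activity.SaveAsync();

    // Opening a session marks the activity as "in progress"; in a real app
    // you would keep this object alive for as long as the view is active.
    UserActivitySession session = activity.CreateSession();
}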

An API, or Application Programming Interface, is an intermediary that lets different programs communicate with each other, and it shapes how an app is developed.

Phi Silica, Microsoft’s latest small language model, will be available for developers to try starting in June. Built from the Phi-3 family, Phi Silica is “custom-built for the NPUs in Copilot+ PCs” and promises to be more cost-effective and power-efficient, with a first token latency of 650 tokens/second while drawing only about 1.5 watts of power.
