Microsoft details new contextual sensing APIs in Windows 10


Windows 10 (and Windows 10 Mobile) contains several new APIs which surface high-level data for apps, derived from the sensors that festoon our devices these days.

These include, for example, APIs which can tell apps whether you are walking or driving a car, based on Microsoft's processing of accelerometer and gyroscope data.

The APIs can also tell apps, for example, when your device is idle (i.e. lying on a bedside table for a few hours) or in your pocket, allowing apps to change their behaviour accordingly. The APIs also cover step counting, barometer and altitude sensing (useful for fitness tracking and indoor location tracking), proximity and presence detection, and more.
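To illustrate the kind of processing involved, here is a toy step counter that counts peaks in accelerometer magnitude. This is a simplified, hypothetical sketch of what a pedometer does with raw sensor data, not Microsoft's implementation; the threshold and sample values are illustrative only.

```python
THRESHOLD = 11.0  # m/s^2; magnitude at rest is roughly gravity (~9.8)

def count_steps(magnitudes):
    """Count rising crossings of the threshold as steps.

    A step produces a brief spike in accelerometer magnitude; we count
    each transition from below the threshold to above it as one step.
    """
    steps = 0
    above = False
    for m in magnitudes:
        if m > THRESHOLD and not above:
            steps += 1
            above = True
        elif m <= THRESHOLD:
            above = False
    return steps

# Simulated accelerometer magnitudes: three spikes, i.e. three steps.
samples = [9.8, 12.1, 9.5, 9.8, 13.0, 9.6, 9.8, 12.4, 9.7]
print(count_steps(samples))  # 3
```

The real APIs hide this signal processing from the developer and simply report a step count, which is precisely what makes them attractive for fitness apps.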

Examples include:

  • Surfacing information based on the user’s motion context (such as an activity-based playlist)
  • Changing app behaviour based on the user’s motion context (such as auto-adjusting the camera focus when detecting that the user is capturing an image while walking or running)
  • Health and fitness tracking
  • Navigation and maps
  • Power saving (for example, avoiding constantly polling location or Wi-Fi when the device is idle or stationary)
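Platform specifics aside, the power-saving pattern in the last example can be sketched as follows. The activity labels and the `plan_location_polls` helper are hypothetical stand-ins for the actual motion-context feed, used here only to show the idea of gating expensive polling on activity state:

```python
# Activity states during which location polling is wasted effort.
IDLE_STATES = {"idle", "stationary"}

def plan_location_polls(activity_readings):
    """Return only the readings that should trigger a location poll.

    When the device reports the user is idle or stationary, skip the
    poll entirely and save power; poll only while there is motion.
    """
    return [a for a in activity_readings if a not in IDLE_STATES]

# Simulated stream of motion-context readings over time.
readings = ["walking", "stationary", "stationary", "in_vehicle", "idle", "running"]
print(plan_location_polls(readings))  # ['walking', 'in_vehicle', 'running']
```

In a real app this decision would be event-driven (reacting to an activity-changed notification) rather than filtering a list, but the gating logic is the same.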

The technology is certainly interesting, but we hope Microsoft does not rely only on developers to offer new features based on the APIs, but also adds some cool features to Windows 10 Mobile itself which take advantage of them, e.g. having Cortana automatically remember where you left your car when you get out and start walking.

The full post with more details for developers can be read here.
