Microsoft Patents Gesture-Based Interaction System For Desktop Computing




Microsoft keyboard Patent

Natural user interfaces (NUIs) have evolved rapidly in the past few years, with gesture-based interfaces that use touch, touchless interaction, or full-body movement to enable rich interaction with a computing device. None of these, however, has changed how we interact with traditional desktop computing via keyboard and mouse. In this patent, Microsoft describes a gesture-based system for the desktop that lets you control tasks normally offloaded to shortcut and modifier keys or context menus, such as mode switches, window and task management, menu selection, and certain types of navigation.

Summary:

The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

Described herein are methods and systems for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more touch hand gestures and/or one or more free-air hand gestures.
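To make the idea concrete, here is a minimal sketch of how such a system might combine keyboard state with touch and free-air gestures to drive desktop actions. This is purely illustrative and is not from the patent: the gesture names, the action table, and the `dispatch` function are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Gesture:
    kind: str  # "touch" or "free_air" (hypothetical categories)
    name: str  # e.g. "swipe_left", "pinch" (illustrative names)

# Illustrative mapping from a gesture to a desktop action of the kind the
# patent mentions (window/task management, menu selection, navigation).
ACTIONS = {
    ("free_air", "swipe_left"): "switch_window",
    ("free_air", "swipe_right"): "switch_task",
    ("touch", "pinch"): "show_menu",
}

def dispatch(gesture: Gesture, key_held: Optional[str] = None) -> str:
    """Resolve a gesture, optionally combined with a held key, to an action."""
    action = ACTIONS.get((gesture.kind, gesture.name), "ignore")
    # A held modifier key reroutes the gesture, mirroring the patent's idea
    # of combining traditional keyboard input with hand gestures.
    if key_held == "ctrl" and action == "switch_window":
        return "move_window"
    return action
```

For example, a free-air swipe left on its own might switch windows, while the same swipe with Ctrl held could move the window instead.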

Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.

Abstract:

Methods and system for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures.

Source: USPTO
