Two years ago, we reported that Microsoft Research was working on gesture-based search, in which a user draws gestures on the screen and search results appear based on those gestures.
A user draws two circles in the search window of a smartphone. In a moment, a bicycle and a bundle of apples appear on the screen, among other things with round shapes. This is Microsoft’s next-generation search technology called “gesture,” which enables automatic search of relevant information by drawing only; no word entry on a keyboard is necessary.
Microsoft Research even claimed that development of gesture search was almost complete and that it would arrive on mobile and PCs, but it didn’t materialize as expected. Today, a Microsoft patent titled “SEARCHING AT A USER DEVICE” describes an exact implementation for Windows tablets. The patent describes technology much like Google’s Gesture Search on Android: Google Gesture Search lets you quickly access contacts, applications, settings, music and bookmarks on your Android device by drawing letters or numbers. It continuously refines search results as you add each gesture, and improves as it learns from your search history. Hopefully, Microsoft will implement the same for Windows devices.
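The “continuous refinement” behavior described above can be sketched in a few lines. This is a hypothetical illustration, not Google’s actual code: the class name `GestureSearchIndex`, its methods, and the sample item list are all invented for this example. Each recognized character narrows the candidate set, and past selections boost ranking, mimicking the learn-from-history behavior.

```python
# Hypothetical sketch of incremental gesture-search refinement.
# Each recognized character narrows the results; prior selections
# (a stand-in for "search history") boost ranking.

class GestureSearchIndex:
    def __init__(self, items):
        self.items = items        # e.g. contact, app, and setting names
        self.history = {}         # item -> times the user selected it before

    def refine(self, recognized_chars):
        """Return items matching the characters recognized so far,
        ranked by how often the user has selected them previously."""
        query = "".join(recognized_chars).lower()
        matches = [i for i in self.items if query in i.lower()]
        return sorted(matches, key=lambda i: -self.history.get(i, 0))

    def select(self, item):
        # Learning step: remember the selection for future ranking.
        self.history[item] = self.history.get(item, 0) + 1

index = GestureSearchIndex(["Alice", "Alarm", "Maps", "Mail"])
print(index.refine(["a"]))       # all four items contain "a"
index.select("Alarm")
print(index.refine(["a", "l"]))  # narrowed to "al"; "Alarm" ranks first
```

Drawing a second letter shrinks the list immediately, and the history counter nudges frequently chosen items to the top, which is the essence of the refine-as-you-draw experience.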
A method, computer program product and user device for searching, implemented at the user device: an input is received from the user via the device’s touch screen. Without the user initiating a temporary character-recognition mode specifically to receive the input, the input is analysed using character recognition. If at least one character is recognized in the received input, a search mode is invoked in which one or more search results that at least partially match the recognized character are displayed. The user may then select one of the displayed search results.
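The key claim is that no dedicated recognition mode is needed: every stroke is analysed, and search mode is entered only when a character is actually recognized. Below is one possible interpretation of that flow, not Microsoft’s implementation; `recognize_character` is a placeholder (a real system would classify the stroke’s geometry), and all names and callbacks are invented for illustration.

```python
# Sketch of the claimed flow: touch input is always run through character
# recognition; search mode is invoked only when a character is recognized,
# otherwise the stroke is handled as an ordinary gesture.

def recognize_character(stroke):
    # Placeholder recognizer: assumes the stroke dict carries the letter it
    # resembles, or no letter for taps/scrolls. A real recognizer would
    # classify the stroke's geometry instead.
    return stroke.get("letter")

def handle_touch_input(stroke, items, on_search_results, on_normal_gesture):
    """No temporary recognition mode: every stroke is analysed, and
    search mode is entered only if a character is recognized."""
    char = recognize_character(stroke)
    if char is None:
        on_normal_gesture(stroke)       # treat as an ordinary tap or scroll
        return
    # Search mode: display results that at least partially match the character.
    matches = [i for i in items if i.lower().startswith(char.lower())]
    on_search_results(matches)

results = []
handle_touch_input({"letter": "m"}, ["Mail", "Maps", "Alarm"],
                   on_search_results=results.extend,
                   on_normal_gesture=lambda s: None)
print(results)   # ['Mail', 'Maps']
```

The design point is the `if char is None` branch: because recognition runs on every stroke, ordinary interaction is untouched, and the search UI appears only when the user has plausibly drawn a character.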
Source: USPTO 20140062904