Microsoft Awarded With Important Patents On Gesture Computing/NUI

CAMERA NAVIGATION FOR PRESENTATIONS:

Techniques for managing a presentation of information in a gesture-based system, where gestures are derived from a user’s body position or motion in the physical space, may enable a user to use gestures to control the manner in which the information is presented or to otherwise interact with the gesture-based system. A user may present information to an audience using gestures that control aspects of the system, or multiple users may work together using gestures to control aspects of the system. Thus, in an example embodiment, a single user can control the presentation of information to the audience via gestures. In another example embodiment, multiple participants can share control of the presentation via gestures captured by a capture device, or otherwise interact with the system to control aspects of the presentation.
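The shared-control idea in this abstract can be sketched as a simple dispatch from recognized gestures to presentation commands. The gesture names, command names, and the rule that only the controlling user may advance slides are all illustrative assumptions, not details from the patent:

```python
# Minimal sketch of gesture-driven presentation control.
# Gesture and command names are invented for illustration.

GESTURE_COMMANDS = {
    "swipe_left": "next_slide",
    "swipe_right": "previous_slide",
    "raise_hand": "request_control",
}

def handle_gesture(gesture, user, controller):
    """Map a recognized gesture to a presentation command.

    Only the user currently holding control may drive the slides;
    any participant may request control.
    """
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return None                      # unrecognized gesture: ignore it
    if command == "request_control":
        return f"{user} requests control"
    if user != controller:
        return None                      # non-controlling users cannot present
    return command

# Usage: alice holds control, bob does not.
print(handle_gesture("swipe_left", "alice", "alice"))  # → next_slide
print(handle_gesture("swipe_left", "bob", "alice"))    # → None
print(handle_gesture("raise_hand", "bob", "alice"))    # → bob requests control
```

A real system would sit this dispatch behind the capture device's gesture recognizer and arbitrate control requests among participants.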
GESTURE STYLE RECOGNITION AND REWARD:

Systems, methods and computer readable media are disclosed for determining whether a given gesture was performed with a particular style. This style information may then be used to personalize a gaming or multimedia experience, rewarding users for their individual style.
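One way to picture "style" detection is as a score over motion features of the performed gesture, mapped to a reward tier. The features, weights, and tiers below are invented for illustration; the patent abstract only describes the general idea of recognizing style and rewarding it:

```python
# Hedged sketch: scoring gesture "style" from simple normalized motion
# features and converting the score into a reward tier. All numbers are
# illustrative assumptions.

def style_score(speed, amplitude, smoothness):
    """Combine normalized motion features (each in [0, 1]) into one score."""
    return 0.4 * speed + 0.4 * amplitude + 0.2 * smoothness

def reward_for(score):
    """Translate a style score into a hypothetical in-game reward tier."""
    if score >= 0.8:
        return "flourish bonus"
    if score >= 0.5:
        return "style points"
    return "no bonus"

# Usage: an exaggerated, fast, smooth gesture earns the top tier.
print(reward_for(style_score(0.9, 0.9, 0.9)))  # → flourish bonus
print(reward_for(style_score(0.2, 0.3, 0.5)))  # → no bonus
```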
MOTION DETECTION USING DEPTH IMAGES:

A sensor system creates a sequence of depth images that are used to detect and track motion of objects within range of the sensor system. A reference image is created and updated based on a moving average (or other function) of a set of depth images. A new depth image is compared to the reference image to create a motion image, which is an image file (or other data structure) with data representing motion. The new depth image is also used to update the reference image. The data in the motion image is grouped and associated with one or more objects being tracked. The tracking of the objects is updated by the grouped data in the motion image. The new positions of the objects are used to update an application. For example, a video game system will update the position of images displayed in the video based on the new positions of the objects. In one implementation, avatars can be moved based on movement of the user in front of a camera.
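The reference-image and motion-image steps described above can be sketched in a few lines. The blending weight and motion threshold below are illustrative assumptions, and depth frames are shown as flat pixel lists for simplicity:

```python
# Sketch of the depth-image motion-detection loop: maintain a moving-average
# reference image, then flag pixels whose depth differs from it by more than
# a threshold. ALPHA and MOTION_THRESHOLD are illustrative values.

ALPHA = 0.1            # weight of each new frame in the running average
MOTION_THRESHOLD = 50  # depth change (in sensor units) that counts as motion

def update_reference(reference, depth_image, alpha=ALPHA):
    """Blend the new depth image into the reference via a moving average."""
    return [(1 - alpha) * r + alpha * d for r, d in zip(reference, depth_image)]

def motion_image(reference, depth_image, threshold=MOTION_THRESHOLD):
    """Mark each pixel 1 if its depth moved past the threshold, else 0."""
    return [1 if abs(d - r) > threshold else 0
            for r, d in zip(reference, depth_image)]

# Usage: a flat 4-pixel "frame" where the third pixel moves toward the sensor.
reference = [1000.0, 1000.0, 1000.0, 1000.0]
frame = [1000.0, 1000.0, 800.0, 1000.0]
print(motion_image(reference, frame))       # → [0, 0, 1, 0]
reference = update_reference(reference, frame)
print(reference[2])                         # → 980.0 (drifts toward the frame)
```

A full pipeline would then group the set pixels into connected regions and associate each region with a tracked object, as the abstract describes.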



About Author

Pradeep, a Computer Science & Engineering graduate.