pIT lab tumblr

This is the tumblr of the Pervasive Interaction Technology (pIT) Laboratory at the IT University of Copenhagen.
The lab's main purpose is to give faculty and students at the IT University of Copenhagen and collaborating institutions a place to work with hardware and software technologies for Pervasive and Ubiquitous Computing.

In the three months since we released Beta 2, we have made many improvements to our SDK and runtime, including:

Support for up to four Kinect sensors plugged into the same computer
Significantly improved skeletal tracking, including the ability for developers to control which user is being tracked by the sensor
Near Mode for the new Kinect for Windows hardware, which enables the depth camera to see objects as close as 40 centimeters in front of the device
Many API updates and enhancements in the managed and unmanaged runtimes
Inclusion of the latest Microsoft Speech components (V11) in the SDK and runtime installer
Improved “far-talk” acoustic model that increases speech recognition accuracy
New and updated samples, such as Kinect Explorer, which enables developers to explore the full capabilities of the sensor and SDK, including audio beam and sound source angles, color modes, depth modes, skeletal tracking, and motor controls
A commercial-ready installer that can be included in an application’s set-up program, making it easy to install the Kinect for Windows runtime and driver components for end-user deployments
Robustness improvements including driver stability, runtime fixes, and audio fixes
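Several of the items above, such as multi-sensor support and Near Mode, are exposed through the SDK's unmanaged NUI API. The following is a minimal sketch, assuming the Kinect for Windows SDK headers (NuiApi.h) and an attached sensor are available; error handling is abbreviated and the frame-processing loop is elided:

```cpp
// Sketch only: requires the Kinect for Windows SDK (NuiApi.h) and attached hardware.
#include <windows.h>
#include <NuiApi.h>

int main()
{
    int sensorCount = 0;
    if (FAILED(NuiGetSensorCount(&sensorCount)) || sensorCount == 0)
        return 1;  // no sensors attached (up to four on one computer are supported)

    // Open the first sensor; the same pattern applies to indices 1..3.
    INuiSensor* pSensor = NULL;
    if (FAILED(NuiCreateSensorByIndex(0, &pSensor)))
        return 1;

    if (SUCCEEDED(pSensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH)))
    {
        HANDLE hDepthStream = NULL;
        pSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH,
                                    NUI_IMAGE_RESOLUTION_640x480,
                                    0, 2, NULL, &hDepthStream);

        // Near Mode: lets the depth camera see objects as close as ~40 cm
        // (new Kinect for Windows hardware only).
        pSensor->NuiImageStreamSetImageFrameFlags(
            hDepthStream, NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE);

        // ... retrieve and process depth frames here ...

        pSensor->NuiShutdown();
    }
    pSensor->Release();
    return 0;
}
```

The same capabilities are available from the managed (C#) runtime; this sketch only illustrates the general shape of sensor enumeration and stream configuration.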

More Information