Hands-on computing
Hands-on computing is a branch of human-computer interaction research that focuses on computer interfaces which respond to human touch or expression, allowing the machine and the user to interact physically.
Implementations
- touch screens [1] (see the input-handling sketch after this list)
- stylus pens
- facial sensors
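As a minimal illustration of the touch-screen and stylus entries above, the sketch below uses the standard browser Pointer Events API to distinguish finger, stylus, and mouse input. The element id "canvas" and the logging behaviour are assumptions made for the example, not part of any particular product.

```typescript
// Minimal sketch: distinguishing touch, stylus, and mouse input with the
// W3C Pointer Events API. Assumes an element with id "canvas" exists.
const surface = document.getElementById("canvas");

surface?.addEventListener("pointerdown", (event: PointerEvent) => {
  switch (event.pointerType) {
    case "touch":
      // Finger contact: pressure is reported on supporting hardware.
      console.log(`touch at (${event.clientX}, ${event.clientY}), pressure ${event.pressure}`);
      break;
    case "pen":
      // Stylus contact: tilt and pressure enable drawing-style interaction.
      console.log(`stylus at (${event.clientX}, ${event.clientY}), tiltX ${event.tiltX}`);
      break;
    default:
      // Fallback for non-touch pointers such as a mouse.
      console.log("non-touch pointer input");
  }
});
```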
Current problems
There are still many open problems with hands-on computing interfaces that ongoing research and development aim to resolve. Because some interactions between human and machine are ambiguous, the system's response is not always the result the user desired. Different hand gestures and facial expressions can lead the computer to interpret one command when the user intended to convey another entirely. Resolving this ambiguity is currently one of the main focuses of research and development.
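One common way to cope with this ambiguity is to score the candidate interpretations of an input and defer to the user whenever the best two scores are too close to call. The sketch below illustrates that idea only; the `GestureScore` type, the `candidates` values, and the confidence margin are all hypothetical and not taken from any specific system.

```typescript
// Hypothetical sketch: resolving an ambiguous gesture by confidence margin.
interface GestureScore {
  command: string;    // the command this interpretation would trigger
  confidence: number; // classifier confidence in [0, 1]
}

// Made-up scores a recogniser might return for a single hand gesture.
const candidates: GestureScore[] = [
  { command: "zoom-in", confidence: 0.46 },
  { command: "rotate", confidence: 0.41 },
  { command: "scroll", confidence: 0.13 },
];

function resolveGesture(scores: GestureScore[], margin = 0.15): string | null {
  const ranked = [...scores].sort((a, b) => b.confidence - a.confidence);
  // If the top two interpretations are nearly tied, return null so the
  // interface can ask the user instead of guessing the wrong command.
  if (ranked.length > 1 && ranked[0].confidence - ranked[1].confidence < margin) {
    return null;
  }
  return ranked[0].command;
}

console.log(resolveGesture(candidates)); // null here: 0.46 vs 0.41 is within the margin
```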
Researchers are also investigating how best to design hands-on computing devices so that consumers can use them easily. Applying user-centered design when creating hands-on computing products helps developers build products that are effective and easy to use.
References
- ^ http://research.microsoft.com/sendev/
- ^ Microsoft Research for Human-Computer Interaction, http://research.microsoft.com/research/detail.aspx?id=2