Human-computer interaction has long been dominated by the keyboard, then the mouse, and more recently voice. Much work has also been done recently to enable computers to recognize human gestures.
The combination of voice and gesture recognition has given rise to what are being called “Natural User Interfaces” (NUIs).
Bill Gates described Natural User Interfaces, saying that “until now, we have always had to adapt to the limits of technology and conform the way we work with computers to a set of arbitrary conventions and procedures. With NUI, computing devices will adapt to our needs and preferences for the first time and humans will begin to use technology in whatever way is most comfortable and natural for us.”
Research & Markets predicts that globally, the gesture recognition market will grow from $5.4 billion today to $23.5 billion by 2023.
Dr. Sean Follmer, an expert in human-computer interaction at Stanford University, said that “with mobile computing devices like smartwatches or even, in the future, augmented reality glasses, we no longer have large surfaces on which to place keyboards or mice, so we need to create new input devices and technologies that can allow us to interact while we are on the go.”
Mark Weiser, an early Xerox PARC researcher, said that “a good tool is an invisible tool. By invisible, we mean that the tool does not intrude on your consciousness; you focus on the task, not the tool.”