Tapping into computing's sensitive side
As we surround ourselves with more and more computers, we need new ways to interact with all that processing power.
Computing has come a long way since the invention of the diminutive mouse. Unlike the keyboard, which was an adaptation of old technology, the mouse was the first purpose-built approach to interacting with computers to catch on with consumers.
We now have a growing number of ways for humans to tell computers what to do. Speech recognition has reached a stage where you can talk to your PC and get it to follow instructions. Call centres can also 'understand' what you say even if there isn't a human anywhere within earshot, reducing the need to tap in all those numbers.
Mice and speech recognition were just the beginning of the growing area of what has come to be called Human–Computer Interaction, or HCI. Screens that you can interact with and hand-held devices that can sense motion are already widespread consumer products. The future will be even more interactive.
SecondLight (Microsoft Research Cambridge) enriches intuitive user interfaces with new dimensions.
Even the touch-sensitive screen has only just begun to realise its full potential. How about a touch-sensitive sphere? That's what they are working on at Microsoft Research in Redmond, in the United States. Researchers there have already developed several applications for Sphere, including a picture and video browser, interactive globe visualisation, finger painting, and an omnidirectional video-conferencing application. Touch-sensitive computing is also behind the HCI work of the Computer-Mediated Living group at Microsoft Research Cambridge, in the United Kingdom. Shahram Izadi, a member of the team, likes working on what he calls "novel technologies to enable weird and wonderful forms of human-computer interaction beyond the desktop." What this means, he explains, is "building, hacking, dismantling and playing with as much technology as possible."
One of the team's projects is ThinSight (see Futures, December 2007). Shahram describes this as “a new technique for optical sensing through thin form-factor displays. It allows for detection of fingers and other physical objects close to or on the display surface. This essentially allows us to turn a regular LCD into a sensing surface that can be used for multi-touch and tangible computing applications.”
With ThinSight (Microsoft Research Cambridge), the Tablet PC experience becomes truly interactive and intuitive.
ThinSight is an optical sensing system that can detect fingertips or anything else placed near the display surface. The idea is to put an array of infrared emitters and detectors behind a regular LCD. This then allows IR-based sensing without degrading the performance of the display.
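The sensing principle can be sketched in code. The toy example below is purely illustrative, not ThinSight's actual pipeline: it assumes the IR detector array yields a 2D grid of normalised intensity readings, thresholds that grid, and groups bright neighbouring cells into "touch" blobs whose centroids give fingertip positions.

```python
# Illustrative sketch of multi-touch detection over a grid of IR readings,
# in the spirit of ThinSight's emitter/detector array (not its real code).
# A reflective object near the screen raises readings at nearby detectors;
# we threshold the grid and flood-fill bright cells into touch blobs.

THRESHOLD = 0.6  # assumed normalised IR intensity cut-off

def detect_touches(grid):
    """Return centroids (row, col) of connected bright regions."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= THRESHOLD and not seen[r][c]:
                # Flood-fill one connected blob of bright cells.
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= THRESHOLD
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(y for y, _ in cells) / len(cells)
                cx = sum(x for _, x in cells) / len(cells)
                touches.append((cy, cx))
    return touches

# Two fingertips pressed on a 5x6 detector grid:
frame = [
    [0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
    [0.1, 0.9, 0.8, 0.1, 0.1, 0.1],
    [0.1, 0.8, 0.9, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.1, 0.8, 0.1],
    [0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
]
print(detect_touches(frame))  # two blob centroids
```

A real system would also have to subtract ambient IR and smooth readings over time, but the core idea is the same: touch detection reduces to finding bright blobs in a sensor image.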
The Cambridge team believes it has just begun to scratch the surface of what you can do with this new approach to sensing objects. They have already shown that ThinSight can detect shapes and hope to move on to detecting objects other than hands and fingers near the screen. You could even put infrared transmitters into objects to ease their detection.
The Cambridge research team also has other designs on our screens. Dubbed SecondLight, their latest device is a new form of rear projection. This new interactive surface technology has the same capabilities as a touch screen but can also extend the interaction space beyond the surface.
SideSight (Microsoft Research Cambridge) gives a glimpse of next-generation mobile devices.
Part of the new technology is an electronically switchable projection screen that can be made diffuse or clear. Toggle the display between clear and diffuse quickly enough and the viewer will not notice the switching. It is then possible to rear-project what is perceived as a stable image onto the display surface, when the screen is in fact transparent for half the time.
During the clear periods, the display can project a different image through the display onto objects held above the surface. At the same time, a camera mounted behind the screen can see out into the environment. This can then track the movement of objects held in front of the screen.
The display can also project on to those objects. All the time that this is going on, the main screen can show a completely different image.
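The time-multiplexing scheme described above can be sketched as a simple frame schedule. This is an illustrative toy, not Microsoft's implementation: it assumes one projected image per diffuser state and that the rear camera only captures during clear frames, when it can see through the screen.

```python
# Toy simulation of SecondLight-style time multiplexing (an illustrative
# sketch, not the real system). The switchable diffuser alternates state
# every frame; switched fast enough, the viewer perceives two stable,
# independent images: one on the surface, one projected through it.

from itertools import islice

def schedule(surface_image, through_image):
    """Yield (diffuser_state, projected_image, camera_active) per frame."""
    while True:
        # Diffuse frame: the screen scatters light, showing the main image.
        yield ("diffuse", surface_image, False)
        # Clear frame: the projection passes through to objects held above
        # the surface, and the rear camera can see out into the room.
        yield ("clear", through_image, True)

# Four frames of a hypothetical two-layer display:
frames = list(islice(schedule("map of Cambridge", "street-name overlay"), 4))
for state, image, camera in frames:
    print(state, "->", image, "| camera on:", camera)
```

The design point is that one projector and one screen are shared in time rather than in space, which is why the main image and the "through" image can be completely different.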
As with many new approaches to HCI, it is too soon to know where SecondLight will go. Other ideas in the works include using the technology to scan documents. But we won't see their true potential until end users get these types of technologies into their hands.
“The future will be even more interactive”
Technologies like ThinSight and SecondLight are really just the beginning of the process of rethinking how we interact with computers. The true value of a computer interface becomes clear when you see it in applications. Take TouchLight, a technology developed by Microsoft Research that lets users move and manipulate 3D images with their hands. HCI experts like to describe this idea as a gesture-based user interface.
Professor David Gann, Head of Innovation and Entrepreneurship at Imperial College London, believes that HCI technology like TouchLight could revolutionise the innovation process.
For example, the software underpins a series of data visualisation systems that allow data from real and virtual objects to be combined, creating virtual prototypes in 'immersive' studios with a high degree of detail. These studios came about when Microsoft, through its IP Licensing Program, licensed EON Reality to incorporate the TouchLight software into applications and hardware display systems for viewing and producing 3D content.
SecondLight, Microsoft Research Cambridge.
“The studios combine the best technologies from computer games with advanced engineering and design tools, including holographic imaging and touch-sensitive virtual models,” says Professor Gann. These studios immerse experts and users in a virtual environment, where they can manipulate data using TouchLight to move 3D images by hand.
Companies can use these studios to try out new products before actually making something. “With these systems,” says Professor Gann, “firms and their customers can experience products and services before they are produced in reality.”
Technology is just the enabler of such applications. As in many applications of IT, their real value comes when users start to adapt them and put them to uses that the original designers never envisaged. This is one reason why there is growing interest in studying HCI as a subject in its own right. By looking at what people want to do, it might be possible to come up with even more imaginative ways of interacting with computers.