New Age of Product Design
Brian Pereira | 07 November 2007
Many of us equate product design with looks. Someone once asked Steve Jobs what he thought of design. For Jobs, design is the ease with which one uses a product: it's the man-machine interface. And to master it, "you need to empathize with the product," said Jobs.
Companies like Sony, Apple, and Philips (among others) spend millions on design. The iPod, the Macintosh computer, and the Sony Walkman have exemplary design. Everyone is comfortable using these devices because of their elegant, simple interfaces. Commenting on the Airbus A380's cockpit design and control interfaces, a test pilot said that flying the new airplane was "as easy as riding a bicycle". This is testimony to the application of ergonomics well beyond consumer gadgets.
Last week, I got to use a product of fine design (more on this later). It reminded me of an incident from my college days, and made me marvel at how far man-machine interfaces have come.
During a lab session in college, a peon placed an ordinary wooden box on my workbench. On closer examination, I saw a row of toggle switches on one side and some LEDs; a chip was mounted on top. The practical session in the electronics lab was about the functioning of logic gates. We were required to flip specific switches to provide the inputs; a few LEDs would light up in response. In that manner, we could test the output of gates like AND and OR, and combinations like NAND and NOR. Early personal computers, such as the Altair, had no keyboard or monitor, and were programmed in a similar fashion.
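That lab exercise is easy to recreate in software. The sketch below models each gate as a function of two binary inputs (the toggle switches) and prints a truth table (the LEDs), just as we did on that wooden box:

```python
# Each gate takes two binary inputs (0 or 1), like the toggle switches,
# and returns one output, like an LED lighting up.

def AND(a, b):  return a & b
def OR(a, b):   return a | b
def NAND(a, b): return 1 - (a & b)   # NOT of AND
def NOR(a, b):  return 1 - (a | b)   # NOT of OR

def truth_table(gate):
    """Map every combination of two inputs to the gate's output."""
    return {(a, b): gate(a, b) for a in (0, 1) for b in (0, 1)}

for name, gate in [("AND", AND), ("OR", OR), ("NAND", NAND), ("NOR", NOR)]:
    print(name, truth_table(gate))
```

Running it shows, for instance, that NAND outputs 0 only when both inputs are 1, exactly what the LEDs on the box demonstrated.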
Last week, someone placed a Logitech MX Air mouse in my palm and showed me how to use it. With the MX Air, you move the mouse pointer just by waving the mouse in the air. Press the volume button on the mouse and flick your wrist to the right, and the audio volume increases; flick your wrist to the left to decrease it. No more + or - buttons or volume sliders! The MX doubled as a cordless presenter for my PowerPoint presentation. I'm told this mouse has gyroscopic sensors that track its orientation in real time as you rotate it through 360 degrees. Now that certainly expands the definition of a mouse gesture, which until now meant movement on a flat, horizontal surface.
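To see how a wrist flick might become a volume command, here is a minimal, purely illustrative sketch. The sensor axis, units, and threshold are assumptions for the example, not Logitech's actual implementation:

```python
# Hypothetical mapping from a gyroscope reading to a volume step.
# Assumption: a wrist flick shows up as a large angular velocity about
# the vertical (yaw) axis; the 2.0 rad/s threshold is invented.

FLICK_THRESHOLD = 2.0  # rad/s, assumed value for illustration

def flick_to_volume_step(yaw_rate):
    """Flick right (large positive yaw rate) -> +1 (volume up);
    flick left (large negative yaw rate) -> -1 (volume down);
    small motions are ignored."""
    if yaw_rate > FLICK_THRESHOLD:
        return +1
    if yaw_rate < -FLICK_THRESHOLD:
        return -1
    return 0
```

The design point is the dead zone: small, unintentional hand motions fall below the threshold and do nothing, so only a deliberate flick changes the volume.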
Let's talk about touch screens and surface computing. The iPhone's interface gives new meaning to the touch screen, which earlier meant using a stylus to select options on a phone or PDA's screen. With the iPhone, you use your fingers not just to tap, but also to pinch and stretch pictures or windows.
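The pinch-and-stretch gesture has a simple geometric core: the zoom factor is the ratio of the distance between the two fingers after the gesture to the distance before it. A small sketch of that idea:

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Zoom factor for a two-finger pinch/stretch gesture:
    finger separation at the end divided by separation at the start."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(p1_end, p2_end) / dist(p1_start, p2_start)

# Fingers spread from 100 pixels apart to 200 pixels apart:
# the picture should double in size.
print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)))  # 2.0
```

A factor above 1 stretches (zooms in); a factor below 1 pinches (zooms out).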
Microsoft's Surface computer is a fine example. I watched a video clip of it and was amazed at its realism. Simply put, it's about surfaces becoming smart. For instance, the table-top screen shows water, and as you 'dip' your finger in the 'water', little ripples form around it, just as they would in a real pond. I'm sure this computerized coffee table will be the centerpiece of living rooms and lounge bars a few years from now.
HP is also using surface computing technology for its kitchen computer. HP Labs is developing its Gesture-based Command and Control (GECCO) technology as open source. And this has great potential for laptops. For instance, you could just draw and doodle on the touchpad to log in. Draw an 'O' and you could start the Outlook e-mail application. Try 'W' for Word and 'M' for Media Player too.
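The letter-to-application idea boils down to a lookup table. Here is a hypothetical sketch (the recognizer itself, and the program names, are assumptions for illustration, not part of GECCO): some component recognizes the doodled letter, and a simple map turns it into a command.

```python
# Hypothetical gesture-to-command map, in the spirit of drawing 'O' for
# Outlook, 'W' for Word, 'M' for Media Player. Program names are examples.

GESTURE_ACTIONS = {
    "O": "outlook.exe",
    "W": "winword.exe",
    "M": "wmplayer.exe",
}

def command_for_gesture(letter):
    """Return the command for a recognized gesture letter,
    or None if no action is mapped to it."""
    return GESTURE_ACTIONS.get(letter.upper())

print(command_for_gesture("o"))  # outlook.exe
print(command_for_gesture("x"))  # None
```

The hard part, of course, is the recognizer that turns a freehand doodle into a letter; once that exists, extending the system is just adding entries to the map.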