The year-end prediction lists from technology companies and research firms are - let's be honest - in good part thinly disguised marketing pitches. These are the big trends for next year, and - surprise - our products are tailor-made to help you turn those trends into moneymakers.
But I.B.M. puts a somewhat different spin on this year-end ritual. It taps its top researchers worldwide to come up with a list of five technologies likely to advance remarkably over the next five years. The company calls the list "Five in Five"; the latest edition was released on Monday. And this year's nominees are innovations in computing sensors for touch, sight, hearing, taste and smell.
Touch technologies may mean that tomorrow's smartphones and tablets will be gateways to a tactile world. Haptic feedback techniques, along with infrared and pressure-sensitive technologies, I.B.M. researchers predict, will enable a user to brush a finger over the screen and feel the simulated touch of a fabric, its texture and weave. The feel of objects can be translated into unique vibration patterns, like a tactile version of fingerprints or voice patterns. Those vibration patterns would simulate the distinct feel of fabrics like wool, cotton or silk.
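To make the texture-to-vibration idea concrete, here is a minimal sketch, assuming a hypothetical mapping in which a texture's roughness and weave density determine pulse lengths and amplitudes; the parameters and the mapping rule are invented for illustration, not I.B.M.'s actual technique.

```python
# Hypothetical sketch: turning a fabric's texture "fingerprint" into a vibration
# pattern a phone's haptic motor could play back. The texture parameters and
# the mapping rule below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    roughness: float       # 0.0 (smooth) to 1.0 (coarse)
    weave_density: float   # rough threads-per-millimetre figure

def vibration_pattern(texture: Texture, duration_ms: int = 200):
    """Return (pulse_length_ms, amplitude) pairs approximating the texture's feel."""
    pulse_ms = max(5, int(30 * texture.roughness))     # coarse wool -> longer pulses
    amplitude = 0.3 + 0.7 * texture.roughness          # silk stays faint
    n_pulses = max(1, int(duration_ms / pulse_ms * min(1.0, texture.weave_density / 20)))
    return [(pulse_ms, round(amplitude, 2))] * n_pulses

for fabric in (Texture("silk", 0.1, 40), Texture("cotton", 0.4, 25), Texture("wool", 0.9, 10)):
    print(fabric.name, vibration_pattern(fabric)[:3])
```

Each fabric ends up with its own pattern, which is all the "tactile fingerprint" analogy really requires.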
The coming sensor innovations, said Bernard Meyerson, an I.B.M. scientist whose current title is vice president of innovation, are vital ingredients in what is called cognitive computing. The idea is that in the future computers will be increasingly able to sense, adapt and learn, in their own way.
That vision, of course, has been around for a long time - a pursuit of artificial intelligence researchers for decades. But there seem to be two reasons that cognitive computing is something I.B.M., and others, are taking seriously these days. The first is that the vision is becoming increasingly possible to achieve, though formidable obstacles remain. I wrote a piece in the Science section last year on I.B.M.'s cognitive computing project.
The other reason is a looming necessity. When I asked Dr. Meyerson why the five-year prediction exercise was a worthwhile use of researchers' time, he replied that it helped focus thinking. Actually, his initial reply was a techie epigram. "In a nutshell," he said, "seven nanometers."
Dr. Meyerson's Ph.D. is in solid-state physics, and he's a chip guy. And he was talking about the physical limits on the width of semiconductor circuits, the point beyond which they can't be shrunk any further. (The width of a human hair is roughly 80,000 nanometers.) Today, the most advanced chips have circuits 22 nanometers in width. Next comes 14 nanometers, then 10 and then 7, said Dr. Meyerson.
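A bit of back-of-the-envelope arithmetic with the figures above shows both how small those features already are and how quickly the remaining shrink cycles run out:

```python
# Arithmetic using the figures in the article: a human hair is roughly
# 80,000 nm wide; circuit widths step from 22 nm to 14, 10 and then 7 nm.
HAIR_NM = 80_000
nodes = [22, 14, 10, 7]

for width in nodes:
    print(f"{width:>2} nm: about {HAIR_NM // width:,} features span one hair's width")

# Each cycle shrinks features by roughly a third; after 7 nm, the easy
# geometric "knobs" Dr. Meyerson describes are gone.
print("shrink per cycle:", [f"{1 - b / a:.0%}" for a, b in zip(nodes, nodes[1:])])
```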
"We have three more cycles, and then the biggest knobs for improving performance in silicon are gone," he said. "You have to change the architecture, use a different approach."
"With a cognitive computer, you train it rather than program it," Dr. Meyerson said.
The cognitive path, if successful, would raise a machine's level of recognition of the world. Today, computers mimic human intelligence with brute force, collecting a vast amount of data and then sifting for statistical patterns that identify specific words, images, biological or chemical compounds.
But a cognitive computer, Dr. Meyerson said, would "not have to go for all the fine detail, but go up and see the interesting thing. This is about moving computing way, way up from where it is today."
He offered a sensor-based example. The computer is presented with two piles of white powder; one is salt and the other is sugar. "It could taste the difference, without having to do a detailed chemical analysis," Dr. Meyerson explained.
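The salt-and-sugar test is also a handy way to see the "train it rather than program it" distinction. A minimal sketch, assuming hypothetical taste-sensor readings (the feature values are invented, and a simple nearest-centroid rule stands in for whatever learning method a cognitive system would actually use): rather than coding rules about chemistry, you show the machine labeled samples and let it find the pattern.

```python
# "Train it rather than program it": each sample is a hypothetical taste-sensor
# reading (sweetness_response, saltiness_response); the data are invented.

def centroid(samples):
    return tuple(sum(values) / len(samples) for values in zip(*samples))

# Labeled examples -- the "training" step.
training = {
    "sugar": [(0.90, 0.10), (0.80, 0.20), (0.85, 0.15)],
    "salt":  [(0.10, 0.90), (0.20, 0.80), (0.15, 0.85)],
}
centroids = {label: centroid(samples) for label, samples in training.items()}

def classify(reading):
    """Label a new reading by its closest learned centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(reading, centroids[label]))

# A new pile of white powder is "tasted" rather than chemically analyzed.
print(classify((0.12, 0.88)))  # -> salt
```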
Cognitive computers, by learning some tricks from the way human brains compute, could in theory deliver big energy savings. Watson, I.B.M.'s Jeopardy-winning computer, is a very clever machine; it defeated its human rivals last year. But it runs on 85,000 watts of electricity. The human brain hums along on 20 watts.
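The gap is easy to put a number on from those two figures:

```python
# Ratio of the power figures cited above.
watson_watts, brain_watts = 85_000, 20
print(f"Watson draws roughly {watson_watts / brain_watts:,.0f} times the power of a brain")
# -> Watson draws roughly 4,250 times the power of a brain
```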
"We need to make the machines much, much more efficient," Dr. Meyerson said.