Craig Mundie Chief Research and Strategy Officer, Microsoft Corporation
The way I think about this ultimate ubiquitous computing concept is that the microprocessor itself, as it has become more powerful while getting physically smaller and smaller, has brought us to the point today where you can pretty much put a computer in anything you want to put a computer in. We have little sensors about the size of a grain of sand that power themselves from ambient movement and temperature, and they have a computer in them. So in the sense that we can have a computer anywhere we need a computer, we're sort of there now.
The problem is that they're all islands of computation. And it's too hard for people to manage all this disparate computing, and what do you do with it? I think the next big shift that is upon us – it's a big focus here at Microsoft – is this transition from the traditional graphical-oriented man/machine interfaces to what we think of as natural interfaces.
And with that, I think two things will happen. We'll elevate the semantic level of the conversation between people and computers, because the computer, instead of being a collection of these devices, will appear as one big integrated system rather than a bunch of disjoint little computers that you have to go deal with one at a time. And because you can deal with the computer more like another person, you'll increasingly come to think of it as a helper or an assistant, and less like a tool. These are the two big changes that I think are coming now, and they will be critical to getting real value out of the ubiquitous presence of computing capability, where you're essentially living inside a computing environment.
I think there are two aspects of naturalness in my mind. One aspect is that the computer has to emulate more and more of the human senses at a quality level that really is human-like, and it has to handle several of them at a time. If you deal with a person who for one reason or another has lost one of the principal human senses, it becomes more challenging to interact with them. Computers in the past weren't powerful enough to do a good job on even one sense, let alone multiple senses at the same time. We're now crossing that threshold: we can do a pretty good job on the important human senses, and we can do several of them at once.
Machine vision. Speech synthesis. Speech recognition. Certainly touch and gestures as a function of speech. Those things we can do really quite well now. As a result, you can now compose those things together to make an interaction model that is more like another person and less like an artifact like a keyboard or mouse. The other aspect of naturalness is that you don't really want to have a learning curve. You want to take what you know in your natural life and find that that's all you really need to be able to get the computer to do something to help you.
John Seely Brown Visiting scholar at USC, Independent Co-Chairman of the Deloitte Center for the Edge
If we go back to the foundation, which is Mark Weiser, the penetrating genius of ubiquitous computing really is Mark's. The idea that drove both of us was a phenomenological one. That is to say, how do you actually enable technology to disappear? How does it stop registering as a separate part of your life? Now, in some sense you might say that any piece of technology, if it gets old enough, stops being thought of as technology. But I'm talking about something that co-mingles with work practices so thoroughly that there's no obvious boundary.
So I still go back to two of our original stories that I still live with, because I'm a fanatical driver. Think, for example, about ABS brakes on a high-performance car. Here's a case where the computer system and the sensors inside the car detect on their own that you are in trouble. The brakes are locking; the system can determine that with incredible precision, and it can quietly take over and put you back in control of something you had actually lost control of, often without your even knowing it. It enables you to become a much safer, higher-performance driver, and the technology is willing to stay in the periphery, then seamlessly move into the center and take over when you need it, and then beautifully fade away again; you might not even be aware of what happened.
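The sense-and-intervene loop Brown describes can be caricatured in a few lines of code. This is a deliberately simplified sketch, not how a production ABS controller works; the function names, slip threshold and pressure factors below are all invented for illustration.

```python
# Illustrative sketch of an ABS-style control loop: compare wheel speed
# to vehicle speed to detect lock-up, then modulate brake pressure.
# All thresholds and names are hypothetical, chosen only to show the idea.

def wheel_slip(vehicle_speed, wheel_speed):
    """Slip ratio: 0.0 means the wheel rolls freely, 1.0 means fully locked."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def abs_step(brake_pressure, vehicle_speed, wheel_speed,
             slip_threshold=0.2, release_factor=0.7, reapply_step=5.0):
    """One iteration of the loop: ease off the brake when slip exceeds
    the threshold, reapply pressure once the wheel recovers."""
    if wheel_slip(vehicle_speed, wheel_speed) > slip_threshold:
        return brake_pressure * release_factor   # wheel locking: release
    return brake_pressure + reapply_step         # wheel rolling: reapply

# Run many times per second, this modulation stays "in the periphery":
# the driver only feels smooth, controlled braking.
```

The point of the sketch is the shape of the interaction, not the numbers: the system watches continuously, intervenes only when a threshold is crossed, and hands control back as soon as the situation recovers.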
And so that's a sense, to us, in which ubiquitous computing is something that sits in the periphery and is there to help us. It moves in to help us. It's just part of our natural practices, and you don't even think about it as technology. It's like a question I love to think about: when I go to look at a new car, I never ask what the car's operating system is, but with a phone or a computer, the operating system is the first thing I think about. With those devices, no matter how used to the technology I am, it's very present. Whereas there's a huge amount of computational capability inside a car, yet it's just there in a way you never think about. It simply helps you be a better performer.
The idea is that nearly everything we own would contain small computers. Phones, watches, vehicles, toys, tools and perhaps even clothing would have some sort of computing element or chip. Internet connections would be practically everywhere. Technologies like LTE or WiMAX -- or their descendants -- would let us maintain a persistent connection to the Internet all the time. In such a world you could walk down the street and intersections would know when you were approaching, adjusting traffic patterns as you got closer so that you could cross the street without waiting. Advertisements would be tailored just for you. You'd always know where your friends were. Such a future raises many questions about culture, privacy and other concerns normally unrelated to computing. Perhaps in 100 years, privacy will be a foreign concept.