Will computing power ever reach its limit?
Answered by Mario Paniccia, Jason Howard and 3 others

  1. Mario Paniccia Intel Fellow, Director, Intel Photonics Technology Lab


    I think Moore's Law will go on forever. It just will morph. So the doubling concept -- I think we're on a treadmill that we will never get off. But think about what doubling is. First we're talking about continuing to shrink transistor dimensions. Now we've gone to 3D transistors to improve on that. But when we run out of space this way, and the gates are small this way and we run out of atoms, we'll then go to stacking. So if I take two die and I put them together, I have multiple cores and multiple die, and now I can still double. It's like big cities -- in New York, you just start building up.

    So, it will be a morph of Moore's Law. But I think anybody who's predicted Moore's Law ending has always been wrong. And I think this is one of those things, where even if I look at the last 15 years, every five years everyone says, "Well geez, where do I go after this process generation?" Then you fast-forward five years, and you see two more generations have been figured out. I think it's amazing what happens when people are given the challenge of a must-do thing, we have to do it. And innovation just keeps morphing.

    More answers from Mario Paniccia »

  2. Jason Howard Senior Technical Researcher, Intel Labs


    I think there are going to be physical limits. If I had enough power, maybe the power of the sun, and harnessed that, that would be a physical limit of how much computation I can do. But in the near future the limit is almost endless from what we can see. You know, we'll just keep pushing it higher and higher, to greater and greater scales, but I think at some point there will be a limit -- but not in my lifetime.

    More answers from Jason Howard »

  3. John Oliver Senior System Architect, Intel


    Well, because of Moore's Law, we are able to get more and more computing power with less and less energy. The form factor is shrinking dramatically. What's great about Intel is that if you want to know what's going to happen in the future -- the next five, six, seven years -- we're creating that right now. That gives me a unique perspective. I have a crystal ball. I can see the future, at least that far out.

    I know what kind of processing capability, what kind of thermals and power needs I have, and what kind of packaging I can do in that timeframe and the costs associated with that.

    With that knowledge, I can start creating capabilities that others aren't even thinking about yet: smaller, more powerful computers. Computers today -- we've been struggling to get them over a threshold. I mean, you can only do so much gaming and so many spreadsheets and so many e-mails. It's kind of a joke. That's caveman mentality with computers.

    A lot of people think that's all computers can offer and that we've hit some kind of plateau. We're not hitting a plateau. We're about to get to a launching point. We're about to get to a level of performance and capability that a lot of people like myself have been waiting on for a long, long time, one that will enable us to step into a new realm, a new paradigm that has never existed before. Things are going to get really exciting this next decade. I think we're right on the edge of that next leap. We're about to take it.

    More answers from John Oliver »

  4. Jonathan Strickland Senior Writer, HowStuffWorks
    Back in 1965, Gordon Moore observed that microchip manufacturers were managing to cram twice as many transistors on a square inch of silicon as they could two years previously. In other words, they were doubling the number of transistors they could fit on a chip every 24 months. His observation became known as Moore's Law. While it's not a physical law of the universe, it has become a self-fulfilling prophecy.
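    The doubling Moore observed is a simple exponential, which can be sketched in a few lines. The Intel 4004 starting figure below is an illustrative assumption, not from the article:

```python
# A minimal sketch of Moore's observation: transistor counts doubling
# every 24 months. The 1971 Intel 4004 starting figure (~2,300
# transistors) is illustrative.
def transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count, assuming one doubling per period."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

# Five doublings from 1971 to 1981 multiplies the count 32-fold.
print(round(transistors(2_300, 1971, 1981)))  # 73600
```

    Ten years of doubling every 24 months gives a 32x increase; another decade squares that growth factor, which is why the curve outruns intuition so quickly.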

    Every few years, someone will declare that Moore's Law's days are numbered. Gordon Moore himself has said that it would end. But manufacturers still find ways to push computing power on a schedule comparable to Moore's initial observation. Could the law remain true indefinitely?

    The answer depends on whether we find an alternative to traditional microprocessors. Companies like Intel have pushed microprocessor development by adding features like multiple cores and multithreading. But we'll eventually have to find an alternative to classic chips because the old way simply won't work anymore. It's all because of quantum mechanics.

    On the quantum level, the world seems to go topsy-turvy. Quantum particles like electrons are mysterious -- we can't be entirely certain where they are at any particular moment. What we can do is associate their location with a probability cloud. As an electron approaches a barrier -- such as the gate of a transistor -- part of that cloud might extend beyond the gate to the other side. That means there's a chance that instead of being on one side of the gate, the electron is on the other. And sometimes, that's just what happens. The electron appears on the other side of the gate as if it had tunneled through. We call this quantum tunneling.
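    The tunneling effect described above can be made concrete with the standard rectangular-barrier approximation, T ≈ exp(-2κL) with κ = sqrt(2m(V−E))/ħ. The 1 eV barrier height and nanometer widths below are illustrative assumptions, not figures from the article:

```python
import math

# Rough tunneling probability through a rectangular barrier, using the
# standard exp(-2*kappa*L) approximation. The 1 eV barrier (above the
# electron's energy) and nanometer widths are illustrative.
HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7015e-31   # electron mass, kg
EV = 1.602_176_634e-19     # joules per electron-volt

def tunneling_probability(barrier_ev, width_m):
    """Approximate probability that an electron tunnels through."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# Shrinking the barrier from 3 nm to 1 nm raises leakage sharply.
for width_nm in (3, 2, 1):
    print(f"{width_nm} nm: {tunneling_probability(1.0, width_nm * 1e-9):.2e}")
```

    The exponential dependence on width is the whole story: each nanometer shaved off a gate multiplies the tunneling probability by orders of magnitude, which is why leakage becomes unmanageable at very small scales.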

    Why does this matter with computers? Well, electronic computing centers on channeling electrons. If the electronic elements are so small that electrons can tunnel through, our computers will produce errors and miscalculations. We'll have reached the physical limit of classical computers.

    But that doesn't mean computing power will stagnate. We may find alternatives to the classic approach. That could include quantum computers, which use qubits to perform calculations. Or it may require us to create biological computers out of DNA. In the end, Moore's Law is all about human ingenuity -- a resource that seemingly knows no bounds.


    More answers from Jonathan Strickland »

  5. Science Channel
    Moore's Law -- the observation that the number of transistors on a chip doubles every 18 months -- has kept the computer industry hopping, but there may be a limit to how far it can go. It is true that nanotechnology is allowing scientists and engineers to make ever smaller transistors, but there will likely be a point at which they can't get any smaller. In addition, when working down at the nano level, quantum physics starts to come into play, which can make for some strange, unpredictable behavior.
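    The "can't get any smaller" limit can be rough-sketched: each density doubling shrinks linear feature sizes by about sqrt(2), and silicon atoms sit roughly 0.2 nm apart. The 10 nm starting size and the atomic spacing below are both rough, illustrative assumptions, not figures from the article:

```python
import math

# Rough sketch: how many more density doublings fit before a feature
# approaches atomic scale? The 10 nm starting size and ~0.2 nm silicon
# atomic spacing are illustrative assumptions.
def doublings_until_atomic(feature_nm, atomic_nm=0.2):
    """Count doublings; each shrinks linear feature size by sqrt(2)."""
    count = 0
    while feature_nm / math.sqrt(2) >= atomic_nm:
        feature_nm /= math.sqrt(2)
        count += 1
    return count

print(doublings_until_atomic(10.0))  # 11
```

    Eleven doublings at roughly two years apiece is only a couple of decades, which is why pure shrinking must eventually give way to other tricks, such as the die stacking described earlier.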

    More answers from Science Channel »

