Contrary to current trends, the CPU may get bigger in the future. Yes, CPUs are physically larger today than they were in the past, but they also pack in far more transistors. The future may involve larger CPUs with a much lower density of transistors. Why? Because of optics.
The idea of purely optical computers—and hybrid electronic-optical computers—is not new. But a set of recent advances is the first time I’ve thought we might be entering an era where some functions beyond long-distance communication will be handled optically.
Have you seen the light?
There are two properties of optical computers that make them attractive. The first is that signals are naturally fast: light pulses travel at (yes) the speed of light. The second is that when light switches light (the optical equivalent of a transistor), the switching itself is very fast—think femtoseconds, which are 10⁻¹⁵ seconds. These two properties combine to make optical computers potentially much faster than electronic computers.
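To get a feel for those timescales, here is a back-of-the-envelope sketch. The numbers are illustrative assumptions (a femtosecond optical switch versus a picosecond-scale electronic gate delay), not figures from any specific device.

```python
# Back-of-the-envelope comparison of optical vs electronic switching times.
# All values are illustrative assumptions, not measurements of a real device.
c = 3.0e8             # speed of light in vacuum, m/s
t_optical = 1e-15     # femtosecond-scale optical switching time, s
t_electronic = 1e-12  # picosecond-scale electronic gate delay (assumed)

# How far a light pulse travels during one optical switching event:
distance_nm = c * t_optical * 1e9   # convert metres to nanometres

# Rough speed advantage of the optical switch:
speedup = t_electronic / t_optical

print(f"Light travels {distance_nm:.0f} nm in one femtosecond")
print(f"A femtosecond switch is ~{speedup:.0f}x faster than a picosecond gate")
```

The first line of output also hints at the scale problem discussed below: in one femtosecond, light only travels a few hundred nanometres, so very fast switching and long optical paths are in tension.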
The downsides are directly related to the upsides. Using light to switch light is generally inefficient, meaning that you spend a lot of energy per computation. Likewise, light travels fast, but it also spreads out as it propagates, meaning that components have to be separated by relatively large distances.
The middle ground is a hybrid device. Light carries the information, but switching is performed electronically. Essentially, the light has to be absorbed to generate a current. The generated current is then used to modulate another optical signal to create an optical transistor.
Materials capable of absorbing the light (and generating electrons) would normally be quite large, so, from the point of view of the electrons, the absorber behaves like a large capacitor. The electronic response is then limited by the time it takes to charge and discharge that capacitor. The same story repeats itself when it comes to modulating the flow of light: a block of material has to charge and discharge.
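The capacitor argument can be made concrete with the standard RC time constant, τ = R × C: a bigger absorbing block means more capacitance, which means a slower charge/discharge cycle. The resistance and capacitance values below are assumptions chosen purely to illustrate the scaling, not properties of any actual device.

```python
# Why a large absorber slows the device: the electronic response is limited
# by the RC time constant tau = R * C, and capacitance grows with the size
# of the absorbing block. All values are illustrative assumptions.
def rc_time_constant(resistance_ohm: float, capacitance_farad: float) -> float:
    """Characteristic charge/discharge time of an RC circuit, in seconds."""
    return resistance_ohm * capacitance_farad

R = 50.0         # ohms: an assumed load resistance
C_small = 1e-15  # 1 fF: a tiny, well-confined absorber
C_large = 1e-12  # 1 pF: a much larger absorbing block

print(rc_time_constant(R, C_small))  # ~5e-14 s: tens of femtoseconds
print(rc_time_constant(R, C_large))  # ~5e-11 s: tens of picoseconds
```

A thousand-fold increase in capacitance costs a thousand-fold in response time, which is why shrinking the absorbing material matters so much for hybrid electronic-optical devices.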