For simplicity, most ICs, and in general any design above a certain complexity, rely on clock ticks: a CLK signal governs the entire chip. In a manner of speaking, all events happen only when the clock so dictates. This unwritten rule sounds logical to most of us, and in electronics, throwing the clock out used to border on taboo. Are clocks today becoming an obstacle to progress?
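As a rough illustration (plain Python, not any real HDL or chip), the clocked rule boils down to this: combinational logic may compute a result at any moment, but the visible state only changes on a clock edge.

```python
class ClockedRegister:
    """Toy model of a synchronous register: holds a value that
    updates only when tick() is called (the 'clock edge')."""

    def __init__(self, value=0):
        self.value = value    # visible state
        self._next = value    # pending value from combinational logic

    def set_next(self, value):
        # Logic may compute the next value at any time...
        self._next = value

    def tick(self):
        # ...but it only becomes visible on the clock edge.
        self.value = self._next

reg = ClockedRegister()
reg.set_next(42)       # the result is ready, but not yet visible
before = reg.value     # still 0: the clock has not ticked
reg.tick()             # clock edge: the update takes effect
after = reg.value      # now 42
```

Even when the result is ready early, nothing happens until the tick: that enforced waiting is exactly what the rest of the article questions.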
Let’s try to exemplify the concept. On the one hand, picture a large company with perfectly synchronized, punctual workers clocking in and out. Some employees might be faster (and smarter) than others at completing their work, but they are forced to slow down and/or wait until the entire team has finished a task. Likewise, some teams must wait for other teams before everything is properly delivered. On the other hand, imagine a startup consisting of a small team, delivering products through separate pipelines per worker, or per team, without any strict synchronization. In the end, the giant company’s profit does not necessarily exceed the startup’s gains. Along this line of thinking, we could say that microcontrollers are like companies…
But if everything works fine, why go clockless? Is it just cooler, perhaps? Obviously not. Asynchronous microcontrollers, for instance, have real advantages over their slave-to-the-clock fellows. They are extremely low-power, keep electromagnetic noise at infinitesimal levels, and can run faster and more efficiently. Besides, in a device operating asynchronously, the peripherals consume virtually zero power when they are not being used, and the same applies to the CPU while in sleep mode. A clockless chip is also capable of speeding up, or slowing down, according to external conditions (such as ambient temperature). These advantages are not to be sniffed at, considering that some of the most cutting-edge projects are heading toward a clockless approach. For instance, DARPA’s SyNAPSE neuromorphic chips, some with among the highest transistor counts ever produced, operate asynchronously.
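To make “no strict synchronization” a bit more concrete: asynchronous designs typically replace the global clock with local request/acknowledge handshakes between stages, so each stage runs as soon as its data is ready. A loose Python sketch of that idea, with invented names and threads standing in for hardware stages:

```python
import queue
import threading

def producer(channel, items):
    for item in items:
        channel.put(item)    # "request": data on the channel is valid
        channel.join()       # wait for "acknowledge" before sending more

def consumer(channel, results, n):
    for _ in range(n):
        item = channel.get()       # works as soon as data arrives,
        results.append(item * 2)   # at its own pace -- no shared clock
        channel.task_done()        # "acknowledge": ready for the next item

channel = queue.Queue(maxsize=1)   # one-place buffer between the two stages
results = []

t = threading.Thread(target=consumer, args=(channel, results, 3))
t.start()
producer(channel, [1, 2, 3])
t.join()
# results == [2, 4, 6]
```

Nothing here waits for a tick: the consumer stalls only when there is genuinely no data, which is also why idle asynchronous peripherals burn essentially no dynamic power.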
So what will it be? Overclocking, underclocking, or zilch-clocking? Some designers believe that in the future, the preferred technique will be to dispense with clocks altogether. However, naysayers also have good reasons to raise eyebrows. Most design suites are not intended for asynchronous logic, and engineers are rarely trained in the field, which makes the design process extremely time-consuming. Also, within the academic world, aside from certain research groups, the subject remains almost esoteric. Oh, and as for interfacing between synchronous and asynchronous chips: like politics, it’s better not to speak about it!