Another issue, as I understand it, is literally the clock. Way back, early computers were a lot less reliable than the ones now, and it made sense to have a system clock that forced the CPU to literally stop, check all its work, correct errors, and continue. Now error correction isn't nearly as necessary, and the clock has become a limiting factor. Intel started experimenting with clockless chips a few years ago and found they could make significant speed gains by eliminating it. The problem is, all the architecture up to now has been built around that clock, and to change it now...
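To make the clocked-vs-clockless point concrete, here's a toy back-of-the-envelope sketch in plain Python (all the numbers and the four-stage setup are invented for illustration, not taken from any real chip): a clocked design has to budget every cycle for the worst-case stage delay plus a safety margin, while a self-timed design only pays each stage's actual, data-dependent delay.

    import random

    # Toy numbers, purely illustrative: each stage's actual delay varies with
    # the data, but a clocked design must budget for the worst case plus margin.
    WORST_CASE_NS = 1.0          # slowest any stage can be (hypothetical)
    CLOCK_MARGIN = 1.25          # safety factor a clocked design adds on top
    STAGES = 4

    def stage_delay():
        """Actual, data-dependent delay of one stage (made-up distribution)."""
        return random.uniform(0.3, WORST_CASE_NS)

    def clocked_run(n_ops):
        """Globally clocked: every operation costs a full clock period per stage,
        sized for the worst case, no matter how quickly the logic really settled."""
        period = WORST_CASE_NS * CLOCK_MARGIN
        return n_ops * STAGES * period

    def self_timed_run(n_ops):
        """Clockless / self-timed: each stage signals 'done' as soon as its
        logic actually settles, so on average you pay less than the worst case."""
        return sum(stage_delay() for _ in range(n_ops * STAGES))

    if __name__ == "__main__":
        random.seed(0)
        ops = 100_000
        print(f"clocked:    {clocked_run(ops):12.1f} ns")
        print(f"self-timed: {self_timed_run(ops):12.1f} ns")

Running it, the self-timed total comes out noticeably lower, which is the basic reason eliminating the clock can buy speed; the catch, as above, is that everything else in the architecture assumes that clock is there.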
Another way to make a huge leap in computing power/speed: analog.