I should write up a longer technical document on this, but in the meantime here is this short(-ish) blog post. Everything you know about RISC is wrong. It's some weird nerd cult. Techies frequently mention RISC in conversation, with other techies nodding their heads in agreement, but it's all wrong. Somehow everyone has been mind-controlled into believing wrong concepts.
An example is this recent blog post, which starts out saying that "RISC is a set of design principles". No, it wasn't. Let's take that claim as the starting point for discussing this odd cult.
What is RISC?
Because of the march of Moore's Law, more and more parts of a computer could be squeezed onto a single chip each year. When chip densities reached the point where almost an entire computer could fit on one chip, designers had to make tradeoffs: deciding what needed to be included, what needed to change, and what could be discarded to make it fit.
RISC is a set of creative tradeoffs, meaningful at the time (the early 1980s), but meaningless by the late 1990s.
The interesting part of CPU evolution is the four decades between 1964, with IBM's System/360 mainframe, and 2007, with Apple's iPhone. The defining issue was a 32-bit core with memory protection, using virtual memory to isolate programs from each other. These were real computers in the modern sense: a real computer has at least a 32-bit core and an MMU (memory management unit).
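To make "MMU" concrete: conceptually, an MMU translates every virtual address a program uses into a physical address via a page table, and traps any access the table doesn't permit, which is what isolates programs from each other. Here's a toy sketch in C of that translation step. The page size, table layout, and names are invented for illustration; no real MMU uses this exact format, and real hardware does this in silicon, not software:

```c
/* Toy sketch of MMU-style address translation (invented layout,
 * single-level page table, simplified for illustration). */
#include <stdint.h>
#include <stdio.h>

#define PAGE_SIZE 4096u   /* 4 KiB pages, a common choice */
#define NUM_PAGES 16u     /* tiny virtual address space for the demo */

/* One page-table entry per virtual page: where it lives physically,
 * and whether the process may touch it at all. */
struct pte {
    uint32_t phys_frame;  /* physical frame number */
    int      present;     /* 0 => access traps (a page fault) */
};

static struct pte page_table[NUM_PAGES];

/* Translate a virtual address to a physical one; returns -1 to stand
 * in for the trap a real MMU would raise on a bad access. */
static int64_t translate(uint32_t vaddr)
{
    uint32_t vpage  = vaddr / PAGE_SIZE;
    uint32_t offset = vaddr % PAGE_SIZE;

    if (vpage >= NUM_PAGES || !page_table[vpage].present)
        return -1;  /* page fault: isolation is enforced here */

    return (int64_t)page_table[vpage].phys_frame * PAGE_SIZE + offset;
}

int main(void)
{
    /* Map virtual page 2 to physical frame 7. */
    page_table[2] = (struct pte){ .phys_frame = 7, .present = 1 };

    /* Mapped: prints 7*4096 + 42 = 28714. */
    printf("%lld\n", (long long)translate(2 * PAGE_SIZE + 42));
    /* Unmapped page: prints -1, i.e. the access would fault. */
    printf("%lld\n", (long long)translate(5 * PAGE_SIZE));
    return 0;
}
```

The point of the sketch is that the page table, not the program, decides which memory is reachable: the OS gives each process its own table, so one program simply has no addresses that reach another program's memory.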
The mid-1970s saw the release of the Intel 8080 (1974) and MOS 6502 (1975), but these were 8-bit systems without memory protection. This was the point of Moore's Law where we could get a useful CPU onto a single chip.
In 1977, DEC released its VAX minicomputer, with a 32-bit CPU and an MMU. Real computing had moved from insanely expensive mainframes filling entire rooms to less expensive machines that merely filled a rack. But the VAX was still far too big to fit onto a chip at the time.
The really interesting evolution in real computing happened in 1980 with Motorola's 68000 (a.k.a. the 68k), essentially the first microprocessor to support real computing.
But this comes with caveats. Making a microprocessor required creative decisions about what to leave out. In the case of the 68k,
[…]