Not everyone working on CPU architecture is with one of the big manufacturers.
We typically only hear news about the major players in the CPU game, so it is refreshing to learn about a small group of people hoping to make big changes in CPU architecture: Out-of-the-Box Computing. "The Mill" is the name of their processor. In this interview conducted by Hackaday, Ivan Godard explains what the Mill is and how it is different.
In this video, Ivan covers the basics. He discusses the history of Out-of-the-Box Computing and the ideas and inspiration behind the Mill CPU. Citing the stagnation in processor advancement after the RISC vs. CISC wars, he says that his group knew they could "do better." They identified a huge gap between the price points of the embedded world and the desktop-computer world, and reasoned that if they could fill that massive gap, they could have something really big.
When asked if their intent was simply to produce a core, or to go all the way to producing silicon themselves, Ivan responded with this:
"Intel's quarterly dividend is bigger than ARM's annual sales. Consequently yes, we would like to be a chip company. The fallback option, of course, is that we can be an IP house."
Another quote that really stood out is:
"We really are a great supercomputer chip. Nobody makes any money at it, but they'll do anything to get more -- and we're more."
In Part 2, we learn a bit about the internals of the Mill CPU. Ivan compares it to a standard DSP, but points out that the Mill's advantage lies in using roughly 10% of the power of other chips for the same computations. It does this by completely rethinking how instruction sets are handled, a topic he covers in depth in this video. He points out that even though many people may not need the Mill's higher computational power, or even its lower power usage, its more efficient use of die area should allow for higher yields in manufacturing.
While discussing the issue of expanded memory bandwidth, Ivan pulled out this gem, which is particularly amusing:
Some years back, we took a proposal to Lawrence Livermore and they said, "Can this be built?" We had to go to a partner (it was LSI Logic at the time) and say, "Can you build us a chip with 2,700 pins?" And they swallowed real hard and said yes.
He went on to note, "The yield will be horrible, it will be incredibly expensive, and some people will want it." He discusses the memory interface in depth in this video, showing that the Mill in fact needs fewer memory accesses, not more bandwidth; the rough estimate is 25% fewer memory accesses than comparable processors.
On the topic of difficulties they are facing, he explains that money is a huge hurdle. They have to replace their entire toolchain, port an operating system, and file over 50 patents.
Continue on the next page for parts 3 and 4.
In Part 3, Ivan talks a bit about the structure of the company. At this point, it is a completely self-funded "company of incorporators," a common arrangement in which contractual agreements specify how things will eventually be divided. They have been working on this for ten years, so the team has turned over many times.
When asked how close they are to producing something, Godard offers a best estimate of roughly one year to 18 months. He elaborates that it could be two to three years before they have "saleable silicon," but again, this is all an extremely rough estimate.
If you're wondering why others haven't done all this already, Godard has some thoughts for you. He offers his theories on why others haven't explored this approach, pointing out that while others do make changes, those tend to be fairly small, incremental improvements to existing designs.
This interview was quite enlightening. The Mill seems like something we'll be hearing a lot more about in the coming years. I'm very curious to see how much of the projected power efficiency actually carries over from simulation to physical silicon. I'll be keeping my eye on these folks for sure.