Of course, that goes without saying. I meant it is the framework of all the earlier games machines/computers. In actual fact, a modern Pentium-class PC at 3 GHz struggles to do some of the things an Amiga could do (seamlessly, anyway) because of extensive operating system issues (non-realtime behaviour) and, most expensive of all, retargetable graphics. Early machines benefited from the fact that everything was designed from the ground up for a particular screen size and colour depth.
Some of the arcade machines from around the mid-90s used clever hardware techniques to work around the CPU shortcomings of the time; those techniques are computationally expensive to reproduce in software, so there could be issues there as well.
Let's put it into the following frame of reference:
8-bit games machines.
For me personally, those older games were a lot more enjoyable than many of the subversive reality games of today.
Well, it really depends on the complexity of the emulator. Emulating old NES games isn't too difficult; however, if you were to attempt to emulate an Xbox 360, you'd find yourself getting nowhere. The last time I played with emulation, a pretty strong computer struggled with Nintendo 64 emulation. It has been a while, though.
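To give a feel for why an 8-bit console like the NES is a tractable target, here is a minimal, hypothetical sketch of the fetch-decode-execute loop at the heart of a 6502-style CPU interpreter. It handles only three illustrative opcodes; a real emulator also needs the full instruction set, cycle-accurate timing, interrupts, and the graphics/sound hardware, which is where the real effort goes.

```python
# Minimal sketch of a fetch-decode-execute loop for a 6502-style 8-bit CPU.
# Illustrative only: real NES emulation also needs cycle timing, interrupts,
# the PPU/APU, and the full opcode set (~150 official opcodes).

class CPU:
    def __init__(self, memory: bytearray):
        self.memory = memory       # 64 KiB address space
        self.pc = 0x8000           # program counter (typical cartridge entry)
        self.a = 0                 # accumulator
        self.x = 0                 # X index register

    def step(self):
        opcode = self.memory[self.pc]
        self.pc = (self.pc + 1) & 0xFFFF
        if opcode == 0xA9:         # LDA #imm - load accumulator, immediate
            self.a = self.memory[self.pc]
            self.pc = (self.pc + 1) & 0xFFFF
        elif opcode == 0xE8:       # INX - increment X register
            self.x = (self.x + 1) & 0xFF
        elif opcode == 0xEA:       # NOP
            pass
        else:
            raise NotImplementedError(f"opcode {opcode:#04x}")

# Tiny usage example: LDA #$42 followed by INX.
mem = bytearray(0x10000)
mem[0x8000:0x8003] = bytes([0xA9, 0x42, 0xE8])
cpu = CPU(mem)
cpu.step()
cpu.step()
print(hex(cpu.a), cpu.x)           # prints: 0x42 1
```

The point of the sketch is that an 8-bit CPU interpreter is just a loop over a small opcode table; emulating an Xbox 360 or even an N64 means recreating multi-core CPUs, GPUs and custom coprocessors in software, which is a different order of problem entirely.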
David Patterson, known for his pioneering research that led to RAID, clusters and more, is part of a team at UC Berkeley that recently made its RISC-V processor architecture an open source hardware offering. We talk with Patterson and one of his colleagues behind the effort about the opportunities they see, what new kinds of designs they hope to enable and what it means for today’s commercial processor giants such as Intel, ARM and Imagination Technologies.