The following are some of the mis-steps taken in the design of computer architectures that need to be undone:
1. Adoption of the Von Neumann Architecture.
This led to the following problems:
a. Machine Code use as the means of programming, rather than patching together the hardware in real time
using patch cords (somewhat like connecting calls manually at a manual telephone exchange).
b. This soon gave rise, despite von Neumann's vehement opposition to "wasting" valuable computer time on
noncomputational tasks, to programs that could translate other programs, read in from punched cards or paper
tape (and converted to character codes such as ASCII or EBCDIC), from Assembler Mnemonics into
Object (Machine) Code.
c. This led to the freezing of the hardware architecture in the interests of forward compatibility of the hardware with all the advances then being made in software design:
i. Development of program "Monitors" for supervising the running of jobs
ii. Development of Autocoder programs in the US that converted assembly language to object (machine) code.
iii. Development of Autocode programs in the UK that converted "high level" languages to machine code.
iv. The development of languages such as FORTRAN, COBOL, and ALGOL, which allowed the computer to be programmed
at a higher level instead of in the painful lower-level assembly language or machine code.
Here are the problems associated with these developments:
Since hardware design was beyond the ken of these new software developers, they got into the habit of encapsulating the hardware in cocoons of software that hid its hardness from sight and swept it under the carpet: machine language was hidden by assemblers, assembly language by high-level-language compilers, and so on. All of these layers remained as vestigial artifacts of computation, and, being software designers, they never tried to get rid of them.
v. Hardware innovation was limited to making the Von Neumann Architecture faster. This let complexity skyrocket while basically letting the essential von Neumann architecture stagnate, and produced a band-aid approach to speed improvement. The band-aids included pipelining, branch prediction and instruction prefetching, data and instruction caches, translation lookaside buffers, virtual memory hierarchies, loop unrolling, register renaming, and on and on, each giving smaller and smaller benefits at greater and greater complexity.
vi. The invention of pointers and data structures, all based on binary objects stored in memory. A slight error in pointer arithmetic could start code execution from a place not corresponding to an instruction boundary, or similarly displace data accesses from their intended frames of reference, leading to program runaway. At the same time, reuse of the hardware ensured that debugging was difficult, as all the evidence of what had happened was continually being overwritten.