@antedelivian: My vote is for a UART for the following reasons...
UART would certainly use fewer pins, but that's not a major concern in this case. Re the framing, parity, and overrun errors, my two MCUs are going to be sitting just a couple of inches from each other -- I don't think there's going to be much of a problem -- and in the worst case, if there were an error, the end result would just be a glitch on the main display. On the other hand, it wouldn't hurt me to add parity just for the heck of it :-)
Protocols are sliced into little threads, which are then executed on any of the available CPUs as needed. Libraries and Virtual Peripherals can then be defined in a completely hardware-independent way and no longer need a specific set of peripherals.
So just reuse -- or better, design the Virtual Peripheral all by yourself and use it in your project. Use an extended version of the Arduino IDE or Atmel Studio. Use the Arduino language or C++. Or add your own custom interface (in VHDL or Verilog); there is still space for it.
I'm not sure if you've ever heard about System Hyper Pipelining, have you? ;-))
So stay tuned and check out my Indiegogo crowdfunding campaign, starting in 3 days. You will hear about it, for sure -- just look for Arduissimo on Indiegogo.
PS: Programming your application on a multicore environment – which could be as easy as programming an Arduino – is a lot more fun.
PPS: After almost 20 years of single-CPU programming, I'm an old man tired of worrying about interrupts breaking my timing-critical single-core protocol, or of reading specifications for yet another implementation of a standard peripheral ;-)
@Sanjib In the electronics domain, 15 years are like 50 years. It is just amazing that your board with that P1 processor is still functional.
It may be easier to scrap the complete board altogether instead of reverse engineering that unique interface with P2.
"Have you ever noticed how each design decision has a ripple effect that influences other decisions downstream?"
-Yes, that is how life is for engineers! :) Currently I am dealing with the consequences of just such a decision, made 15 years back. Fifteen years ago, a design was upgraded to add one Ethernet port while keeping the rest of the hardware the same (for minimum effort), by adding a co-processor (let's call it P2) that had an Ethernet port (MII interface). The main processor on the board (let's call it P1) did not have an MII interface, but it was kept rather than replaced to save effort. The communication between the old main processor P1 and the new co-processor P2 was implemented using a unique interface of P2 which allowed P1 to access shared memory through P2. Now that Ethernet processor P2, which was added to save time back then, is obsolete. The main processor P1 is not yet obsolete, but the original developers have left or retired from the company; there is no straightforward replacement for P2, and none of the modern processors support that unique interface P2 had.
The lesson I have learned from this: whenever a non-standard, not-so-popular interface is used to do something quicker in the short term, there remains a high risk of having to redo things in the long term once that special/unique interface becomes obsolete.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.