I used to help manage compatibility at Symbian (the smartphone operating system company), and I agree it's a constant dilemma. If you want your customers to buy your latest product, they need to know how much time and effort will be involved in migrating their old applications, hardware, working methods and so on. It's no good just saying "we don't support the old product any more" and expecting customers to stay loyal ([cough] Windows XP).
It's a dilemma for customers as well. They need a migration path - information about how and when to modify anything that relies on compatibility with your product - so they can assess what will continue to work after an upgrade and what won't, estimate the costs and timescales, and weigh those costs against the benefits of upgrading. If you spell that out for customers, they will tell you how much effort to spend on compatibility. As you say above, "Some of the best ideas come from our customers."
You may be surprised to find some old (and even new) features are almost completely unused, and hence there's no point spending time and money supporting them. They should be deprecated and eventually removed. On the other hand, you may be pleasantly surprised to find that some original features you thought were obsolete are actually still heavily used (even though better alternatives exist), but no-one complains because they "just work".
If breaking compatibility is more in your interest than your customer's, it may be worth making a migration tool or offering free assistance if an upgrade goes wrong.
The Intel 8086 is in turn backward-compatible at the assembly language level to the 8-bit Intel 8080 and 8008. Intel provided a tool to machine-translate 8080 ASM source code into 8086. For the most part it was fully automatic, but there were a few instructions that needed manual intervention. Even so, it made it cheap to port 8080 software to 8086, providing instant software for 8086 and 8088.
The automatically-translated code used 8-bit registers and didn't take full advantage of 8086's 16-bit instructions, so you didn't get any performance improvement. If memory serves, the new 4.77 MHz IBM PC ran early software slower than an 8 MHz Z-80 computer in spite of the PC's 16-bit 8088 processor.
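To illustrate the idea (this is a toy sketch in Python, not Intel's actual translator), the core of such a source-to-source translator is a register-mapping table plus per-instruction rewrite rules. Intel's published mapping pairs the 8080 registers with 8086 register halves (A→AL, B→CH, C→CL, D→DH, E→DL, H→BH, L→BL, memory-via-HL→[BX]), which is exactly why the output stayed 8-bit. Instructions with no clean one-to-one equivalent get flagged for the manual intervention mentioned above:

```python
# Toy sketch of table-driven 8080 -> 8086 assembly source translation.
# Handles only a tiny subset of the instruction set, for illustration.

# Intel's register mapping: 8080 register -> 8086 equivalent.
REG_MAP = {"A": "AL", "B": "CH", "C": "CL", "D": "DH",
           "E": "DL", "H": "BH", "L": "BL", "M": "[BX]"}

def translate(line):
    """Translate one 8080 instruction into 8086 syntax (subset only)."""
    parts = line.split(None, 1)
    op = parts[0].upper()
    args = [a.strip() for a in parts[1].split(",")] if len(parts) > 1 else []

    if op == "MOV":   # MOV dst,src -> MOV dst,src with mapped registers
        return f"MOV {REG_MAP[args[0]]}, {REG_MAP[args[1]]}"
    if op == "MVI":   # MVI r,imm   -> MOV r,imm
        return f"MOV {REG_MAP[args[0]]}, {args[1]}"
    if op == "ADD":   # ADD r       -> ADD AL,r (8080 accumulator is implicit)
        return f"ADD AL, {REG_MAP[args[0]]}"
    # Anything else is flagged for a human, like the real tool's edge cases.
    raise ValueError(f"needs manual translation: {line}")

print(translate("MOV A,B"))    # MOV AL, CH
print(translate("MVI C,10H"))  # MOV CL, 10H
print(translate("ADD E"))      # ADD AL, DL
```

Note how every translated instruction still operates on 8-bit register halves (AL, CH, DL, ...), never the full 16-bit AX/CX/DX - which is why the ported code gained no speed from the wider processor.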
Backward and forward compatibility of programmable interfaces is feasible and has some other significant advantages (e.g., it allows proprietary versions). There is no need to trade off future improvements against the past if an interface is designed as described in published papers. See "etiquettes" in "Fundamental nature of standards: technical perspective", IEEE Communications Magazine, Vol. 38, No. 6, June 2000, p. 70, or isology.com/pdf/fundtec.pdf
Quote: "Innovation should not be restricted for compatibility reason."
Who said anything about restricting innovation? Innovate as fast as you can, otherwise the competition will get all the new customers. But if you want your old customers to stay with you as you innovate, they need a migration path. In practice, this often means continuing to support the old way of things for a while in parallel with the new innovations.
Sure, that takes time and effort (which I guess you could argue is a "restriction") but the alternative is losing old customers. It's much more expensive recruiting new customers than keeping old ones.
In the era of the old and revered Nokia 5110 GSM cellphone, they stuck with the same form factor and interface for quite a while. So if you had a car kit or a personal hands-free headset that worked with your old phone, it would work with your new phone too. This went on for quite a while, into the CDMA era. Then they changed it for no apparent reason; the new phones were not that much smaller or thinner. When this happened, my employer (who used a lot of cellphones and accessories) began using other companies' gear. I think Nokia lost a lot of customers like that, and today they are a shadow of their former self (not for that reason alone, though). So though consistency may be a hobgoblin, sometimes it can work in your favour, if you get everything else right.