We are using software extensively in every embedded project we undertake. The good news is that there are embedded engineers out there who can wear both hats - chip-level HW design and bare-metal programming. The bad news is that they are in short supply. The entire embedded industry is moving toward data-centric platforms on both the consumer and industrial sides, so data scientists with experience in both are another valuable resource to add, if you can find one.
I couldn't agree more about co-design. Like you, I've been on projects where the HW design was essentially completed without involving even the SW architects at an early stage. You very clearly illustrate the potential costs of making that kind of mistake.
If you have hardware teams and software teams you're screwed!
Embedded systems should be co-designed.
It is not enough just to build hardware with power-saving modes or similar features. The interfaces need to be presented so that the software can readily access them in a usable fashion.
What seems to be a trivial issue from a software (or hardware) point of view can be massive from the other side of the fence.
I once worked on a project where a peripheral could be connected via either a PCI-equivalent bus or an internal USB bus. The HW engineer chose the PCI bus. The impact of this decision was the addition of a PCI software subsystem - and really complex sleep-mode power handling. The USB option would have been trivial to manage. Net difference: approximately four extra man-months of effort, a two-month delay in product shipment, and an emergency software patch release.
Thanks! There is also the consideration that changing a chip is hard and expensive compared to changing software (no, I'm not playing down software engineering cost but it does pale a little when compared with the cost of a set of masks...). So, chip designers do need to think a lot longer and harder (and model a lot more) in order to make the "right" decisions. An iterative process is not really viable and trial-and-error is very expensive. SW developers do have the advantage of being able to iterate designs much more quickly and try things out over a longer period.
But SW engineers are newer at the game and have a lot to learn about designing in power-efficiency from the ground up. If your OS scheduler is not designed correctly, it matters not what you do in your application: power consumption will be lousy!
Chris is right on. HW folks go nuts to save a few percentage points on power or performance, and then have an epiphany when they realize how often a quick decision by the SW team can make a 10x or more difference. I think this is mostly because the HW team lives and breathes this stuff, and is keenly aware of the implications of what they choose, but the SW team has a much harder time seeing the implications of some of their choices.
As powerful and capable hardware becomes cheaper, more standardised and easier to design in (and I believe all those things are true these days), software is increasingly the means by which product companies differentiate themselves and their products.
It is also the means by which you get the most value out of the hardware. You can build all the wacky features you like into the chips but, unless the software makes use of them (and makes _good_ use of them), you're simply wasting effort and silicon.
I heard of a case in the not-too-distant past where a major mobile device manufacturer achieved a 50% increase in battery life for their product simply by re-spinning the software. It may have been possible to achieve similar gains by re-spinning the chips, but I am willing to bet it would have been much harder and much more expensive.
You're absolutely right: software design, and efficient (hardware-sensitive) implementation of that design, is increasingly the most important factor in developing really great products.