It's not news that engineering design is largely about making tradeoffs. There is rarely a "free lunch" in this business: nearly every design decision has both positive and negative consequences. We balance speed, accuracy, power, reliability, footprint, part availability/sources, efficiency, and, of course, costs, to name a few critical parameters; I am sure you can add many to the list. For example, a 2-MHz DC/DC switching regulator has one set of general attributes, while a 4-MHz switcher will likely be smaller, but bring some drawbacks in available passive-component sources and costs (to name just one tradeoff pair).
But there's one area of power where I feel we have been able to get away, to a large extent, with something for almost nothing: the use of higher voltages to reduce basic resistive (IR) losses. Whether it's in a chassis, on a large PC board, or in a power-line distribution system, we know that using higher voltages is more efficient in terms of distribution losses. While this sometimes incurs the cost of stepping up the source voltage, in many cases there is no cost at all, since the source voltage is already much higher than the final AC or DC line. Even when you do have to do a step-up, it seems a modest cost in most cases. The whole idea reminds me of one of those tricks in numerical and quantitative analysis, where you can't work through a complex equation directly, so you add a slack variable or Lagrange multiplier, transform the problem into one you can solve, solve it, and then remove the added element. The higher-voltage approach, though, is not that kind of sleight of hand at all.
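The payoff here is quadratic: for a fixed delivered power, doubling the bus voltage halves the current and so cuts the I²R wiring loss by a factor of four. A quick sketch makes the scale of the win concrete (the 100-W load and 50-mΩ wiring resistance below are hypothetical numbers chosen only for illustration):

```python
# Sketch of the voltage-vs-resistive-loss tradeoff discussed above.
# The load power and wiring resistance are hypothetical example values.

def wiring_loss(power_w: float, volts: float, wire_ohms: float) -> float:
    """Power dissipated in the distribution wiring (I^2 * R)."""
    current = power_w / volts       # I = P / V for the same delivered power
    return current ** 2 * wire_ohms # P_loss = I^2 * R

P_LOAD = 100.0   # watts delivered to the load (assumed)
R_WIRE = 0.050   # ohms of round-trip wiring resistance (assumed)

for bus_v in (12.0, 24.0, 48.0):
    loss = wiring_loss(P_LOAD, bus_v, R_WIRE)
    print(f"{bus_v:>4.0f}-V bus: {loss:.3f} W lost in the wiring")
```

Each doubling of the bus voltage quarters the loss, which is why stepping up the distribution voltage so often pays for itself.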
When I first learned about the clever idea of using stepped-up, higher line voltage to reduce loss—mind you, this was way, way back, in the days of the dinosaurs, of course—it struck me as a pretty good idea. As I learned more about engineering design and the reality of tradeoffs we wrestle with, I have come to view it as an amazingly good and lucky situation, one of the few places where the laws of physics don't work against us—as they often do.♦