As a sometime Test Engineer, I have dealt with AVS and similar schemes, and there are particular issues in both Design and Test that have to be dealt with. First, the regulator should be designed for efficiency with a view to the product's use scenario. If it spends 1% of its time pulling maximum voltage, it probably doesn't make sense to maximize efficiency there if that means a loss of efficiency the rest of the time. So a realistic use scenario is important to take best advantage of this technology. For many battery-operated devices, it is probably most important to be efficient during "idle" modes. Or maybe not.
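To make the use-scenario point concrete, here's a minimal sketch of scenario-weighted efficiency. The mode names, duty fractions, load powers, and efficiencies are all hypothetical numbers for illustration, not data from any real regulator:

```python
# Hypothetical use scenario: fraction of time, load power (W), and
# regulator efficiency in each operating mode.
modes = {
    "idle":   (0.90, 0.010, 0.70),  # dominates the time budget
    "active": (0.09, 0.500, 0.90),
    "peak":   (0.01, 2.000, 0.95),  # best efficiency, but only 1% of the time
}

def overall_efficiency(modes):
    # Time-weighted output power divided by time-weighted input power.
    p_out = sum(frac * p_load for frac, p_load, _ in modes.values())
    p_in = sum(frac * p_load / eff for frac, p_load, eff in modes.values())
    return p_out / p_in

print(overall_efficiency(modes))  # ~0.88 despite 95% peak efficiency
```

With these made-up numbers, the 95% peak-mode efficiency barely matters: overall efficiency lands near 88%, dragged down by the idle mode where the device actually spends its time.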
The other issue is in Test. When measuring power, we usually make a tacit assumption that the supply voltage is constant and only the current varies in the VxI=W calculation. That may be a reasonable incremental approximation in battery devices where Vbat changes slowly, but it is not valid for AVS, or wherever Rbat is significant; there the problem becomes non-linear and non-time-invariant, and special care has to be taken in data acquisition: multiplying Vavg by Iavg does not give Wavg. Since "averaging" is just low-pass filtering, the data acquisition system can't use low sample rates and aggressive anti-aliasing filtering; you have to sample fast, multiply V and I sample-by-sample, THEN filter and decimate. See http://www.eetimes.com/messages.asp?piddl_msgthreadid=45315&piddl_msgid=302942#msg_302942 for some discussion on this.
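A toy example makes the Vavg x Iavg trap obvious. The voltage and current samples below are invented values chosen only to show the effect: under AVS, voltage and current are correlated (high V when the load is heavy), so averaging before multiplying loses the correlation term:

```python
# Hypothetical AVS trace: two heavy-load samples at the raised rail
# voltage, two idle samples at the lowered one.
v = [1.2, 1.2, 0.8, 0.8]          # volts
i = [0.500, 0.500, 0.050, 0.050]  # amps

n = len(v)
# Correct order: multiply each sample pair, THEN average (low-pass).
w_true = sum(vk * ik for vk, ik in zip(v, i)) / n   # 0.32 W
# Wrong order: average each channel first, then multiply.
w_naive = (sum(v) / n) * (sum(i) / n)               # 1.0 * 0.275 = 0.275 W

print(w_true, w_naive)  # the naive figure underestimates by ~14%
```

The same arithmetic is what a slow-sampling, heavily anti-aliased acquisition chain does implicitly: the analog filters average V and I separately before any multiplication can happen, so the error is baked in before the data ever reaches the ADC.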