I like the idea of trust-based security. But at some point I still have to be able to say that I trust a program.
Under something like Bit9's solution, if I find a new piece of software (say, something open source), do I have to wait until someone on their end whitelists it before I can run it?
As for self-signed code, all they would have to do is establish a way for clients to obtain temporary certificates that their system would accept.
Still, it all comes down to who has the final say in what is trusted: me, my OS, my hardware, or my security software.
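The mechanics behind this kind of product are usually described as hash-based whitelisting: the vendor keeps a database of cryptographic hashes of known-good executables, and anything not in the database is blocked until someone vouches for it. A minimal sketch (the whitelist contents and program bytes here are made up for illustration):

```python
import hashlib

# Hypothetical whitelist: SHA-256 digests of executables someone has
# already vouched for. A real product would ship and update this
# database centrally.
WHITELIST = {
    hashlib.sha256(b"trusted program bytes").hexdigest(),
}

def may_run(executable_bytes: bytes) -> bool:
    """Allow execution only if these exact bytes are whitelisted."""
    return hashlib.sha256(executable_bytes).hexdigest() in WHITELIST

assert may_run(b"trusted program bytes")          # known: allowed
assert not may_run(b"brand new open-source tool") # unknown: blocked
```

This also makes the waiting problem concrete: a freshly downloaded open-source build has a hash nobody has seen yet, so it stays blocked until the vendor (or a local admin, if the product allows it) adds the hash.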
So there's been this concept of signed code: executables are signed by their creator, who vouches for their safety, and the OS checks that the creator is who they claim to be and that the executable has not been modified. Microsoft implemented this because they had a horrible problem with third-party software and drivers; they have required it for drivers for years, but didn't make it mandatory for user executables. Bit9 could use this infrastructure by re-signing the executables they deem safe; I suspect that they instead built their own implementation. Unfortunately the article doesn't mention which platforms are covered: I assume Wintel PCs, but they could also be targeting smartphones.
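The two checks the OS performs can be sketched in a few lines. This is a toy model only: real schemes such as Microsoft's Authenticode use public-key certificates and a chain of trust, whereas here an HMAC key stands in for the publisher's signing key purely to show the sign/verify round trip. All names and byte strings are hypothetical.

```python
import hashlib
import hmac

def sign_executable(data: bytes, signing_key: bytes) -> bytes:
    """Publisher side: vouch for the bytes by signing their hash.
    (Toy stand-in: real code signing uses an asymmetric key pair.)"""
    digest = hashlib.sha256(data).digest()
    return hmac.new(signing_key, digest, hashlib.sha256).digest()

def verify_executable(data: bytes, signature: bytes, signing_key: bytes) -> bool:
    """OS side: re-hash the bytes and check the signature.
    Any modification to the executable changes the hash, so the check fails."""
    expected = hmac.new(signing_key, hashlib.sha256(data).digest(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

key = b"publisher-signing-key"       # hypothetical key material
exe = b"\x7fELF original binary"     # stand-in for executable bytes
sig = sign_executable(exe, key)

assert verify_executable(exe, sig, key)                 # untouched: accepted
assert not verify_executable(exe + b"patch", sig, key)  # modified: rejected
```

A re-signing scheme like the one suggested above would just add a second signature from the whitelisting vendor on top of (or instead of) the publisher's.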
Signed code is coming our way: the new EFI BIOS requires signing of BIOS images and of the boot loader; this is required by the new Windows 8 hardware spec from Microsoft. I am apprehensive about whether this is a good idea all the way through: it essentially hands control over what software one can install and run to the signing entities. I hope that all such schemes allow self-signing of home-made executables.