The "not in my backyard" syndrome typically keeps these power stations away from highly populated areas where the excess heat could be used to heat buildings. Next best is efficient energy extraction at every stage in the system (from steam to hot water to cold water).
I agree with the "no free lunch" dictum; however, there is also no sense in not availing oneself of a low-cost lunch. Well aware that the intuitively obvious solution is not always viable, couldn't we take advantage of the convection air currents swirling upward through the cooling towers to capture energy via wind turbines? Recognizing that airflow reduction would lower cooling efficiency, is there a middle ground where some portion of the blatantly excess losses might be recaptured?
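A quick back-of-envelope estimate suggests why the middle ground is modest. Every number below (tower throat diameter, updraft speed, air density, capture fraction) is an illustrative assumption, not measured data; the only physics is the standard kinetic-power-in-an-airstream formula and the Betz limit on what any turbine can extract:

```python
# Rough estimate of power recoverable from a cooling-tower updraft.
# All input numbers are illustrative assumptions.
import math

rho = 1.0            # kg/m^3, warm moist air density (assumed)
diameter = 30.0      # m, tower throat diameter (assumed)
velocity = 5.0       # m/s, updraft speed (assumed)
betz_limit = 16 / 27 # theoretical max fraction any turbine can extract
capture = 0.25       # assumed practical fraction, well under Betz,
                     # to limit back-pressure on the cooling airflow

area = math.pi * (diameter / 2) ** 2               # throat cross-section, m^2
kinetic_power = 0.5 * rho * area * velocity ** 3   # power in the airstream, W
recovered = kinetic_power * capture                # W actually captured

print(f"Airstream kinetic power: {kinetic_power / 1e3:.0f} kW")
print(f"Recovered at 25% capture: {recovered / 1e3:.0f} kW")
```

Under these assumptions the whole airstream carries only a few tens of kilowatts, and a gentle 25% capture yields roughly 11 kW per tower, tiny next to a plant's hundreds-of-megawatts output. That doesn't rule out the idea, but it does suggest the updraft is a low-grade resource even before counting the cooling penalty.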
The conversion of waste heat to usable power is easier said than done because of the "no free lunch" clause of thermodynamics. Besides, the idea of using TEGs (silicon-germanium TEGs) to capture waste heat in generating stations is not new, and it has not met with much economic success in the past due to the difficulty of economically producing vast amounts of SiGe material (although there are dozens of other TE materials, most include one or more expensive or rare elements)...but I wish the professor good luck.
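The "no free lunch" point can be made concrete with the standard thermoelectric efficiency formula, which caps TEG conversion at a fraction of the Carnot limit set by the material's figure of merit ZT. The operating temperatures and the ZT value below are illustrative assumptions (SiGe is often quoted near ZT ~ 0.5-1 at high temperature):

```python
# Why TEG waste-heat recovery is a hard sell: even a decent ZT leaves
# conversion efficiency low. Temperatures and ZT below are assumptions.
import math

def teg_efficiency(t_hot, t_cold, zt):
    """Maximum TEG efficiency (standard formula) for average figure of merit ZT."""
    carnot = 1 - t_cold / t_hot
    m = math.sqrt(1 + zt)
    return carnot * (m - 1) / (m + t_cold / t_hot)

# Assumed SiGe-like operating point: 900 K hot side, 400 K cold side, ZT ~ 0.9
eta = teg_efficiency(900.0, 400.0, 0.9)
print(f"Estimated max conversion efficiency: {eta:.1%}")
```

With these numbers the Carnot limit is about 56%, but the TEG tops out near 11-12%, and real modules land below that, which is why the economics hinge so heavily on cheap material.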
I agree. The amount of wasted heat energy is staggering. A lot of our power generation wastes about 50% or more of the initial thermal energy. Even capturing a fraction of the waste would be a significant increase in power available.
Just a thought.
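The scale of that waste is easy to put in numbers. Taking an assumed 1 GW(e) plant at an assumed 40% thermal efficiency (many steam plants fall in the 33-45% range), even a modest 5% recapture of the rejected heat is a sizable addition:

```python
# Scale of the waste-heat opportunity for one large plant.
# All figures are illustrative assumptions.
electrical_output = 1000.0  # MW electric (assumed large plant)
thermal_efficiency = 0.40   # assumed; many steam plants run ~33-45%

fuel_heat = electrical_output / thermal_efficiency  # MW thermal input
waste_heat = fuel_heat - electrical_output          # MW rejected as heat
recapture_fraction = 0.05                           # assumed modest recovery
extra_power = waste_heat * recapture_fraction       # MW gained

print(f"Heat rejected: {waste_heat:.0f} MW")
print(f"5% recapture adds: {extra_power:.0f} MW")
```

Under these assumptions the plant rejects 1500 MW of heat, and a 5% recapture would add 75 MW, a 7.5% boost to output, which is why even fractional recovery keeps attracting attention.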