The "not in my backyard" syndrome typically keeps these power stations away from highly populated areas where the excess heat could be used to heat buildings. Next best is efficient energy extraction at every stage in the system (from steam to hot water to cold water).
I agree with the "no free lunch" dictum; however, there is also no sense in not availing oneself of a low-cost lunch. Well aware that the intuitively obvious solution is not always viable, couldn't we take advantage of the convection currents swirling upward through the cooling towers to capture energy via wind turbines? Recognizing that reduced airflow would lower cooling efficiency, is there a middle ground where some portion of the blatantly excess losses might be recaptured?
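For scale, a rough estimate of the kinetic power in a cooling-tower updraft, using the standard wind-power relation P = ½ρAv³ and the Betz limit. The tower diameter, air density, and updraft speed below are all assumed illustrative values, not figures from the article:

```python
import math

# Hypothetical illustration: kinetic power available in a cooling-tower
# updraft. All input values are assumptions for the sake of scale.
rho = 1.0          # kg/m^3, warm moist air (assumed)
diameter = 30.0    # m, tower throat diameter (assumed)
v = 5.0            # m/s, updraft speed (assumed)

area = math.pi * (diameter / 2) ** 2           # throat cross-section, m^2
p_kinetic = 0.5 * rho * area * v ** 3          # kinetic power in the flow, W
betz_limit = 16 / 27                           # max fraction any turbine can extract
p_max_turbine = p_kinetic * betz_limit         # W

print(f"Kinetic power in updraft: {p_kinetic / 1e3:.0f} kW")
print(f"Betz-limited turbine max: {p_max_turbine / 1e3:.0f} kW")
```

Under these assumptions the whole updraft carries only a few tens of kilowatts, against a cooling load measured in hundreds of megawatts, which suggests why the "middle ground" is so narrow even before the lost cooling efficiency is counted.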
The conversion of waste heat to usable power is easier said than done because of the "no free lunch" clause of thermodynamics. Besides, the idea of using TEGs (silicon-germanium TEGs) to capture waste heat in generating stations is not new, and it has not met with much economic success in the past due to the difficulty of economically producing vast amounts of SiGe material (although there are dozens of other TE materials, most include one or more expensive or rare elements)...but I wish the professor good luck.
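To put numbers on why TEG recovery is hard, here is the standard figure-of-merit efficiency formula for a thermoelectric generator. The ZT value and the hot/cold temperatures are assumed, ballpark figures for SiGe, not data from the article:

```python
import math

# Back-of-the-envelope TEG efficiency from the standard ZT formula:
#   eta = (dT / T_h) * (sqrt(1 + ZT) - 1) / (sqrt(1 + ZT) + T_c / T_h)
# All inputs below are illustrative assumptions.
T_h = 800.0   # K, hot-side temperature (assumed)
T_c = 400.0   # K, cold-side temperature (assumed)
ZT = 0.9      # dimensionless figure of merit, roughly SiGe-class (assumed)

carnot = (T_h - T_c) / T_h          # ideal Carnot limit
root = math.sqrt(1 + ZT)
eta_teg = carnot * (root - 1) / (root + T_c / T_h)

print(f"Carnot limit:   {carnot:.1%}")
print(f"TEG efficiency: {eta_teg:.1%}")
```

With these assumptions the TEG converts only about a fifth of what even an ideal Carnot engine would, which is part of why the economics have been unforgiving.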
I agree. The amount of wasted heat energy is staggering. Much of our power generation wastes 50% or more of the initial thermal energy. Even capturing a fraction of that waste would be a significant increase in available power.
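The arithmetic behind "50% or more" is quick to sketch; the efficiency, plant size, and recovery fraction below are assumed round numbers, not measured plant data:

```python
# Rough arithmetic: a plant at 35% thermal efficiency rejects roughly
# 1.86 units of heat for every unit of electricity it delivers.
# All inputs are illustrative assumptions.
eta = 0.35                 # plant thermal efficiency (assumed)
p_electric = 1000.0        # MW of electric output (assumed)

p_thermal = p_electric / eta        # total heat input, MW
p_waste = p_thermal - p_electric    # heat rejected to the environment, MW

recovered_fraction = 0.05           # suppose 5% of the waste is captured (assumed)
p_recovered = p_waste * recovered_fraction

print(f"Heat input: {p_thermal:.0f} MW")
print(f"Waste heat: {p_waste:.0f} MW")
print(f"Recovered:  {p_recovered:.0f} MW")
```

Even the modest 5% capture assumed here would add on the order of 90 MW, roughly a tenth of the plant's electric output.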
Just a thought.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.