My recollection is that the LED chip controllers / circuits also develop considerable heat which explains the need for cooling fins. I recall that being an issue when we tried to develop some LED lighting systems in the lab for machine vision systems. Am I missing something?
The item does not mention any work done to show the energy came from IR rather than visible. If you put your hand in front of 100W of white light, it will get hot. What was done to show that the burning sensation was caused by IR rather than by visible energy?
I have been working with COB LEDs of up to 1200 watts and Ed's observations are absolutely correct. And don't forget that 1/4 to 1/3 of all the heat comes from the down-conversion process of the phosphors floating in silicone. A bare blue-die COB placed in close proximity to a plastic reflector will melt it; you have to use a metal reflector of some sort for the materials to withstand the absorbed IR. A FLIR camera gives you an idea of what you are dealing with, but there are many sources of heat in the system and they are all co-located.
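The down-conversion heat mentioned above comes from the Stokes shift: each blue pump photon that the phosphor re-emits at a longer wavelength gives up the energy difference as heat in the phosphor layer. A minimal sketch, with assumed illustrative wavelengths (450 nm blue pump, 570 nm representative phosphor emission):

```python
# Stokes-shift loss: the fractional energy lost when a pump photon is
# down-converted to a longer-wavelength emitted photon. Since photon
# energy scales as 1/wavelength, the lost fraction is 1 - pump/emit.
pump_nm = 450.0   # typical blue pump wavelength (assumed)
emit_nm = 570.0   # representative phosphor emission wavelength (assumed)

stokes_loss = 1.0 - pump_nm / emit_nm
print(f"{stokes_loss:.0%} of each converted photon's energy becomes heat")
# -> 21% of each converted photon's energy becomes heat
```

A ~20% per-photon loss, applied to the converted portion of the output, is consistent with the "1/4 to 1/3 of all the heat" figure once junction losses are counted alongside it.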
Last I checked, E = hν, where E is energy, h is Planck's constant and ν is the frequency of the photon. This means that all electromagnetic energy, including visible light, carries energy, and IR actually carries less energy than visible light because it has a lower frequency.
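The point about E = hν can be made concrete by comparing per-photon energies at a visible and a near-IR wavelength (values below are standard physical constants; the two wavelengths are chosen only for illustration):

```python
# Photon energy E = h * nu, with nu = c / wavelength.
h = 6.626e-34   # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon, in electron-volts."""
    nu = c / (wavelength_nm * 1e-9)
    return h * nu / eV

print(photon_energy_ev(550))   # green visible light, ~2.26 eV
print(photon_energy_ev(1000))  # near-IR, ~1.24 eV
```

The IR photon carries roughly half the energy of the visible one, which is exactly the commenter's point: heating power depends on how many watts land on the target, not on whether those watts are IR.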
Want proof? Buy an IR absorbing filter and put it in front of a Fresnel lens; take the combination outside on a sunny day and catch a leaf on fire.
I've worked around Metal Halide arc lamps for projectors and we always had to be careful of getting anything in the way of the beam when it was near focus, even with an IR filter.
Or how about lasers? By definition they emit only one wavelength, but you can buy visible lasers with which you can solder. (Please don't look at the spot!)
This problem is nothing new to optical engineers who have worked in illumination. Highly concentrated visible light can be dangerous.
There's something silly about this "new discovery." If the LED draws 25 W of power, then the total it emits, as light plus heat, can be no more than 25 W, minus whatever is lost in the power supply electronics. So, no matter how you twist the words, the total heat this 25 W LED generates will be no more than the total heat created by a 25-watt incandescent bulb, although the incandescent bulb generates far less visible light and far more IR radiation. (With today's LEDs, a 25 W LED should generate the visible-light equivalent of a 125 W incandescent, give or take.)
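The "25 W LED ≈ 125 W incandescent" parenthetical checks out with ballpark luminous efficacies (the efficacy figures below are typical round numbers, not measurements of any particular lamp):

```python
# Rough lumen comparison behind the 25 W LED vs 125 W incandescent claim.
led_w, led_lm_per_w = 25.0, 100.0        # modern white LED, ballpark efficacy
incand_w, incand_lm_per_w = 125.0, 17.0  # incandescent, ballpark efficacy

led_lumens = led_w * led_lm_per_w        # 2500 lm
incand_lumens = incand_w * incand_lm_per_w  # 2125 lm
print(led_lumens, incand_lumens)
```

Comparable light output from one-fifth the input power, which is why the remaining heat budget, however it leaves the package, is so much smaller for the LED.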
You can't get something for nothing. That includes heat.
LEDs all produce heat at the PN junction, and those who use them in lighting products understand this well and dissipate the heat through a variety of strategies. The heat Ed discovered has nothing to do with the PN junction. His paper claims that the forward-radiated heat amounts to 8-9% of the input power of the LED.
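To put the claimed fraction in perspective, here is a hedged budget sketch for a 25 W input, taking the midpoint of the paper's 8-9% forward-radiated figure; the light-output fraction is an assumption for illustration, not from the paper:

```python
# Splitting a 25 W electrical input into light, forward-radiated heat
# (per the paper's claimed 8-9%), and heat conducted at the junction.
input_w = 25.0
light_fraction = 0.35          # assumed wall-plug efficiency
forward_heat_fraction = 0.085  # midpoint of the claimed 8-9%

forward_heat_w = input_w * forward_heat_fraction   # radiated out the front
junction_heat_w = input_w * (1 - light_fraction) - forward_heat_w
print(forward_heat_w, junction_heat_w)  # ~2.1 W forward, ~14.1 W at the junction
```

Roughly 2 W radiated forward is small next to the junction load, but concentrated onto a nearby optic it is plenty to damage plastics, which matches the melted-phosphor-sheet account below.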
I thank all of you for your comments. What is clear to me is that there are technical folks out there with far more experience and knowledge than I have relative to the physics of high-power visible and non-visible radiation sources and the heating they can surprisingly create. As one of you essentially stated: "So what's the big deal? Many of us have known these things for years from more advanced radiated light sources."
But I work in the world of practical commercial LED lighting technologies, spanning the interrelated disciplines of LED chip mechanisms, phosphors, LED drivers, heat sinking, optics, etc., and I can say with certainty, as Keith Dawson alluded to, that 99% of "working stiffs" in the LED world accept as fact that 100% of the meaningful heat associated with LEDs comes from the PN junction (I've proven that Stokes-effect heating is trivial), and that the appropriate transfer of that heat is the one and only "thermal issue" in how power LED arrays, lamps, and fixtures are designed. As the former founder/CEO of both a power semiconductor company and a switching power supply company, I am acutely aware of the thermal relationships.
My major (and perhaps only) point here is that 99% of all the folks working in the mainstream LED lighting world are totally unaware of (or unwilling to talk about) this emission-side heat, which can be as much as 8% of the total power applied. It behooves any maker of luminaires using arrays rated over 25 watts to be aware of its effects on product performance and reliability. I came upon this issue because this supposedly nonexistent heat melted the remote phosphor sheet I had placed over (about .060 away from) a blue LED array whose substrate temperature was only 37 C.
I think all of you might agree this was not "imaginary." That's what started me on this little journey.
What is astounding is the lack of information on this from the leading makers of high-power LED arrays, whether white or single-color.
I visited Luminus Devices in 2012 and still have two of their LED modules, blue and red. They get rather hot even when powered with just a 9 V battery. They need to be water cooled.
"Test stations also need fixtures, and Luminus engineers must design fixtures that cool the LEDs' 150 W of heat. "That's as much heat as in a professional-grade soldering iron," said Joffe. Furthermore, the LEDs must be tested under consistent current and temperature conditions. The test fixtures are water cooled, and the engineers have designed mechanical fixtures that provide consistent contact with heat sinks."