Violating the conversion-ratio limits of a boost controller causes all sorts of problems. Consider interleaving boost power stages instead to increase the effective switching frequency.
Have you ever needed to provide a boosted non-isolated power supply output from a lower-voltage input? The boost converter is the traditional solution, but you need to be mindful of the control IC's limitations.
You may be motivated by cost and area considerations to push the power supply operating frequency as high as possible. However, efficiency concerns and controller considerations will limit how high a frequency you can use.
High-conversion-ratio boosts with high duty factors challenge controllers.
Just as buck power supply controllers have a minimum controllable on-time, boost controllers have a minimum controllable off-time. Boosts with wide conversion ratios can create issues when you violate these limits. Consider a boost converter operating in the continuous conduction mode, as shown above. Its duty cycle is D = 1 - VIN/VOUT.
Making some substitutions and solving for the maximum operating frequency based on the minimum controllable off-time gives fMAX = (1 - D)/tOFF(min) = VIN/(VOUT x tOFF(min)).
As an example, a boost that converts 24V to 140V requires a duty factor of 83 percent and an off-time duty factor of 17 percent. In this example, an LM5122 boost controller has a minimum controllable off-time of 750ns, which should be guard-banded by at least another 250ns. Doing the math, the switching frequency should be limited to about 170kHz.
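The frequency limit above can be checked with a short calculation. This is a minimal sketch of the article's math; the variable names are mine, and the values come straight from the example.

```python
# Maximum switching frequency set by the controller's minimum off-time,
# using the article's 24V-to-140V example with the LM5122.

V_IN = 24.0          # input voltage, V
V_OUT = 140.0        # output voltage, V
T_OFF_MIN = 750e-9   # LM5122 minimum controllable off-time, s
T_GUARD = 250e-9     # guard band recommended in the article, s

duty = 1.0 - V_IN / V_OUT              # CCM boost duty cycle, ~83%
off_fraction = 1.0 - duty              # equals V_IN / V_OUT, ~17%
f_max = off_fraction / (T_OFF_MIN + T_GUARD)

print(f"duty factor: {duty:.1%}")
print(f"max switching frequency: {f_max / 1e3:.0f} kHz")
```

Running this reproduces the roughly 170kHz limit quoted above; pushing the frequency higher would demand an off-time shorter than the controller can produce.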
Here is a schematic of a boost converter built to provide 140V at 2A from a 24V input.
Interleaving a boost extends its power capacity.
This design is interleaved: two power stages run 180 degrees out of phase. Circuitry in the top (master) controller balances the current between the two stages, setting each stage's input current with resistive sensing of the inductor current. The master controller also sets the clock phase and frequency for both stages and handles soft-start and faults.
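A quick budget shows why interleaving helps at this power level. The sketch below splits the input current between the two phases; the 90 percent efficiency figure is my assumption for illustration and is not stated in the article.

```python
# Rough per-phase input-current budget for the interleaved 140V/2A design.
# EFFICIENCY is an assumed value, not a figure from the article.

V_IN = 24.0        # input voltage, V
V_OUT = 140.0      # output voltage, V
I_OUT = 2.0        # output current, A
EFFICIENCY = 0.90  # assumed converter efficiency
N_PHASES = 2       # two stages, 180 degrees out of phase

p_out = V_OUT * I_OUT                  # 280 W delivered
p_in = p_out / EFFICIENCY              # input power with assumed losses
i_in_total = p_in / V_IN               # average input current, ~13 A
i_per_phase = i_in_total / N_PHASES    # each inductor carries ~6.5 A

print(f"total input current: {i_in_total:.1f} A")
print(f"per-phase current:   {i_per_phase:.1f} A")
```

Halving the current in each inductor and MOSFET eases component selection and spreads the heat, which is the main payoff of running two stages out of phase rather than one large stage.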