I'm assuming that the technology can easily be adapted for domestic US applications using 120 volt AC power (and our slightly higher 60 Hz line frequency can only further reduce flicker). This would be a real advance, since the bulk of electronics currently required for running off the mains makes packaging, cost, and heat dissipation significant issues for household replacement light bulbs.
Reading through the article I learned that the new chip offers a power factor of 98% and Total Harmonic Distortion (THD) of less than 18%, which seems good. How do these parameters compare for a conventional LED driver circuit? Beyond the reduced flicker, the compactness gained by eliminating the larger components (inductors and capacitors) would give this an advantage over conventional LED drivers.
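As a quick sanity check on how those two figures relate: for a nonlinear load, the textbook relation is PF = displacement factor × 1/√(1 + THD²). Assuming the current is roughly in phase with the voltage (unity displacement factor, which the article doesn't state but is plausible for a corrected driver), 18% THD alone would cap the power factor at about 98%, so the two quoted numbers are consistent with each other:

```python
import math

def distortion_pf(thd):
    """Power-factor ceiling imposed by harmonic distortion alone.

    thd is given as a fraction (0.18 for 18% THD); assumes a unity
    displacement factor, i.e. fundamental current in phase with voltage.
    """
    return 1.0 / math.sqrt(1.0 + thd ** 2)

# At the article's 18% THD figure, harmonics alone limit PF to ~0.984,
# which lines up with the quoted 98% power factor.
print(f"PF ceiling at 18% THD: {distortion_pf(0.18):.3f}")
```

This is just the standard relation applied to the article's numbers, not anything from the chip's datasheet; any displacement between voltage and current would lower the achievable figure further.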