"100 to 500 times more sensitive to light than a traditional silicon wafer"

Bullshit! Silicon solar panels already convert around 20% of the light that falls on them into electricity. If they were 100 times more efficient, they'd convert 2,000% of the light that falls on them (i.e., they'd need 20x more light than is actually available)!
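To spell the arithmetic out in Python (a throwaway sanity check using the numbers above, not anyone's real spec):

```python
panel_efficiency = 0.20    # typical silicon solar panel
claimed_multiplier = 100   # low end of the "100 to 500 times" claim

implied = panel_efficiency * claimed_multiplier
print(f"Implied conversion: {implied:.0%}")        # 2000% -- physically impossible
print(f"That's {implied:.0f}x the light that actually arrives")  # 20x
```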

Typical small-ish pixel CMOS image sensors have peak quantum efficiencies (QEs) of around 70% (for mono sensors) at around 600nm. That means 70% of the photons at that wavelength that arrive at the sensor get converted into a detectable electron. Again, increasing that by 100x would mean detecting 70x more photons than actually exist!
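Same sanity check in photon counts (the 1,000,000 figure is just an arbitrary example):

```python
quantum_efficiency = 0.70   # peak QE of a typical mono CMOS sensor
claimed_multiplier = 100

photons_in = 1_000_000                                   # photons hitting the sensor
electrons_now = photons_in * quantum_efficiency          # 700,000 detected today
electrons_claimed = electrons_now * claimed_multiplier   # 70,000,000 "detected"

print(f"Available photons: {photons_in:,}")
print(f"Claimed detections: {electrons_claimed:,.0f} "
      f"({electrons_claimed / photons_in:.0f}x more photons than exist)")
```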

Of course that's the peak QE, so there's room to enhance the QE at other wavelengths, but mathematically it just isn't possible to increase the sensitivity of a sensor within the visible range (the part of the spectrum you're generally interested in for conventional photography and video) by 100x or more: the best any sensor can do is detect every photon, which caps the improvement at 1/QE.
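To make that ceiling concrete, a tiny sketch (the 40% off-peak figure is purely an illustrative assumption):

```python
# A perfect sensor detects every photon (QE = 1.0), so the largest
# possible sensitivity gain over an existing sensor is 1/QE.
def max_sensitivity_gain(qe: float) -> float:
    return 1.0 / qe

for label, qe in [("visible peak (mono CMOS)", 0.70),
                  ("off-peak visible (illustrative)", 0.40)]:
    print(f"{label}: at most {max_sensitivity_gain(qe):.1f}x")
# visible peak (mono CMOS): at most 1.4x
# off-peak visible (illustrative): at most 2.5x
```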

There is room for a big sensitivity improvement in the near infrared, where only something like 1% of the light is converted to a signal. But IR is only of interest for security applications.
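And the same arithmetic shows why near-IR is the one place the headline number could even fit:

```python
# At ~1% QE, a perfect sensor (QE = 1.0) would be exactly a 100x
# improvement -- the low end of the claim, and the absolute physical limit.
nir_qe = 0.01   # rough near-IR figure from above
print(f"NIR headroom: up to {1.0 / nir_qe:.0f}x")
```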