In 1905, Albert Einstein showed that the photoelectric effect—the ability of metals to produce an electric current when exposed to light—could be explained if light is quantized, traveling in discrete bundles of energy. His model, the photon theory, won him the Nobel Prize in 1921, but it left us with an enigma: why does the classical model of electric fields yield correct experimental results for some systems, yet fail so dramatically for the photoelectric effect? In other words, where does the quantum world begin and the classical world end?

By directing very intense light at a nanoscale needle-like tip, G. Herink, D. R. Solli, M. Gulde, and C. Ropers have bridged the gap between the quantum and classical views of the photoelectric effect. The sheer number of photons hitting the needle dwarfs the number of electrons involved, which ensures that individual photon interactions do not dominate. Instead, they created a quasi-classical system in which the bulk electric field of all the photons influences individual electrons. This result shows why the classical and quantum views are each correct in certain regimes, and hints at an entirely new way to manipulate electrons in nanoscale materials.

Einstein managed to explain the photoelectric effect by suggesting that light's energy is carried in discrete bundles—photons. The amount of energy of a photon is proportional to its frequency, and a photon can only liberate an electron from the metal when that frequency is high enough. You might expect that raising the intensity of the light (and thus the total energy) would be enough to liberate electrons and start a current flowing. But high-intensity light simply means there are more photons, and even a large number of photons is insufficient to generate a current if the individual ones don't have enough energy.
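This threshold condition is easy to check numerically. The sketch below uses the Planck relation (photon energy = h × frequency, or equivalently hc/wavelength) with the 800-nanometer light used in the experiment; the ~5.1 eV work function for gold is an assumed, commonly cited value, not a figure from the paper.

```python
# Planck relation: a photon's energy is h * f = h * c / wavelength.
# The 5.1 eV work function for gold is an assumed, commonly cited value.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light in vacuum, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in electronvolts."""
    return H * C / wavelength_m / EV

work_function_ev = 5.1                 # energy needed to free one electron from gold
e_photon = photon_energy_ev(800e-9)    # one 800 nm infrared photon

print(f"800 nm photon energy: {e_photon:.2f} eV")        # about 1.55 eV
print("one photon can free an electron:",
      e_photon >= work_function_ev)                      # False
```

No matter how many of these 1.55 eV photons arrive, in the single-photon picture none of them individually clears the ~5.1 eV barrier, which is exactly why intensity alone shouldn't matter.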

This is at odds with the pre-quantum view of electromagnetism, where it's the electric field of the light that accelerates electrons. Since the classical theory of electromagnetism can successfully explain many experiments, this raises a question: where does the transition between the field (classical) view and the photon (quantum) view occur?

Just as large numbers of atoms can be treated as a single object, the new experiment demonstrates that large numbers of photons behave as a continuous electric field. Herink et al. sent intense pulses of infrared light (with a wavelength of 800 nanometers) at a gold tip, sharpened to about 10 nanometers in diameter. On average, over a thousand photons interacted with a single electron, in contrast with the usual single-photon photoelectric model.

In addition, the intensity of the light pulse smoothed out the oscillatory ("quiver") electron motion commonly seen in strong-field experiments, making electrons behave more like they do when interacting with individual photons.

Due to the sheer number of photons involved, a lot of energy is transferred to each electron, enough to pop it loose and start a current flowing. Thus, with sufficient intensity, the classical behavior of light can produce the photoelectric effect.
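Back-of-the-envelope arithmetic shows the scale of this energy budget. The thousand-photon figure is the order of magnitude reported above; the ~5.1 eV work function for gold is an assumed, commonly cited value used here only for comparison.

```python
# How much energy does the bulk field make available per electron when
# ~1000 photons act on it at once? (Work function value is an assumption.)

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light in vacuum, m/s
EV = 1.602e-19  # joules per electronvolt

photon_energy = H * C / 800e-9 / EV    # one 800 nm photon, ~1.55 eV
photons_per_electron = 1000            # order of magnitude from the experiment
work_function_ev = 5.1                 # assumed value for gold

total = photons_per_electron * photon_energy
print(f"energy available per electron: ~{total:.0f} eV")
print(f"ratio to the work function: ~{total / work_function_ev:.0f}x")
```

One photon falls short by a factor of three, but the collective field delivers hundreds of times more energy than is needed to free an electron, which is why the classical, field-driven picture takes over in this regime.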

Ordinarily, an interaction this energetic would belong to the relativistic regime, but because the electron lives in a nanoscale needle, it is liberated with far less fuss. The researchers note that, while such high energies and intensities often wreck the material being studied, in this case no harm was done to the gold needle. That raises the prospect of a new way to control electron dynamics in nanomaterials.

Since both the nanoscale geometry of the tip and the sheer intensity of the light were required to create a quasi-classical photoelectric effect, Herink et al. demonstrated both where the classical model of electric fields is valid and why it's so challenging to produce photoemission that way.

Nature, 2012. DOI: 10.1038/nature10878 (About DOIs).