The iWatch may be big on the rumor mill, but as far as the average person is concerned, the clocks on our smartphones and music players work just fine. They may only denote time by the hour and minute, but hey, what value is the second, anyway? Does it mean anything if you’re 20 seconds late versus a full minute? If it were up to a team of Parisian researchers, then yes, even the millisecond matters. Which is why they’ve developed an atomic clock so accurate, they claim, that it could “redefine the second.”

What exactly does it mean to redefine the second? Don’t worry, our current reality hasn’t been a lie. Since 1967, the International Committee for Weights and Measures’ standard for measuring one second has involved firing microwave radiation at cesium atoms and counting their vibrations. The Paris Observatory’s newly designed optical lattice clock (OLC) instead uses laser beams to probe strontium atoms, which oscillate roughly 40,000 times faster than the atoms in microwave clocks. This new method lets scientists divide time into shorter intervals, measuring it at an even more precise level. The former standard loses a second every 100 million years, while the ultrastable OLC loses one only every 300 million years.

“Even an accuracy of a second in 300 million years still means a lag of about 0.01 of a nanosecond over the course of a day,” Paris Observatory’s Dr. Jerome Lodewyck tells National Geographic. “And that is not really so little when you think about fiberoptic communications and realize that a single telecommunications slot is 0.1 of a nanosecond.”
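Dr. Lodewyck's figure checks out with back-of-the-envelope arithmetic: spread one second of drift over 300 million years, then ask how much of it accumulates in a single day. A minimal sketch (variable names are ours):

```python
# A clock that drifts 1 second per 300 million years has a fractional
# error of 1 / (300e6 years in seconds).
SECONDS_PER_YEAR = 365.25 * 24 * 3600            # ~3.156e7 s
fractional_error = 1 / (300e6 * SECONDS_PER_YEAR)

seconds_per_day = 24 * 3600                      # 86,400 s
lag_per_day = seconds_per_day * fractional_error # drift accumulated per day

print(f"~{lag_per_day * 1e9:.4f} ns per day")    # ~0.0091 ns, i.e. ~0.01 ns
```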

A blink of an eye, or approximately 400 milliseconds, has been found to be too long for users to wait for a computer response after a mouse click or keyboard tap.

The same can be said for telecommunication services like Skype, Google Hangouts, and even regular phone calls. According to Integrated Research, we are receptive to vocal and visual responses that happen in less than 150 milliseconds. “Our brains can adjust to the rhythm and compensate for the pauses. But when latency is irregular – as in jitter, it’s harder to tolerate,” IR writes. “Network congestion and jitter can cause voice packets to arrive unevenly or out of sequence, disrupting the conversation and causing garbled sentences, choppy voice and dropped audio.” Remember: a millisecond is one 1,000th of a second, so anything above 0.15 second becomes distorted to our eyes and ears.

If the OLC can capture a more precise measurement of time, these lines get even thinner. Hypothetically, athletic records could be tracked down to individual oscillations, making world records that much more difficult to set (or beat). Even a runner’s sportswear could determine the point at which they hit the finish line. “The precision of these clocks is such that the length of the shoes of runners matters,” Dr. Lodewyck tells us. “It would matter so much in fact, that adding a 1000th of a single atom to the tip of the shoe would change the record by one tick of an optical clock.”

And if scientists can figure out how to shrink these atomic clocks to fit inside computers and mobile devices, hyper-accurate timekeeping could mean significant technological advancements. Data packets could be scheduled for transfer at more precise points in time, allowing telecommunication signals to be sent and received more efficiently. That could improve phone call quality, reducing spottiness and enhancing real-time responses. GPS would also benefit from hyper-accurate timekeeping, since mapping a device’s real-time location requires tight synchronization with satellites. All of this efficiency would also mean less energy usage and better gadget battery life.
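To see why GPS in particular cares about nanoseconds, note that a receiver turns timing into distance by multiplying by the speed of light, so every nanosecond of clock error becomes about 30 centimeters of position error. Illustrative arithmetic, not from the article:

```python
# Position error = clock error x speed of light.
C = 299_792_458  # speed of light, m/s

for clock_error_ns in (100, 10, 1):
    position_error_m = C * clock_error_ns * 1e-9
    print(f"{clock_error_ns} ns of clock error -> {position_error_m:.2f} m")
# 100 ns -> ~30 m; 10 ns -> ~3 m; 1 ns -> ~0.30 m
```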

Of course, applications like these could take decades to arrive, but the implications could set new standards for technological and experimental physics developments.
