The history of time-keeping is one of striving for ever more precise increments of time. We started by measuring time in quarterly seasons; now we measure it in billionths of a second. Sometimes improvements in time-keeping drive human advancement; sometimes human advancement drives improvements in time-keeping. The two dance together, perhaps in the same way that electricity and magnetism do to form light.
Around 3000 B.C., the Egyptians first caught onto the idea of systematically measuring time, then with a quarterly calendar based on the movement of the stars above them. It was a practical pursuit: They watched for Sirius rising, which meant the Nile would soon flood and their lands would be ready for planting.
Clocks, or at least clock prototypes that mimicked the movements of the stars, were around as early as 1000, but clocks themselves didn't become widely used for another 500 years. It was religion, and then burgeoning municipalities, that first put them to use. As religious observance grew from practicing customs to following rules, religious leaders would have large bells rung to remind the strivingly pious exactly when to pray.
Towns became the major users of clocks, though. Country folk had roosters to wake them and lengthening shadows to remind them the day was coming to an end. But as villages grew into cities, the townsfolk pooled their taxes and bought town clocks. Town clocks allowed urban dwellers to break their day into smaller, serial chunks--one hour could be devoted to work, another to enjoying a beer with friends, and so on. They also reminded textile workers when to return to work. Clocks were regarded with the same fascination as computers are today.
More use drove greater accuracy. In 1300, clocks measured time only in hours. Then came clocks that measured time in quarter hours, minutes and, by 1500, seconds. By 1750, precision tool-making had taken hold, and watches became widespread. Watches democratized time, and once timekeeping devices became personal objects, the question arose of who had the "correct" time.
At first, each village kept its own time, and naturally, time varied from village to village. This was not considered an issue at first, but as long-distance transport of goods and people grew more prominent, a form of speedy synchronization was needed. People were anxious to know when coaches would arrive at their destinations. Departing ships needed deadlines for their feeder cargo to arrive.
The biggest driver for synchronized time was the emerging railroads. In the early 1800s, railroad companies printed schedules showing arrival and departure times according to local municipality times. This grew unmanageable, however, and by the 1830s, the British government mandated that all railroads use Greenwich Mean Time. The recently invented telegraph was then used to distribute GMT across the land, with an accuracy down to 1/100th of a second. Each British post office would get hourly updates on the time via telegraph. Later, BBC radio would broadcast the correct time across the land every 15 minutes. Now the Internet distributes time automatically via the Network Time Protocol (computers require time synchronization within 10 minutes to exchange data).
The precision of time-keeping continued to improve as the centuries passed. Indeed, the history of time-keeping can be seen as a history of reading successively more precise reference oscillations, whether mechanically produced or induced.
In 1714, the British offered a prize for a system that would establish longitude at sea--a difficult task, since it required a clock that could keep time across a roiling sea voyage. The prize required that a sea clock vary from proper time by no more than 3 seconds a day. This led to the chronometer, a more purely mechanical method of time-keeping, one based on counter-oscillating weighted beams (the earlier pendulum had issues with gravity).
By the 1970s, cheap quartz clocks had overtaken mechanical timepieces in accuracy. They are based on an oscillating electric field created when a small charge is applied to a crystal. And over the past 50 years, government labs the world over have been competing to build ever more precise atomic clocks, again by counting oscillations. These clocks are based on the changing state of the electrons of an ion--a single second is defined as 9,192,631,770 oscillations between two states. Now the measure of precision has shifted to how long a clock can keep accurate time. First-generation cesium clocks varied so little that they could keep the correct time for 80 million years. More recent aluminum/beryllium-based clocks could keep time accurately for a billion years.
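To put those accuracy figures in perspective, they can be translated into fractional frequency stabilities. A back-of-the-envelope sketch (assuming, as is conventional, that "keeping the correct time for 80 million years" means drifting by about one second over that span):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year in seconds

def fractional_stability(years: float) -> float:
    """Fractional frequency stability implied by a drift of one
    second accumulated over the given number of years."""
    return 1.0 / (years * SECONDS_PER_YEAR)

# One second in 80 million years is a few parts in 10**16 ...
print(fractional_stability(80e6))
# ... and one second in a billion years, a few parts in 10**17.
print(fractional_stability(1e9))
```

In other words, each generation of clock gains roughly an order of magnitude in stability.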
This achievement comes from an increase in the sampling rate, so to speak. Over the past 10 years, the femtosecond laser--which samples the oscillations 1,000 times more often per second than the picosecond lasers formerly used--has been used to interrogate the ions. Time measurement at the terahertz level is so sensitive that the elevation of the instrument now has to be taken into account to offset the effects of gravitation. Physicists predict that further refinements may butt up against the quantum laws of physics.
And just as the technology of time measurement marches forward, so too does the need for greater precision. Engineers complain of not being able to synchronize time between networked devices across vast distances at the levels of precision they now require. Today's Internet protocol for synchronizing far-flung clocks requires that the time be updated every 1,024 seconds. The group behind the protocol is working to increase this time between updates to 36 hours.
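For the curious, here is a minimal sketch (in Python, with helper names of my own invention) of the arithmetic an NTP client performs. The protocol stamps its 48-byte packets with seconds counted from 1900 plus a 32-bit binary fraction of a second, and the 1,024-second figure above comes from the protocol's poll field, which is an exponent of two:

```python
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_EPOCH_OFFSET = 2_208_988_800

def ntp_to_unix(seconds: int, fraction: int) -> float:
    """Convert a 64-bit NTP timestamp (whole seconds plus a 32-bit
    binary fraction of a second) into Unix time."""
    return seconds - NTP_EPOCH_OFFSET + fraction / 2**32

def transmit_timestamp(packet: bytes) -> float:
    """Pull the server's transmit timestamp out of a raw 48-byte
    NTP packet; it occupies the last 8 bytes, at offset 40."""
    seconds, fraction = struct.unpack("!II", packet[40:48])
    return ntp_to_unix(seconds, fraction)

# A synthetic packet whose transmit timestamp encodes exactly one
# billion seconds after the Unix epoch, with no fractional part.
packet = bytes(40) + struct.pack("!II", NTP_EPOCH_OFFSET + 1_000_000_000, 0)
print(transmit_timestamp(packet))  # 1000000000.0

# The poll field is an exponent: 2**10 = 1,024 seconds between updates.
print(2 ** 10)  # 1024
```

The 32-bit fraction is what lets the protocol express sub-second offsets at all; resolving them accurately over long network paths is the hard part the engineers are complaining about.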
At a conference I attended, one developer bemoaned that Linux could measure time only in microseconds, or millionths of a second. Microseconds are fine for most purposes, but the developer had written a program to search for bugs in applications and wanted to divide time into ever-finer slices to run more test cases of the program under scrutiny. He wanted Linux to cycle in nanoseconds, or billionths of a second.