
Badastronomy confusion over leap seconds and slowing rotation of the earth?

I was just reading Phil Plait's explanation of tidal locking and why the moon is receding from the earth. It's very good, but when he gets into leap seconds he says this:

"However, the Earth's rotation is decelerating at a rate of about 0.002 seconds per day per century. It's been about a century since the atomic clocks' standard time, so the Earth is slowing relative to an atomic clock by about 0.002 seconds per day, or about 0.7 seconds per year. Note that this does not mean the Earth is actually slowing its rotation by that amount; it means that a clock set by the rotating Earth loses time at that rate relative to an atomic clock."

I'm very confused by the statement that "this does not mean the Earth is actually slowing its rotation by that amount," because surely it DOES mean that, right?

Update:

On a Naval Observatory page he links to, I find this:

"Confusion sometimes arises over the misconception that the occasional insertion of leap seconds every few years indicates that the Earth should stop rotating within a few millennia. The confusion arises because some mistake leap seconds as a measure of the rate at which the Earth is slowing."

I don't get this, either. Where would people get the idea that a leap second added every year or so would halt the earth's rotation in a few thousand years? What am I missing?

http://www.badastronomy.com/bad/misc/tides.html

Update 2:

I'm still confused about where the confusion is. If you keep adding leap seconds as the days get gradually longer, wouldn't it take an infinite amount of time to reduce the earth's rotation rate to zero? I still don't see what the incorrect thinking really is. Where are they getting a figure of a few thousand years from?

Update 3:

OK -- I am finally catching on. The most important thing is to realize that even a clock that is running at a *constant* rate will drift out of sync and "lose time" relative to another one running at a slightly faster constant rate.

So the confusion is coming from mistaking the "drift apart" rate for a changing rate of the slower clock. I was indeed making this mistake at first. And it must be the same error Phil Plait made that he says he edited out of his treatment. I'll try to encourage him to put a fuller treatment back in so others can understand this pitfall better.
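
To convince myself, here's a tiny sketch (with made-up but representative numbers) of two clocks running at constant rates: the slower one never decelerates, yet the gap between them grows steadily.

```python
# Sketch: two clocks with *constant* but slightly different rates.
# The slower one never changes its rate; only the gap between them grows.
ATOMIC_RATE = 1.0          # seconds ticked per true second
EARTH_RATE = 1.0 - 2.3e-8  # roughly 0.002 s/day slow (made-up but representative)

gap = 0.0
for day in range(365):
    gap += 86400 * (ATOMIC_RATE - EARTH_RATE)   # grows linearly, ~0.002 s/day
print(f"gap after one year: {gap:.2f} s")        # ~0.7 s, and neither rate changed
```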

Thanks, everyone!

7 Answers

  • Dr Bob
    1 decade ago
    Favorite Answer

    There are a lot of answers here -- some good, some not.

    One of the fundamental problems of timekeeping is that the earth's rotation is not uniform. In the long run, the earth gradually slows down; but there are larger short-term irregularities. For a long time, the second was defined as a certain fraction of the length of the year 1900 (based on the equinoxes). In 1967, scientists came up with a physical definition of the second (based on a certain transition of the element cesium) that matched, as well as possible, the definition based on 1900; and we have been using that definition ever since.

    Nominally, a day contains 86400 seconds (= 24*60*60). The earth is now rotating a bit more slowly than it was in 1900. Even if the rotation stayed fixed from now on, the mean solar day would not be 86400 seconds long, but about 86400.0004 seconds (its current length, in terms of the atomic definition of the second). Thus, we would have to add a leap second roughly every 2500 days (6.8 years) to keep Coordinated Universal Time (which uses atomic seconds) in sync with the rotation of the earth. Adding these leap seconds periodically does not mean per se that the earth is slowing down; it simply means that the atomically defined second is not exactly 1/86400 of a mean solar day.
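
    A quick sketch of that arithmetic (using the 86400.0004-second figure quoted in this answer, not an independently measured value): how often does a full second of drift accumulate if the day runs 0.0004 atomic seconds long?

    ```python
    # Minimal sketch: leap-second interval implied by a fixed excess length of day.
    excess_per_day = 0.0004                      # extra atomic seconds per mean solar day (figure quoted above)
    days_per_leap_second = 1.0 / excess_per_day  # days needed to accumulate 1 s of drift
    print(days_per_leap_second)                  # 2500.0 days
    print(days_per_leap_second / 365.25)         # ~6.8 years
    ```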

    But the earth's rotation speed continues to change. In the long run it slows down, but this does not happen at a steady pace. The irregularity of this process means that leap seconds are not added at a regular interval, but rather whenever observations show that the two systems of timekeeping (atomic time vs. time based on the earth's rotation) have drifted apart by about a second.

    As the earth continues to slow down in the long term, we will have to add leap seconds more and more often, and that will make time calculations more difficult in the future. For this reason, some scientists have suggested that we replace the leap-second system with something else. (This is an ongoing debate at the present time.)

    Note that leap seconds have absolutely nothing to do with leap years. Leap seconds come about because the atomic second doesn't currently match 1/86400 of the mean solar day, and because the earth's rotation continues to change irregularly (with a long-term slowing trend). Leap years come about because the number of days in a year is not a whole number. Even if the earth's rotation were perfectly uniform, we would still have the problem of leap years. The earth's slowing rotation will, however, eventually change the number of days in the year enough that we will have to modify the calendar system (but that won't happen for a long time).

    Now I'll discuss the common misconception people have. As I said, a leap second occurs for a combination of the following two reasons:

    1) The atomic second and the astronomically defined second do not match.

    2) The earth's rotation speed continues to change.

    Even if only #1 were true, we would have leap seconds. (And if #2 alone were true at first, then #1 would also soon become true.)

    Some people, however, incorrectly interpret a leap second to mean that the earth's daily rotation has changed by a full second. (In reality, the one-second discrepancy has accumulated over a few years.) Starting from this false premise, they jump to the conclusion that after we've added tens of thousands of leap seconds, it will mean that the earth's rotation will have stopped. This is a completely erroneous interpretation.

    In the long run, the length of the day is increasing very slowly -- about 0.002 seconds per century (the same figure quoted in your question). You might think that this would eventually bring the rotation almost to a stop, but the rate of slowing itself diminishes over time. Eventually (after billions of years!), the earth and moon will become tidally locked. At that point, the sidereal rotation period of the earth will be fixed at about 47 of our current days (and this will also be the sidereal period of the moon's orbit). When that happens, not only will the same side of the moon always face the earth, but the same side of the earth will always face the moon.
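
    To make the bookkeeping concrete, here is a rough sketch assuming a steady 0.002 s/century growth in the length of the day (a simplified constant trend, not a forecast). The accumulated clock drift -- and hence the number of leap seconds needed -- grows roughly quadratically, while the day itself barely changes.

    ```python
    # Sketch: accumulated drift vs. change in day length, under a constant
    # 0.002 s/century lengthening of the day (simplifying assumption).
    DAYS_PER_CENTURY = 36525
    LOD_GROWTH = 0.002   # seconds added to the length of the day, per century

    for centuries in (1, 10, 100):
        drift = 0.5 * LOD_GROWTH * DAYS_PER_CENTURY * centuries**2  # ~ (1/2) a t^2
        day_length = 86400 + LOD_GROWTH * centuries
        print(f"{centuries:>3} centuries: day = {day_length:.3f} s, "
              f"accumulated drift ~ {drift:,.0f} s")
    ```

    After 10,000 years the sketch calls for hundreds of thousands of leap seconds, yet the day has lengthened by only a fraction of a second.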

    Hope that helps. Add some more notes in your question or email me if there's any confusion.

    -- edit -- comment on your note

    You're right that even if each leap second meant that the earth's rotation has slowed down by a second (and it doesn't mean that), it would take an infinitely long time for the rotation to stop. So whoever reaches that conclusion would be piling one misconception on top of another. I've never actually heard this combination of misconceptions, so I don't think I can explain the reasoning behind it.

    Let me just add one more thing to contrast leap seconds with leap years. The purpose of leap seconds is to make sure that the sun reaches its highest point at noon (on the average, after allowing for time zones, day-to-day variations related to the earth's elliptical orbit, etc.); in other words, it prevents the time of the sun's highest point from drifting in time from year to year. The purpose of leap years is to make sure that the vernal equinox always occurs on about the same day each year; that is, it prevents the vernal equinox (or any other equinox or solstice) from drifting through the calendar year. Leap seconds keep the time of day in sync with the sun, and leap years keep the calendar in sync with the seasons. (Nature didn't make it easy! It's our job to come up with a scheme to adapt to whatever nature has given us.)

    By the way, you can see that timekeeping is a messy subject, and there's quite a bit more to it than has been discussed here. Fortunately, most people in their everyday lives are not concerned with time to the nearest second, and most people don't adjust their watches when there's a leap second. (But today's "atomic" clocks and wristwatches do that automatically.)

    Source(s): (I'm an astronomer too, whatever that's worth.)
  • 1 decade ago

    No, the difference in the two time standards is due to the cumulative slowing. If the slowing ceased, we would still need to add those leap seconds regularly. It's like having a clock that's slow - if it loses five minutes a day, it can continue to do so indefinitely. By some innumerate arguments, you would expect it to slow to a dead stop in less than a year.

    The first number stated is the actual rate of slowing - 0.002 seconds per day per century. That's about 0.0036 seconds per year, not 0.7. That's a long-term average for the current era. It is in fact faster now than it was in the distant past because the ocean tides are close to resonance and are therefore absorbing more energy. There are also a variety of factors that can speed or slow the Earth's rotation in the short term, so the slowing is not constant.
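
    A tiny sketch of the slow-clock analogy above (five minutes a day is just the illustrative figure used here): the clock's rate never changes, only its accumulated error grows.

    ```python
    # Sketch: a clock that loses a fixed 5 minutes per day never "grinds to a halt";
    # it keeps the same constant rate while its total error grows linearly.
    LOSS_PER_DAY = 5 * 60   # seconds lost each day (illustrative figure from the answer)

    for day in (1, 30, 365):
        print(f"day {day:>3}: total error = {day * LOSS_PER_DAY:>6} s, "
              f"clock still ticks {86400 - LOSS_PER_DAY} s per real day")
    ```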

    As for your second question, about the leap seconds: I have seen some very confused arguments from people who think that, if we need to add a leap second every other year, the length of the year (or maybe even the *day*) must be increasing by that amount every year, from which they linearly extrapolate a time when the rotation rate will go to zero.

  • 1 decade ago

    The definition of a "second" was set so that there are 86,400 seconds in a day -- back around the year 1840! That is the average date of the observations that were used to define the second (i.e., roughly 1790 to the 1880s). Since then, the Earth has slowed down a tiny bit for a while, and sped up for a while. Since 1972, there have been 23 leap seconds, over a period of 13,149 days. That means that ON AVERAGE, over the last 36 years, the days have been longer than 86,400 seconds by 0.00175 seconds. That extra part is called the "excess length of day", and it adds up to about 0.6 or 0.7 seconds of drift per year.
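
    A quick check of that arithmetic (using the 23-leap-second and 13,149-day figures quoted above):

    ```python
    # Sketch: average excess length of day implied by the leap-second count above.
    leap_seconds = 23
    days = 13149
    excess_lod = leap_seconds / days         # average excess seconds per day
    print(round(excess_lod, 5))              # ~0.00175 s per day
    print(round(excess_lod * 365.25, 2))     # ~0.64 s of drift per year
    ```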

    Now, we might suppose that the Earth could continue at that rate forever, with each day being 86,400.00175 seconds (on average over several decades). That would be zero deceleration. But in fact, the Earth is decelerating. In about 100 years, we can expect the day to be 86,400.00375 seconds long. Then leap seconds will have to be put in about every 267 days (on average). And 1000 years from now, the day will be 86,400.02175 seconds long, and leap seconds will be needed every 46 days (as always, on average). (The detailed situation is a bit more complicated.) Some people are worried about this -- see the second link below.
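
    Here is a rough sketch of that projection, assuming this answer's simplified trend of a 0.002 s/century increase added to the current 0.00175 s/day excess:

    ```python
    # Sketch: projected leap-second spacing under a constant 0.002 s/century trend
    # added to the current 0.00175 s/day excess (simplifying assumptions from above).
    current_excess = 0.00175   # seconds of excess per day today
    growth = 0.002             # additional seconds per day, per century

    for centuries in (0, 1, 10):
        excess = current_excess + growth * centuries
        print(f"+{centuries:>2} centuries: day = 86400 + {excess:.5f} s, "
              f"leap second roughly every {1 / excess:.0f} days")
    ```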

    There is a graph of the "excess" length of day at the first link. Look for the graph labeled "BullA LODS".

    For a more technical paper, try the third link.

  • 1 decade ago

    Ok, this is a source of some confusion, but basically here's the scoop:

    There are two systems of time: Universal Time (UT1) and Coordinated Universal Time (UTC). One day in UT1 is measured by one rotation of the Earth with respect to the mean position of the Sun. One day in UTC is measured by 24 hours passing on an atomic clock.

    Now, when the leap-second system started on January 1st, 1972, UTC was in line with UT1, and the UTC day -- exactly 86,400 atomic seconds -- was at that time very nearly equal to one Earth rotation relative to the Sun. However, the Earth is very slightly slowing down its rotation due to the moon's tidal force. Specifically, it slows at a rate of 0.002 seconds per day per century.

    So, in 2072, the UT1 day will be longer by .002 seconds than the 1972 day (and by transitivity, .002 seconds longer than the UTC day, too). In 2172, a UT1 day will be .004 seconds longer than the UTC day...in 2272, the UT1 day will be .006 seconds longer, and so on.

    Now, here's where the confusion comes in: Even though the *rate* of change of the UT1 day is minuscule, it adds up to a large *total difference* in time between UT1 and UTC over the long haul. Note the important distinction between those two: rate vs. total difference.

    In other words, say that right now in 2008, the UT1 day is now .001 seconds longer than the UTC day (it's not quite that much yet, but it'll make the math easier). Let's say that the "time folks" just inserted a leap second so that right now the UT1 and UTC times are perfectly synchronized. Tomorrow, they'll be .001 seconds different. The day after tomorrow they'll be .002 seconds different, the next day they'll be .003 seconds different, and so on. It won't take very long (about 1.5 years) until they're more than half a second off, and they'll have to insert another second to synchronize the two times again.
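
    A small sketch of that hypothetical scenario (the 0.001 s/day offset is the round number chosen above, not the actual current value):

    ```python
    # Sketch: with UT1 days running 0.001 s longer than UTC days, how long until
    # the two time scales differ by the half-second threshold used in this answer?
    OFFSET_PER_DAY = 0.001   # hypothetical drift added each day
    THRESHOLD = 0.5          # drift at which another leap second gets inserted

    days_until_leap = THRESHOLD / OFFSET_PER_DAY
    print(days_until_leap, "days =", round(days_until_leap / 365.25, 1), "years")
    # -> 500.0 days, about 1.4 years (the "about 1.5 years" quoted above)
    ```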

    The point that I believe both Phil and the USNO are trying to make is that even though the rate of the Earth's slowing is relatively small, it can add up to a big difference in a relatively short amount of time.

    Edited for clarity: The confusion that some people have is that they get these two concepts confused...they assume that if we have to insert 1 second every 1.5 years, then the *rate* of slowing must be 1 second/1.5 years.

    This is not true, but to see how someone could produce a mathematical result in which the Earth comes to a dead halt (rather than just ever-longer days), think of this in terms of frequency rather than period: When UTC and UT1 are in sync, the Earth makes 1 rotation in 86,400 sec (one UTC day). Under the mistaken assumption above, in 1.5 years it would make only 0.999988 rotations in 86,400 sec, and in 3 years only 0.999976 rotations in 86,400 sec. At that rate, it would take 129,601.5 years before it makes zero rotations in that time period. At least, I *think* that's how they come to that conclusion. I'm sure you could greatly shorten that time with the added fact that leap seconds would have to be added more and more frequently as UT1 runs ever slower compared to UTC.
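
    For what it's worth, here is a sketch that reproduces that mistaken extrapolation (it is the fallacy being described, not a real prediction; the 1.5-year leap-second spacing is the assumed figure from above):

    ```python
    # Sketch of the *fallacious* reasoning: pretend each leap second means the day
    # itself lengthened by a full second, then extrapolate the lost rotation
    # fraction linearly until it reaches zero.
    SECONDS_PER_DAY = 86400
    YEARS_PER_LEAP = 1.5   # assumed spacing of leap seconds

    rotations_after_one_step = SECONDS_PER_DAY / (SECONDS_PER_DAY + 1)   # 0.999988...
    fraction_lost_per_step = 1 - rotations_after_one_step
    years_to_zero = YEARS_PER_LEAP / fraction_lost_per_step
    print(round(years_to_zero, 1))   # ~129601.5 years
    ```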

    A quick side note: this has absolutely nothing to do with the extra day inserted on leap years - that's simply due to the fact that one year is closer to 365.25 days than just 365 days. Since it's not an integer number, we need an extra day every 4 years to account for that extra 0.25.
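
    And for comparison, the leap-year arithmetic (using the rounded 365.25-day year mentioned above; the actual Gregorian rule is slightly more involved):

    ```python
    # Sketch: the 0.25-day annual leftover accumulates to one extra day every 4 years.
    year_length = 365.25          # rounded figure from above
    leftover_per_year = year_length - 365
    print("one extra day needed every", 1 / leftover_per_year, "years")   # 4.0
    ```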

    Source(s): I'm an astronomer (I know that's an "argument from authority" fallacy, but I've actually studied this stuff in detail.)
  • 1 decade ago

    Steve -

    I don't see the difference between adding a leap second once in a while and adding a leap day every four years. I see it as a way of correcting the calendar for what's really happening, not as a way of measuring the rate of deceleration of earth's rotation.

    Also, I agree that Mr Plait sounds like he is arguing with himself. I would probably not believe either one of him.

    I do think that the effects of tidal locking will eventually slow the rotation of the earth - very gradually - but I'm not convinced that it can be measured in leap seconds.

  • 1 decade ago

    It's because there are two different definitions of "second", both derived from Earth's rotation, but at different times. The atomic clock second is from the rate of Earth's rotation a century ago, which was slightly different from the rate of Earth's rotation today.

    Source(s): NIST SP-432
  • 1 decade ago

    I can't answer that. There are subtleties regarding time and the rotation of the earth, such as satellite clocks gaining or losing time because of relativity: moving faster makes a clock run slower, while being farther from the earth's gravity makes it run faster.

    We must look at the two clocks they are comparing.

    It seems to me that a clock on the equator might possibly lose time compared to a clock at the poles.
