Hello,

In the context of another mailing list on astronomy and satellites I have come across the following problem: we all know that the orbital plane of a satellite precesses, in that for inclinations between 0 and 90 degrees it turns about the Earth's axis in the direction east to west. For retrograde orbits the precession is in the opposite direction. The rate of precession is proportional to the cosine of the orbit's inclination, i.e. it vanishes for a polar orbit and has a maximum on the order of 10 degrees per day for near-equatorial orbits. This is the rate averaged over one orbit; it varies at different points along the orbit.

It is also well understood that this effect is due to the flattening of the Earth (the J2 term in the Earth's potential). The mathematical proof may be lengthy and cumbersome, but it is conclusive. However, the result is not at all intuitive. What exactly is the force that makes the orbit precess, and why does this force depend on inclination? It is particularly intriguing that the rate of precession increases as the orbital inclination decreases. In the limit, i.e. for an equatorial orbit, the situation is perfectly symmetrical and constant, with the attractive force directed toward the center of the Earth at all times, and yet this is when the precession is at its maximum. (To be entirely exact: for zero inclination the precession is not defined because there are no nodes, but at an infinitesimally small inclination it is.)

This is admittedly a theoretical question, not directly linked to the observation of satellites, but as I would like to understand (as opposed to "being able to calculate") this phenomenon, I thought that perhaps someone on this list might have the answer. To be clear: I am looking for an explanation which a moderately educated layperson with a knowledge of the basic laws of physics can understand, but who cannot be bothered with partial differential vector equations.

Thanks for any suggestions.

Bruno Tilgner
Bruno_Tilgner@compuserve.com
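
P.S. For anyone who wants to put numbers on the claim above, here is a minimal Python sketch of the standard first-order secular formula for the node, dOmega/dt = -(3/2) * n * J2 * (R_E/p)^2 * cos(i). The constants are ordinary textbook values, and the 400 km circular sample orbit and the inclinations tried are my own illustrative choices, not anything from the discussion itself.

import math

# Approximate physical constants (standard textbook values)
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
R_EARTH  = 6378137.0        # Earth's equatorial radius [m]
J2       = 1.08263e-3       # Earth's second zonal harmonic (oblateness term)

def nodal_precession_deg_per_day(a_m, e, incl_deg):
    """First-order secular drift of the ascending node due to J2,
    dOmega/dt = -(3/2) * n * J2 * (R_E / p)^2 * cos(i),
    returned in degrees per day (negative = westward)."""
    n = math.sqrt(MU_EARTH / a_m**3)           # mean motion [rad/s]
    p = a_m * (1.0 - e**2)                     # semi-latus rectum [m]
    i = math.radians(incl_deg)
    rate_rad_s = -1.5 * n * J2 * (R_EARTH / p)**2 * math.cos(i)
    return math.degrees(rate_rad_s) * 86400.0  # convert to deg/day

# Example: a 400 km circular orbit at a few inclinations
a = R_EARTH + 400e3
for inc in (0.1, 28.5, 51.6, 90.0, 98.0):
    rate = nodal_precession_deg_per_day(a, 0.0, inc)
    print(f"i = {inc:5.1f} deg  ->  dOmega/dt = {rate:+6.2f} deg/day")

For that sample orbit this gives roughly -8 deg/day near the equator, about -5 deg/day at the inclination of a 51.6-degree orbit, zero for a polar orbit, and a small eastward drift for the retrograde case, which is the cos(i) behaviour described above.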