This is getting seriously OT, but since I didn't explain it very well,
I thought I'd try again.

On Jan 15, 2008 10:33 AM, Mark J. Reed <[log in to unmask]> wrote:
> Well, yes.  What we see as linear motion across our line of vision is
> actually angular motion in an arc centered on our point of view.

Put another way.  I'm sitting at my desk; a coworker is standing in my
doorway.  I hold out my arm straight and extend my index finger so
that it covers the head of the coworker, who is now looking at me
oddly.  He steps to the side, and I move my arm to keep the finger
lined up with him.

Say the doorway is 3m away from my eyes, and the coworker's sidestep
was about 1m.  That means they moved through an angle of arctan(1/3),
or about 18 degrees.  If the finger on the end of my arm is 70cm from
my eyes, I only had to move it about 23cm to sweep the same 18-degree
angle.  So my arm covered a much smaller distance than my coworker in
the same period of time, and was therefore moving more slowly, even
though in my line of sight they were moving together.
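The arithmetic above is easy to check in a few lines of Python (the 3m,
1m, and 70cm figures are just the numbers from the example):

```python
import math

# Coworker: 3 m away, sidesteps 1 m perpendicular to my line of sight.
angle_deg = math.degrees(math.atan(1 / 3))  # angular shift, ~18.4 degrees

# Finger: 70 cm from my eyes, must sweep the same angle.
# By similar triangles the sideways move is 0.7 * tan(angle) = 0.7/3 m.
finger_move_m = 0.7 * math.tan(math.radians(angle_deg))

print(round(angle_deg, 1))      # 18.4
print(round(finger_move_m, 2))  # 0.23  (about 23 cm)
```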

If I had moved my arm at the same speed, it would have covered a full
meter in that time frame; the finger would have whipped through a full
ninety-degree arc, from straight out in front of me to straight out to
my right, apparently moving much faster through my field of view than
the coworker.

This same effect causes me to get a different view out my door as I
lean to the left and right; the office beyond appears to move much
farther than the near wall, so I get a much wider view than just the
difference in my head position.

So how do we use this to measure stellar distances?  Well, the Earth
whips around the Sun much, much faster than the stars move through the
sky; from our perspective, the stars are essentially fixed in place.
We know how wide the Earth's orbit is (about 186 million miles), so by
measuring the angular difference between where the star appears to be
at one end of the Earth's orbit, and where it appears to be at the
other end (six months later), we can determine how far away the star
must be.  The parallax proper is half that total shift - the shift
corresponding to just one Earth-Sun distance of motion.  If the
parallax is a second of arc (1/3600 of a degree), then the star is one
parsec away - that's the origin of the term parsec: parallax second.
No star has that big a parallax, however; even the closest star is a
bit more than a parsec away.
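A minimal sketch of the parallax-to-distance relation, d = 1/p (distance
in parsecs, parallax in arcseconds) - the Proxima Centauri parallax figure
of roughly 0.77 arcseconds is from memory, not from the post above:

```python
import math

AU_MILES = 93_000_000  # Earth-Sun distance, about 1 AU in miles

def distance_parsecs(parallax_arcsec):
    """Distance in parsecs from parallax in arcseconds: d = 1/p."""
    return 1.0 / parallax_arcsec

# One parsec is the distance at which 1 AU subtends one arcsecond:
parsec_miles = AU_MILES / math.tan(math.radians(1 / 3600))
print(f"{parsec_miles:.2e}")  # ~1.92e13, i.e. about 19 trillion miles

# The nearest star's parallax is about 0.77 arcseconds:
print(round(distance_parsecs(0.77), 2))  # ~1.3 parsecs
```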

Mark J. Reed <[log in to unmask]>