The Theory of William Tell
By SHERMAN WAN
Blast San Francisco Bureau
Here's a philosophical dilemma. Or perhaps a metaphysical one.
Let's say you're standing with your back against a wall with an apple on
your head. Let's further say that an archer stands 100 feet away and
shoots an arrow at the apple.
@                <---()
Now, let's suppose that within some period of time, call it x, the arrow moves
halfway along the line from the archer's hand to the apple. Agreed?
The arrow is now 50 feet from the apple.
@        <---        ()
After a further period of time, half as long as the first (the arrow flies at
a constant speed), the arrow again moves halfway from its previous point to
the apple. The arrow is now 25 feet from the apple. Still agreed?
@    <---            ()
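To see the numbers, here's a quick sketch in Python (the 100-foot range comes
from the setup above; the 20-step cutoff is my own arbitrary choice):

    # Watch the gap between the arrowhead and the apple halve at each step.
    gap = 100.0     # feet from arrowhead to apple (from the setup above)
    covered = 0.0   # feet the arrow has flown so far
    for step in range(1, 21):   # 20 steps: an arbitrary cutoff
        covered += gap / 2      # the arrow crosses half the remaining gap
        gap /= 2                # ...leaving the other half still to go
        print(f"step {step:2}: {gap:.7f} ft to go, {covered:.7f} ft covered")

After twenty halvings the gap is already under a ten-thousandth of a foot,
yet no finite number of steps ever makes it zero.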
Now, we learned in high school that there are an infinite number of points
along a line between two points, right?
The arrow keeps halving the distance until we get down to the microscopic
level, where we're dealing in microns and nanoseconds. The steps may be
infinitesimally small, but within some (ever-shrinking) amount of time, the
arrow still moves halfway closer to the apple from its previous point. Correct?
@<---                ()
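For what it's worth, the pieces the arrow covers, 50 feet, then 25, then 12.5,
and so on, form a geometric series that adds up to exactly the original 100
feet (a standard identity, not anything the picture proves):

    \[
      50 + 25 + 12.5 + \cdots \;=\; \sum_{n=1}^{\infty} \frac{100}{2^n} \;=\; 100.
    \]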
The question: Why and how does the arrow eventually strike the apple and
cleave it in half? When does the distance between the arrowhead and the
apple fail to be further halved?
This has bugged me since I was a little kid.
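(The textbooks call this Zeno's dichotomy paradox, and the usual calculus
answer goes like this: if the arrow flies at a constant speed, the first
halving takes time x, the next x/2, the next x/4, and those times also sum to
a finite total:

    \[
      x + \frac{x}{2} + \frac{x}{4} + \cdots \;=\; x \sum_{n=0}^{\infty} \frac{1}{2^n} \;=\; 2x,
    \]

so after time 2x every one of the infinitely many halvings is behind us, the
gap is zero, and the apple gets cleaved. Infinitely many steps, but a finite
amount of time.)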