Because every time you repeat it, you get a little bit closer to 1/3, and the size of the next step you take is proportional to the remaining distance to 1/3.
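To make that concrete, here's a small sketch of my own (not from anyone else in the thread): it tracks the partial sums .3, .33, .333, ... with exact fractions and shows that each new digit closes exactly 9/10 of whatever gap to 1/3 is still left.

```python
# Illustration of the point above: build the partial sums of .333...
# with exact fractions (no floating-point rounding) and compare each
# new digit's contribution to the gap that remains to 1/3.
from fractions import Fraction

target = Fraction(1, 3)
partial = Fraction(0)
for n in range(1, 7):
    gap_before = target - partial      # distance still left to 1/3
    step = Fraction(3, 10 ** n)        # the next digit adds 3/10^n
    partial += step
    # each step covers exactly 9/10 of the remaining distance
    print(f"{n}: partial={float(partial):.7f}, step/gap = {step / gap_before}")
```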

What I meant about an infinite number of zeros is 0 * infinity, or {0+0+0+0+...}. These values are undefined. For example, start with a value of 5 and divide it by x. As x becomes larger, 5/x becomes smaller, so it stands to reason that when x becomes infinite, 5/inf = 0. It also seems logical that if you take your divided parts and put them back together, you'd get your original 5 back. But if you describe that mathematically, you get 0 * inf = 5. This can't be right, since you could have started with any number instead of 5 and gotten that same number back. So what went wrong? It can only be that you can't multiply the equation by infinity to put the parts back together.

Now .333... means {.3+.03+.003+.0003+...}. How could that be smaller than {0+0+0+0+...}, when every value within it is larger? If infinity isn't a real number, then how can a value defined by infinity ever be real?

What I understand about the calculus concept of "limits" is that it seems to be based on the _assumption_ that .333... = 1/3 and that other infinite sequences add up this way. But because of that, citing limits as the answer is just circular logic. Limits sound useful for real-world calculations, but as far as I can tell, they simply assume this result without ever really justifying it on theoretical grounds.

What you are describing here is Zeno's paradox, which most philosophers do NOT consider solved. Again, calculus simply assumes and declares a solution without ever really stating one. If I'm cutting wood or working out the volume of some shape, I'd be happy to assume that .333... = 1/3, but in pure mathematics I'm sure this is going to jump out and bite someone someday.
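Just so we're arguing about the same thing, here's my own sketch of the calculation that the "limits" talk above refers to. I'm not claiming it settles the circularity worry; note only that it never multiplies anything by infinity and never assumes .333... = 1/3 anywhere. It just measures how far each finite string of 3s falls short of 1/3.

```python
# The "limit" statement made concrete: for any tolerance you name,
# some finite number of 3s already gets the sum within that tolerance
# of 1/3, and every longer string of 3s stays within it.
from fractions import Fraction

def digits_needed(tolerance: Fraction) -> int:
    """Smallest n such that 1/3 minus (.3 repeated n times) < tolerance."""
    partial = Fraction(0)
    n = 0
    while Fraction(1, 3) - partial >= tolerance:
        n += 1
        partial += Fraction(3, 10 ** n)   # shortfall after n digits is 1/(3*10^n)
    return n

for tol in (Fraction(1, 100), Fraction(1, 10 ** 6), Fraction(1, 10 ** 12)):
    print(f"tolerance {tol}: {digits_needed(tol)} digits suffice")
```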