Accuracy is the tendency of your measurements to agree with the true values. Precision is the degree to which your measurements pin down an actual value. The question is about the interplay between accuracy and precision.

As a general principle, you don't need much more precision in recording your measurements than there is accuracy built into them. Using too much precision can mislead people into believing the accuracy is greater than it really is.

Generally, when you degrade precision--that is, use fewer decimal places--you can lose some accuracy. But how much? It's good to know that the meter was originally defined (by the French, around the time of their revolution, when they were throwing out the old systems and zealously replacing them with new ones) so that ten million of them would take you from the equator to a pole. That's 90 degrees, so one degree of latitude covers about 10^7/90 = 111,111 meters. ("About," because the meter's length has changed a little bit in the meantime. But that doesn't matter.) Furthermore, a degree of longitude (east-west) is at most about as long as a degree of latitude, because the circles of latitude shrink down to the earth's axis as we move from the equator towards either pole. Therefore, it's always safe to figure that the sixth decimal place in one decimal degree has 111,111/10^6 = about 1/9 meter = about 4 inches of precision.
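If you want to check that arithmetic for each decimal place, here is a minimal sketch in Python (the constant and function names are my own); it simply divides the 111,111-meter figure by successive powers of ten:

```python
# Approximate worth, in meters, of the nth decimal place of one degree of latitude,
# using the ~111,111 m per degree figure derived above. A degree of longitude is
# worth at most this much, since circles of latitude shrink toward the poles.
METERS_PER_DEGREE = 10_000_000 / 90  # ten million meters from equator to pole, spread over 90 degrees

def decimal_place_worth(n: int) -> float:
    """Ground distance (meters) represented by a change in the nth decimal place."""
    return METERS_PER_DEGREE / 10 ** n

for n in range(9):
    print(f"decimal place {n}: about {decimal_place_worth(n):.3g} m")
```

Running it reproduces the roughly 1/9-meter figure at the sixth decimal place.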

Accordingly, if your accuracy needs are, say, give or take 10 meters, then 1/9 meter is nothing: you lose essentially no accuracy by using six decimal places. If your accuracy need is sub-centimeter, then you need at least seven and probably eight decimal places, but more will do you little good.
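Going the other way, from an accuracy requirement to a digit count, is one logarithm. A sketch along the same lines (again assuming the 111,111 m per degree figure; the helper name is mine):

```python
import math

METERS_PER_DEGREE = 10_000_000 / 90  # ~111,111 m per degree of latitude

def decimal_places_needed(accuracy_m: float) -> int:
    """Smallest number of decimal places whose worth does not exceed accuracy_m meters."""
    return max(0, math.ceil(math.log10(METERS_PER_DEGREE / accuracy_m)))

print(decimal_places_needed(10))    # give-or-take 10 m: 5 places already suffice
print(decimal_places_needed(0.01))  # sub-centimeter: 8 places
```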

Thirteen decimal places will pin down the location to 111,111/10^13 = about 11 nanometers, and fifteen decimal places to about 1 angstrom, around the width of a small atom.

Using these ideas we can construct a table of what each digit in a decimal degree signifies: