Like jkt mentions, there are other methods of rounding. I conjecture that we generally use the "round up" method (well, really the "round away from 0" method) because it lets you round any number with more digits past the tenths place correctly while looking only at the tenths digit; 0.50001 is closer to 1 than to 0, so rounding up is more accurate than rounding down. On the other hand, with the "round to even" rule we would always have to look past the tenths place and round up whenever the value isn't exactly *.5, because rounding 2.52 down to 2 (the even choice) would be less accurate than rounding it to 3. So always rounding up is a simplification, and a good one. Still, when rounding 1.5, both 1 and 2 are equally accurate choices mathematically; the error is exactly the same either way.
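
If it helps to see the two rules side by side, here is a small sketch using Python's `decimal` module (its `ROUND_HALF_UP` is in fact "round half away from zero", matching the convention above); the sample values are just illustrative:

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

# Compare "round half away from zero" with "round half to even"
# when rounding to the nearest integer.
for x in ["0.50001", "2.52", "1.5", "2.5", "-1.5"]:
    d = Decimal(x)
    half_up = d.quantize(Decimal("1"), rounding=ROUND_HALF_UP)
    half_even = d.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)
    print(f"{x:>8}: half-up -> {half_up}, half-even -> {half_even}")
```

Only the exact ties (1.5 and 2.5) come out differently; anything with digits beyond the 5, like 0.50001 or 2.52, rounds the same way under both rules, which is exactly the point about only needing the tenths digit.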