Something occurred to me while writing some code today. I was thinking about spans of time -- days, months, years, and so on -- and noticed that we use the plural inflection of a word much more often than the singular.
For example, if one day has passed, we say "one day" or "a day". But if any quantity greater than one has gone by -- two, three, and so on -- we use the plural inflection: "two days", "three days", et cetera.
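Since this train of thought started while writing code, here is the rule as a programmer might encode it -- a minimal sketch, and the helper name `inflect` is my own invention, not anything standard:

```python
def inflect(count, singular="day", plural="days"):
    # English uses the singular only when the count is written as
    # exactly the integer 1. A float like 1.0 still reads as plural
    # ("1.0 days"), so the check is on the representation, not the value.
    return singular if isinstance(count, int) and count == 1 else plural

print(f"1 {inflect(1)}")      # 1 day
print(f"3 {inflect(3)}")      # 3 days
print(f"0.5 {inflect(0.5)}")  # 0.5 days
print(f"1.0 {inflect(1.0)}")  # 1.0 days
```

Note that the branch taken depends on how the number is expressed, not on its magnitude -- which is exactly the oddity this post is about.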
When rational numbers get involved, we tend to return to the singular: "half a day", "a third of a day", et cetera. But we really only do that for proper fractions whose denominators we can express in English, and the utility of those is limited: we quickly run out of words for particularly large numbers and fall back to treating them as a sequence of digits rather than expressing them in words.
And indeed, once we make the jump to real numbers written in decimal notation -- 1.234, for example -- we use the plural inflection again, quite apart from the quantity the number actually represents.
Consider that 0.999... (the nines repeating forever) can be shown to be equal to exactly 1, yet we would still say "0.999... days" rather than "0.999... day".
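For the skeptical, the standard arithmetic argument for that identity, written out:

```latex
\begin{align*}
x &= 0.\overline{9} \\
10x &= 9.\overline{9} = 9 + 0.\overline{9} = 9 + x \\
10x - x &= 9 \\
x &= 1
\end{align*}
```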
Let's define a random variable R drawn from the real numbers -- say, uniformly from some interval. The probability that R equals one, or any other value expressible as a proper fraction, is zero: the rationals are countable, so under any continuous distribution they carry no probability mass.
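A quick empirical sketch of that claim, assuming uniform draws on (0, 1) as a stand-in for "a random real" (the set of "nameable" fractions here is an arbitrary small sample, my choice, not anything canonical):

```python
import random
from fractions import Fraction

random.seed(0)

# Proper fractions we have everyday English words for: p/q with small q,
# converted to their exact float representations.
nameable = {float(Fraction(p, q)) for q in range(2, 11) for p in range(1, q)}

# Draw many "random reals" uniformly from (0, 1) and count exact matches.
hits = sum(1 for _ in range(100_000) if random.random() in nameable)
print(hits)  # 0 -- a continuous draw essentially never lands on a simple fraction
```

Floating-point draws are discrete, of course, but even so the chance of landing exactly on one of these fractions in a hundred thousand tries is on the order of one in a billion.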
So why, then, do we find so much use for the singular in our daily lives? It would have to reflect either a fact about nature or a fact about our perception.
Given how rarely whole numbers -- let alone 1 exactly -- fit into the equations with which the hard sciences model nature, the answer seems to be perception: we report the singular more often than the data really warrant.
Say you have two croissants and one is slightly smaller than the other. You would never say "1.98942 croissants"; you would say "two croissants", so long as each was recognizably whole.
We humans have very strange tendencies to distort data indeed.
EDIT: Minor correction: The singular form can be used for all rationals, not just a specific class of them.