I thought I’d veer off into esoterica today. I don’t know why the matter came to mind a couple of weeks ago, but since it did, I have found myself pondering it. Now I’ll share it with you and get your input.
The matter at issue is the numeral designator for half. If we write 2 days, there is no question what is meant. Similarly, if we write 2.5 days, readers correctly translate that to two-and-a-half days. But is it really correct?
I suppose that it is, because it has been accepted and understood as correct for decades, if not centuries. But shouldn’t time be more accurately represented? If a day has 24 hours, then a half day has 12 hours, so a literal days-plus-hours rendering of two-and-a-half days would be 2 days 12 hours; yet read digit by digit, 2.5 suggests two days plus 5 hours. And if we were to write 2.12 days, no one would understand that to mean 2 days plus 12 hours, that is, two-and-a-half days.
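For anyone who wants the arithmetic spelled out, here is a quick, purely illustrative sketch (in Python, my own choice of notation, not anything drawn from a style manual) that splits a decimal day count into whole days and hours:

    def decimal_days_to_days_hours(decimal_days):
        # Split a decimal day count into whole days and leftover hours (24 hours per day).
        whole_days = int(decimal_days)
        hours = (decimal_days - whole_days) * 24
        return whole_days, hours

    print(decimal_days_to_days_hours(2.5))   # (2, 12.0): two days and twelve hours

So the conventional 2.5 days works out to 2 days and 12 hours, while the “literal” days-plus-hours form would have to be written 2.12, which no reader would recognize as two-and-a-half days.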
Time has always been treated differently from other yardsticks, probably because time is so important in our daily lives. We have coalesced around certain conventions, correct or not, that are now the accepted methods for portraying time, especially decimally.
Consider the matter of years. We all know and accept that 6 months equals one-half year. Yet we do not write 1.6 years to represent one-and-one-half years; as with days, we write 1.5 years and we all know what is meant.
I work on nonfiction books, which has led me to wonder occasionally whether an error will occur when a measurement’s shorthand isn’t correlated with its written-out version; that is, how likely is it that some reader will mistake 1.5 days for 1 day 5 hours, and should I therefore write one-and-one-half days rather than 1.5 days?
Of course, I only wonder and do not spell it out because I understand that we have accommodated our use of language so that there is no likelihood of misinterpretation. But that doesn’t move me away from wondering how this came about and why such imprecision is accepted by communities that require precision elsewhere.
Not only have we accommodated our use of language so that .5 represents one-half, but this accommodation appears to be fairly universal among languages. Writing 1.5 days will not mislead a French, Italian, Slovak, Chinese, or Malay speaker any more than it misleads an English speaker. The convention has crossed linguistic borders (someone once said that math is a universal language, so perhaps the fault for this accommodation lies in math’s universality).
I’m not interested in trying to change the accommodation (some brick walls truly are meant to stand forever), but I am curious about how we came to universally accept and understand that 1.5 days means one-and-one-half days and not one day, five hours.
What is your theory?