Having seen total confusion when cultures clashed on one of the threads, I would like to point out that the Latin convention of using commas instead of decimal points is just as valid as the Anglo-Saxon convention. This system was not invented yesterday! Further proof that the Real World is not flat like the ASCII World.
http://en.wikipedia.org/wiki/Decimal_point
<<<The decimal separator is a symbol used to mark the boundary between the integer and the fractional parts of a decimal numeral.
In the Middle Ages, that is, before printing, a bar over the units digit was used. However, its regular usage and classification are attributed to Muḥammad ibn Mūsā al-Ḵwārizmī, a Persian scientist. Later, a separator (a short, roughly vertical, ink-stroke) between the units and tenths position became the norm. When type-set, it was convenient to use the existing marks called a comma or a period, which is variously called a stop or a dot, or else a point for this purpose.
In France the dot was already in use in printing to make Roman numerals more readable, so the comma was chosen. Many other countries also chose the comma to mark the decimal units position. It has been made standard by the ISO for international blueprints.
English-speaking countries, however, took the comma to separate sequences of three digits. In the US, a period (.), which is called a "stop" or "full stop" in some other countries, was the standard. In the nations of the British Empire, although this could be used in typewritten material, the point (middle dot: ·), which can also be called an interpunct, was preferred for the decimal separator in those technologies which could accommodate it.>>>
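For anyone writing software that has to cope with both conventions, here is a minimal sketch in Python using the standard locale module. It assumes the de_DE and en_US locales are installed on the system (locale names and availability vary by platform), so treat it as an illustration rather than a drop-in solution:

import locale

def parse_decimal(text, loc):
    # Parse a decimal string using the separator rules of the given locale.
    # NOTE: locale names are system-dependent; "de_DE.UTF-8" and
    # "en_US.UTF-8" are assumed to be installed here.
    locale.setlocale(locale.LC_NUMERIC, loc)
    return locale.atof(text)

# The same value written under the two conventions:
print(parse_decimal("1.234,56", "de_DE.UTF-8"))  # 1234.56 (comma as decimal mark)
print(parse_decimal("1,234.56", "en_US.UTF-8"))  # 1234.56 (point as decimal mark)

The point is that "1.234" alone is ambiguous: roughly one-and-a-quarter in one convention, one thousand two hundred thirty-four in the other, which is exactly the kind of confusion that started this thread.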