What Does Decimal Mean?
In computing, decimal refers to the base-10 numbering system, the system people use for everyday counting and arithmetic. More generally, decimal can describe anything based on the number 10. Understanding how decimal relates to binary, octal and hexadecimal is essential for anyone working in the IT industry.
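For a quick sense of how one quantity looks in each of those bases, here is a minimal Python sketch; the value 42 and the use of Python's built-in format() are illustrative choices, not something specified by the definition above.

```python
# A rough sketch: the same quantity written in decimal, binary, octal
# and hexadecimal (the value 42 is chosen only for illustration).
value = 42                   # base-10, the everyday notation

print(format(value, "d"))    # 42      (decimal)
print(format(value, "b"))    # 101010  (binary)
print(format(value, "o"))    # 52      (octal)
print(format(value, "x"))    # 2a      (hexadecimal)
```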
Other terms for decimal are base-10 and denary.
Techopedia Explains Decimal
In mathematics, decimal can refer to the decimal numbering system, to decimal notation or to any number written in decimal notation. Decimal numbers are written according to place value: the integer part appears to the left of the decimal point and the fractional part to the right (for example, 1.023).
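As a rough illustration of place value (the digits come from the 1.023 example above; the use of Python's decimal module is an assumption made to keep the arithmetic exact), each digit contributes its value multiplied by a power of 10 determined by its position:

```python
from decimal import Decimal

# Place-value expansion of 1.023: each digit is multiplied by a power of 10
# set by its position (negative powers lie to the right of the decimal point).
digits = [(1, 0), (0, -1), (2, -2), (3, -3)]   # (digit, power of 10)

value = sum(Decimal(d) * Decimal(10) ** p for d, p in digits)
print(value)   # 1.023
```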
The use of decimal is as old as human history. Its prevalence may stem from humans' ten fingers (a counting scheme that could also be called bi-quinary, since there are five fingers on each hand). Decimal was used in ancient calculating tools and methods such as Chinese rod calculus and the abacus. The Greek mathematician Archimedes used powers of 10⁸ (10,000 × 10,000, or "a myriad myriads") to estimate the size of the universe.
Base-10 uses the ten numerals 0 through 9, as opposed to the 0 and 1 used in binary. Modern computers count in binary, but there have also been decimal computers. Charles Babbage's Analytical Engine was designed to work in decimal, and other early computers such as the ENIAC and the IBM 650 used base-10 internally.
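To illustrate the contrast, here is a short Python sketch (the number 13 is just an example value) that writes the same quantity with decimal digits and with the binary digits 0 and 1:

```python
n = 13                   # thirteen, written with the decimal numerals 0-9

print(bin(n))            # 0b1101 -> the same quantity in binary
print(int("1101", 2))    # 13     -> and converted back to base-10
```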