The ECMAScript Number type uses the IEEE-754 double-precision (64-bit) format to represent both integers and floating-point values.
The most basic number literal format is a decimal integer, which can be entered directly as shown here:
var intNum = 55; //integer
Integers can be represented as either octal (base 8) or hexadecimal (base 16) literals.
For an octal literal, the first digit must be a zero (0), followed by a sequence of octal digits (0 through 7).
If a number out of this range is detected in the literal, then the leading zero is ignored and the number is treated as a decimal:
var octalNum1 = 070; //octal for 56
var octalNum2 = 079; //invalid octal - interpreted as 79
var octalNum3 = 08; //invalid octal - interpreted as 8
Octal literals are invalid when running in strict mode and will cause the JavaScript engine to throw a syntax error.
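For example, a script that opens with the "use strict" directive will refuse to parse a legacy octal literal (the exact error message varies by engine; the variable name here is illustrative):
"use strict";
var octalNum = 070; //SyntaxError - octal literals are not allowed in strict mode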
To create a hexadecimal literal, make the first two characters 0x (case insensitive), followed by any number of hexadecimal digits (0 through 9, and A through F).
Letters may be in uppercase or lowercase. Here's an example:
var hexNum1 = 0xA; //hexadecimal for 10
var hexNum2 = 0x1f; //hexadecimal for 31
Numbers created using octal or hexadecimal format are treated as decimal numbers in arithmetic operations.
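For instance, mixing the two formats in an expression behaves exactly as if both values had been written in decimal (the variable name here is illustrative):
var result = 0xA + 070; //10 + 56
console.log(result); //66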