The number 9,223,372,036,854,775,807 is the integer equal to 2^{63} − 1. Its prime factorization is 7^{2} · 73 · 127 · 337 · 92737 · 649657, which is equal to Φ_{1}(2) · Φ_{3}(2) · Φ_{7}(2) · Φ_{9}(2) · Φ_{21}(2) · Φ_{63}(2).
9223372036854775807

Cardinal: nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seven
Ordinal: 9223372036854775807th (nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seventh)
Factorization: 7^{2} × 73 × 127 × 337 × 92737 × 649657
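The cyclotomic factorization in the opening paragraph can be checked term by term. The divisors of 63 are 1, 3, 7, 9, 21, and 63, and the squared factor of 7 arises because Φ_{21}(2) = 2359 = 7 · 337:

```latex
2^{63} - 1 = \prod_{d \mid 63} \Phi_d(2)
           = \underbrace{1}_{\Phi_1(2)} \cdot
             \underbrace{7}_{\Phi_3(2)} \cdot
             \underbrace{127}_{\Phi_7(2)} \cdot
             \underbrace{73}_{\Phi_9(2)} \cdot
             \underbrace{7 \cdot 337}_{\Phi_{21}(2)} \cdot
             \underbrace{92737 \cdot 649657}_{\Phi_{63}(2)}
```

Multiplying out, Φ_{63}(2) = 60,247,241,209, and the full product recovers 9,223,372,036,854,775,807.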

The number 9,223,372,036,854,775,807, equivalent to the hexadecimal value 7FFFFFFFFFFFFFFF_{16}, is the maximum value for a 64-bit signed integer in computing. It is therefore the maximum value for a variable declared as a long integer (long, long long int, or bigint) in many programming languages running on modern computers. The presence of the value may reflect an error, an overflow condition, or a missing value.
This value is also the largest positive signed address offset for 64-bit CPUs utilizing sign-extended memory addressing (such as the AMD x86-64 architecture, which calls this "canonical form" addressing). Being an odd value, its appearance may reflect an erroneous (misaligned) memory address. Such a value may also be used as a sentinel value to initialize newly allocated memory for debugging purposes.
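A minimal sketch of the canonical-form rule, assuming the common 48-bit virtual-address configuration of x86-64: an address is canonical when bits 63..47 are all copies of bit 47, so an arithmetic shift by 47 must yield either all zeros or all ones. Note that INT64_MAX itself (7FFFFFFFFFFFFFFF_{16}) fails this test.

```c
#include <stdbool.h>
#include <stdint.h>

/* Canonical-address check, assuming 48-bit virtual addresses (the common
   x86-64 configuration): bits 63..47 must all equal bit 47. The arithmetic
   right shift of a negative value is implementation-defined before C23,
   but behaves as sign extension on mainstream compilers. */
static bool is_canonical(uint64_t addr) {
    int64_t top = (int64_t)addr >> 47;
    return top == 0 || top == -1;  /* all-zero or all-one high bits */
}
```

For example, `is_canonical(0x00007FFFFFFFFFFF)` and `is_canonical(0xFFFF800000000000)` hold, while `is_canonical(0x7FFFFFFFFFFFFFFF)` does not.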
The C standard library data type time_t, used on operating systems such as Unix, is typically implemented as either a 32- or 64-bit signed integer counting the number of seconds since the start of the Unix epoch (midnight UTC on 1 January 1970). Systems employing a 32-bit type are susceptible to the Year 2038 problem, so many implementations have moved to a wider 64-bit type, whose maximal value of 2^{63} − 1 corresponds to a moment roughly 292 billion years after the start of Unix time.
Other systems encode system time as a signed 64bit integer count of the number of ticks since some epoch date. On some systems (such as the Java standard library), each tick is one millisecond in duration, yielding a usable time range extending 292 million years into the future. On other systems (such as Win32), each tick is 100 nanoseconds long, yielding a time range of ±29,227 years from the epoch.
This page is based on a Wikipedia article written by its contributors. Text is available under the CC BY-SA 3.0 license; additional terms may apply. Images, videos and audio are available under their respective licenses.