In computer architecture, 8-bit integers, memory addresses, or other data units are those that are 8 bits (1 octet) wide. 8-bit CPU and ALU architectures are those based on registers, address buses, or data buses of that size. "8-bit" also denotes a generation of microcomputers in which 8-bit microprocessors were the norm.

The IBM System/360 introduced byte-addressable memory with 8-bit bytes, as opposed to bit-addressable, decimal digit-addressable, or word-addressable memory. Its general-purpose registers were 32 bits wide, with addresses contained in the lower 24 bits of those registers. Different models of System/360 had different internal data path widths; the IBM System/360 Model 30 (1965) implemented the 32-bit System/360 architecture but had an 8-bit native data path, performing 32-bit arithmetic 8 bits at a time.[1]

The first widely adopted 8-bit microprocessor was the Intel 8080, used in many hobbyist computers of the late 1970s and early 1980s, often running the CP/M operating system; it had 8-bit data words and 16-bit addresses. The Zilog Z80 (compatible with the 8080) and the Motorola 6800 were also used in similar computers. The Z80 and the MOS Technology 6502 8-bit CPUs were widely used in home computers and second- and third-generation game consoles of the 1970s and 1980s. Many 8-bit CPUs and microcontrollers are the basis of today's ubiquitous embedded systems.


There are 2⁸ (256) different possible values for 8 bits. Interpreted as an unsigned integer, a byte can hold values from 0 to 255; interpreted as a signed (two's complement) integer, from −128 to 127.
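The two interpretations read the same bit pattern differently; a minimal Python sketch:

```python
import struct

# One byte, four bit patterns, read as unsigned ("B") and as
# signed two's-complement ("b"):
for pattern in (0x00, 0x7F, 0x80, 0xFF):
    unsigned = struct.unpack("B", bytes([pattern]))[0]   # 0..255
    signed = struct.unpack("b", bytes([pattern]))[0]     # -128..127
    print(f"0x{pattern:02X}  unsigned={unsigned:3d}  signed={signed:4d}")
```

The two readings agree up to 0x7F and diverge from 0x80 onward: 0x80 is 128 unsigned but −128 signed, and 0xFF is 255 unsigned but −1 signed.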

Eight-bit CPUs use an 8-bit data bus and can therefore access 8 bits of data in a single machine instruction. The address bus is typically two octets (16 bits) wide, for practical and economic reasons, which limits the directly addressable memory on most 8-bit processors to 64 KB.
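The arithmetic behind that limit, and the way an 8-bit CPU typically handles a 16-bit address as two 8-bit halves, can be sketched as follows (the address value is illustrative):

```python
ADDRESS_BITS = 16

# Direct address space of a 16-bit address bus:
address_space = 2 ** ADDRESS_BITS        # 65536 bytes = 64 KB

# An 8-bit CPU moves a 16-bit address as two 8-bit halves:
addr = 0xC000
high, low = addr >> 8, addr & 0xFF       # 0xC0 and 0x00
reassembled = (high << 8) | low          # back to 0xC000
print(address_space, hex(high), hex(low))
```

Systems needing more than 64 KB worked around the limit with bank switching, mapping different regions of physical memory into the 16-bit address space in turn.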

Notable 8-bit CPUs

The first commercial 8-bit processor was the Intel 8008 (1972), originally intended for the Datapoint 2200 intelligent terminal. Most competitors to Intel started off with such character-oriented 8-bit microprocessors. Modernized variants of these 8-bit machines are still among the most common types of processor in embedded systems.

Another notable 8-bit CPU is the MOS Technology 6502; it, and variants of it, were used in a number of personal computers such as the Apple I and Apple II, the Atari 8-bit family, the BBC Micro, and the Commodore PET and Commodore VIC-20, and in a number of video game consoles such as the Atari 2600 and the Nintendo Entertainment System.

Early or popular 8-bit processors (incomplete)
Manufacturer Processor Year Comment
Intel 8008 1972 Datapoint 2200 compatible
Signetics 2650 1973
Intel 8080 1974 8008 source compatible
Motorola 6800 1974
Fairchild F8 1975
MOS 6502 1975 Similar to 6800, but incompatible
Microchip PIC 1975 Harvard architecture microcontroller
Electronic Arrays EA9002 1976 8-bit data, 12-bit addressing
RCA 1802 1976
Zilog Z80 1976 8080 binary compatible
Intel 8085 1977 8080 binary compatible
Motorola 6809 1978 6800 source compatible
Zilog Z8 1978 Harvard architecture microcontroller
Intel 8051 1980 Harvard architecture microcontroller
MOS 6510 1982 Enhanced 6502 custom-made for use in the Commodore 64
Ricoh 2A03 1982 6502 clone minus BCD instructions for the Nintendo Entertainment System
Zilog Z180 1985 Z80 binary compatible
Motorola 68HC11 1985
Atmel AVR 1996
Zilog EZ80 1999 Z80 binary compatible
Infineon XC800 2005
Freescale 68HC08
Hudson HuC6280
Motorola 6803
NEC 78K0[2]


  1. ^ Amdahl, G. M.; Blaauw, G. A.; Brooks, F. P. (1964). "Architecture of the IBM System/360" (PDF). IBM Journal of Research and Development. 8 (2): 87–101. doi:10.1147/rd.82.0087. Archived (PDF) from the original on 2017-08-10.
  2. ^ "NEC 78K0". NEC. Archived from the original on 2008-10-28. Retrieved 2009-02-10.
8-Bit (studio)

Eight Bit Inc. (Japanese: 株式会社エイトビット, Hepburn: Kabushiki-gaisha Eito Bitto), also known as 8bit, is a Japanese animation studio established in September 2008 by former Satelight members.

8-Bit Theater

8-Bit Theater is a completed sprite comic created by Brian Clevinger, and published in 1,225 episodes from March 2, 2001 to June 1, 2010. The webcomic is among the most popular sprite comics, winning various awards, and is part of the Create a Comic Project.

The plot of 8-Bit Theater is loosely based on that of the video game Final Fantasy, in which four adventurers, the Light Warriors, must save the world by defeating four powerful demons that represent the four elements, thus relighting four magical orbs tied to the same elements, and, finally, defeating the personification of evil, Chaos. However, while many of the original plot points and characters are present, the way they come about is often radically different. The Light Warriors themselves tend to cause far more harm than good on their travels and mostly have to be blackmailed, bribed, or threatened into accepting quests.

The comic is also not a serious epic; the protagonists and many of the supporting characters are based on and a parody of exaggerated role-playing video game stereotypes to the point where many characters are actually named after their character classes, and much of the humor displayed in 8-Bit Theater is derived from the ineptitude of characters as well as from the interactions between four protagonists who are travelling together but do not actually like each other very much. The range of comedic devices 8-Bit Theater employs includes droll humor, running gags, word play, and slapstick, and another significant portion of the humor results from creating reader anticipation for dramatic moments which fail to come. Clevinger has stated that "[his] favorite comics are the ones where the joke is on the reader."

Atari 8-bit family

The Atari 8-bit family is a series of 8-bit home computers introduced by Atari, Inc. in 1979 and manufactured until 1992. All of the machines in the family are technically similar and differ primarily in packaging. They are based on the MOS Technology 6502 CPU running at 1.79 MHz, and were the first home computers designed with custom co-processor chips. This architecture enabled graphics and sound capabilities more advanced than those of contemporary machines at the time of release, and gaming on the platform was a major draw. Star Raiders is considered the platform's killer app.

The original Atari 400 and 800 models launched with a series of plug-and-play peripherals that used the Atari SIO serial bus system, an early analog of the modern USB. To meet stringent FCC requirements, the early machines were enclosed in a cast aluminum block, which made them physically robust but expensive to produce. Over the following decade, the 400 and 800 were replaced by the XL series, then the XE. The XL and XE machines are much lighter in construction and less expensive to build; they have Atari BASIC built in but reduce the number of joystick ports from four to two. The 130XE, released in 1985, increased the memory to 128K of bank-switched RAM.

The Atari 8-bit computer line sold two million units during its major production run between late 1979 and mid-1985. They were not only sold through dedicated computer retailers, but department stores such as Sears, using an in-store demo to attract customers. The primary competition in the worldwide market came several years later, when the Commodore 64 was introduced in 1982. This was the first computer to offer similar graphics performance, and went on to be the best selling computer of the 8-bit era. Atari also found a strong market in Eastern Europe and had something of a renaissance in the early 1990s as these countries joined a uniting Europe.

In 1992, Atari Corp. officially dropped all remaining support of the 8-bit line.


The byte is a unit of digital information that most commonly consists of eight bits, representing a binary number. Historically, the byte was the number of bits used to encode a single character of text in a computer and for this reason it is the smallest addressable unit of memory in many computer architectures.

The size of the byte has historically been hardware dependent and no definitive standards existed that mandated the size – byte-sizes from 1 to 48 bits are known to have been used in the past. Early character encoding systems often used six bits, and machines using six-bit and nine-bit bytes were common into the 1960s. These machines most commonly had memory words of 12, 24, 36, 48 or 60 bits, corresponding to two, four, six, eight or 10 six-bit bytes. In this era, bytes in the instruction stream were often referred to as syllables, before the term byte became common.

The modern de facto standard of eight bits, as documented in ISO/IEC 2382-1:1993, is a convenient power of two permitting the values 0 through 255 for one byte (2⁸ = 256 values, zero included). The international standard IEC 80000-13 codified this common meaning. Many types of applications use information representable in eight or fewer bits, and processor designers optimize for this common usage. The popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the eight-bit size. Modern architectures typically use 32- or 64-bit words, built of four or eight bytes.

The unit symbol for the byte was designated as the upper-case letter B by the International Electrotechnical Commission (IEC) and Institute of Electrical and Electronics Engineers (IEEE) in contrast to the bit, whose IEEE symbol is a lower-case b. Internationally, the unit octet, symbol o, explicitly denotes a sequence of eight bits, eliminating the ambiguity of the byte.


The Compatibility Encoding Scheme for UTF-16: 8-Bit (CESU-8) is a variant of UTF-8 that is described in Unicode Technical Report #26. A Unicode code point from the Basic Multilingual Plane (BMP), i.e. a code point in the range U+0000 to U+FFFF, is encoded in the same way as in UTF-8. A Unicode supplementary character, i.e. a code point in the range U+10000 to U+10FFFF, is first represented as a surrogate pair, like in UTF-16, and then each surrogate code point is encoded in UTF-8. Therefore, CESU-8 needs six bytes (3 bytes per surrogate) for each Unicode supplementary character while UTF-8 needs only four.

The encoding of Unicode supplementary characters works out to 11101101 1010yyyy 10xxxxxx 11101101 1011xxxx 10xxxxxx (yyyy represents the top five bits of the character minus one).
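That bit layout can be reproduced directly: split the code point into a surrogate pair and run each surrogate through the 3-byte UTF-8 pattern. A minimal Python sketch (the function name is illustrative, not from any standard library, and no validation is done):

```python
def cesu8_encode(cp: int) -> bytes:
    """Encode one code point as CESU-8 (illustrative sketch)."""
    if cp <= 0xFFFF:                      # BMP: identical to UTF-8
        return chr(cp).encode("utf-8")
    v = cp - 0x10000                      # 20-bit value
    hi = 0xD800 | (v >> 10)               # high surrogate
    lo = 0xDC00 | (v & 0x3FF)             # low surrogate

    def utf8_3(s: int) -> bytes:          # 3-byte UTF-8 form of a surrogate
        return bytes([0xE0 | (s >> 12),
                      0x80 | ((s >> 6) & 0x3F),
                      0x80 | (s & 0x3F)])

    return utf8_3(hi) + utf8_3(lo)

# U+10400 takes 6 bytes in CESU-8, versus 4 in standard UTF-8:
print(cesu8_encode(0x10400).hex())        # "eda081edb080"
```

Both 3-byte sequences start with 0xED, matching the 11101101 prefix in the bit pattern above.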

CESU-8 is not an official part of the Unicode Standard, because Unicode Technical Reports are informative documents only. It should be used exclusively for internal processing and never for external data exchange.

Supporting CESU-8 in HTML documents is prohibited by the W3C and WHATWG HTML standards, as it would present a cross-site scripting vulnerability.

CESU-8 is similar to Java's Modified UTF-8 but does not have the special encoding of the NUL character (U+0000).

The Oracle database uses CESU-8 for its "UTF8" character set. Standard UTF-8 can be obtained using the character set "AL32UTF8" (since Oracle version 9.0).


Chiptune, also known as chip music or 8-bit music, is a style of synthesized electronic music made using the programmable sound generator (PSG) sound chips in vintage arcade machines, computers and video game consoles. The term is commonly used to refer to tracker format music which intentionally sounds similar to older PSG-created music (this is the original meaning of the term), as well as music that combines PSG sounds with modern musical styles.

By the early 1980s, personal computers had become less expensive and more accessible than they had been previously. This led to a proliferation of outdated personal computers and game consoles that had been abandoned by consumers as they upgraded to newer machines. They were in low demand by consumers as a whole, and thus were not difficult to find, making them a highly accessible and affordable method of creating sound or art. While it has been a mostly underground genre, chiptune has had periods of moderate popularity in the 1980s and 21st century, and has influenced the development of electronic dance music.

The terms "chip music" and "chiptune" refer to music made by the sound chips found within early gaming systems and microcomputers.

A waveform generator is a fundamental module in a sound synthesis system. It usually produces a basic geometric waveform with a fixed or variable timbre and variable pitch. Common configurations included two or three simple waveform channels and often a single pseudo-random-noise generator (PRNG). Available waveforms often included the pulse wave (whose timbre can be varied by modifying the duty cycle), square wave (a symmetrical pulse wave producing only odd overtones), triangle wave (which has a fixed timbre containing only odd harmonics, but is softer than a square wave), and sawtooth wave (which has a bright, raspy timbre and contains both odd and even harmonics). Two notable examples of systems employing this technology are the Nintendo Game Boy portable game console and the Commodore 64 personal computer. The Game Boy uses two pulse channels (switchable between 12.5%, 25%, 50% and 75% wave duty cycle), a channel for 4-bit pulse-code modulation (PCM) playback, and a pseudo-random-noise generator. The Commodore 64, by contrast, used the MOS Technology SID chip, which offered three channels, each switchable between pulse, sawtooth, triangle, and noise. Unlike the Game Boy, the pulse channels on the Commodore 64 allowed full control over wave duty cycles. The SID was a technically advanced chip, offering many other features including ring modulation and adjustable resonance filters.

Due to the limited number of voices in these primitive chips, one of the main challenges is producing rich polyphonic music with them. The usual workaround is quick arpeggios, which are one of the most recognizable features of chiptune music (along, of course, with its electronic timbres).
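The arpeggio trick can be sketched in a few lines: a single square-wave voice cycles rapidly through the notes of a chord, so the ear hears a harmony the chip cannot actually play. All rates and frequencies below are illustrative, not taken from any particular chip:

```python
SAMPLE_RATE = 22050

def square(freq, t):
    """Naive square-wave oscillator: +1 for the first half of each period."""
    return 1.0 if (t * freq) % 1.0 < 0.5 else -1.0

def arpeggio(freqs, duration=0.5, step=1 / 60):
    """Fake a chord on one voice by switching to the next chord note
    every `step` seconds (the classic chiptune arpeggio trick)."""
    samples = []
    for i in range(int(duration * SAMPLE_RATE)):
        t = i / SAMPLE_RATE
        note = freqs[int(t / step) % len(freqs)]   # which chord note now
        samples.append(square(note, t))
    return samples

# A C major chord (C5, E5, G5) rattled on a single square-wave channel:
wave = arpeggio([523.25, 659.25, 783.99])
```

Switching at 60 Hz (once per video frame) was common in practice, since music routines on these machines typically ran from the display interrupt.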

Some older systems, such as the original ZX Spectrum and IBM PC, featured a simple beeper as their only sound output; despite this, many skilled programmers were able to produce unexpectedly rich music on this bare hardware, with the sound generated entirely by the system's CPU through direct control of the beeper.

Code page 1287

Code page 1287, also known as CP1287, DEC Greek (8-bit) and EL8DEC, is one of the code pages implemented for the VT220 terminals. It supports the Greek language.

Code page 1288

Code page 1288, also known as CP1288, DEC Turkish (8-bit) and TR8DEC, is one of the code pages implemented for the VT220 terminals. It supports the Turkish language.

File Allocation Table

File Allocation Table (FAT) is a computer file system architecture and a family of industry-standard file systems utilizing it. The FAT file system is a continuing standard that carries forward the design of the original, legacy file system and has proved simple and robust. It offers useful performance even in lightweight implementations, but cannot deliver the same performance, reliability and scalability as some modern file systems. It is, however, supported for compatibility reasons by nearly all currently developed operating systems for personal computers and many mobile devices and embedded systems, making it a well-suited format for data exchange between computers and devices of almost any type and age from 1981 up to the present.

Originally designed in 1977 for use on floppy disks, FAT was soon adapted and used almost universally on hard disks throughout the DOS and Windows 9x eras for two decades. As disk drives evolved, the capabilities of the file system have been extended accordingly, resulting in three major file system variants: FAT12, FAT16 and FAT32. The FAT standard has also been expanded in other ways while generally preserving backward compatibility with existing software.
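Which of the three variants a volume uses is determined by its number of clusters. A sketch of the commonly cited thresholds from Microsoft's FAT specification (the function is a hypothetical helper, not a real API):

```python
def fat_variant(cluster_count: int) -> str:
    """FAT type by cluster count, using the thresholds commonly
    cited from Microsoft's FAT specification (assumption: exact
    boundary handling varies between implementations)."""
    if cluster_count < 4085:
        return "FAT12"
    if cluster_count < 65525:
        return "FAT16"
    return "FAT32"

print(fat_variant(2847), fat_variant(40000), fat_variant(2_000_000))
```

The names reflect the width of each FAT entry: 12, 16, or 32 bits per cluster number, which is what caps the cluster count in each variant.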

With the introduction of more powerful computers and operating systems, as well as the development of more complex file systems for them, FAT is no longer the default file system for usage on Microsoft Windows computers.

FAT file systems are still commonly found on floppy disks, flash and other solid-state memory cards and modules (including USB flash drives), as well as many portable and embedded devices. FAT is the standard file system for digital cameras per the DCF specification.

ISO/IEC 8859

ISO/IEC 8859 is a joint ISO and IEC series of standards for 8-bit character encodings. The series of standards consists of numbered parts, such as ISO/IEC 8859-1, ISO/IEC 8859-2, etc. There are 15 parts, excluding the abandoned ISO/IEC 8859-12. The ISO working group maintaining this series of standards has been disbanded.

ISO/IEC 8859 parts 1, 2, 3, and 4 were originally Ecma International standard ECMA-94.

ISO/IEC 8859-1

ISO/IEC 8859-1:1998, Information technology — 8-bit single-byte coded graphic character sets — Part 1: Latin alphabet No. 1, is part of the ISO/IEC 8859 series of ASCII-based standard character encodings, first edition published in 1987. ISO 8859-1 encodes what it refers to as "Latin alphabet no. 1," consisting of 191 characters from the Latin script. This character-encoding scheme is used throughout the Americas, Western Europe, Oceania, and much of Africa. It is also commonly used in most standard romanizations of East-Asian languages. It is the basis for most popular 8-bit character sets and the first block of characters in Unicode.
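Because ISO 8859-1 forms the first block of Unicode, decoding it is a 1:1 mapping from byte value to code point; a quick Python check:

```python
# Every ISO 8859-1 byte value maps to the Unicode code point with
# the same number, so decoding is an identity on code point values:
data = bytes(range(256))
text = data.decode("iso8859_1")
identity = all(ord(ch) == b for ch, b in zip(text, data))
print(identity, repr(text[0xE9]))   # byte 0xE9 is U+00E9, 'é'
```

This identity is also why Latin-1 is often used as a lossless way to round-trip arbitrary bytes through text APIs.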

ISO-8859-1 is (according to the standards at least) the default encoding of documents delivered via HTTP with a MIME type beginning with "text/" (HTML5 changed this to Windows-1252). As of April 2019, 3.3% of all web sites claim to use ISO 8859-1. However, this includes an unknown number of pages actually using Windows-1252 and/or UTF-8, both of which are commonly recognized by browsers despite the character set tag.

It is the default encoding of the values of certain descriptive HTTP headers, defines the repertoire of characters allowed in HTML 3.2 documents (HTML 4.0 uses Unicode), and is specified by many other standards. This and similar sets are often assumed to be the encoding of 8-bit text on Unix and Microsoft Windows if there is no byte order mark (BOM); this is only gradually being changed to UTF-8.

ISO-8859-1 is the IANA preferred name for this standard when supplemented with the C0 and C1 control codes from ISO/IEC 6429. The following other aliases are registered: iso-ir-100, csISOLatin1, latin1, l1, IBM819. Code page 28591 a.k.a. Windows-28591 is used for it in Windows. IBM calls it code page 819 or CP819. Oracle calls it WE8ISO8859P1.

KOI character encodings

KOI (КОИ) is a family of several code pages for the Cyrillic script.

The name stands for Kod Obmena Informatsiey (Russian: Код Обмена Информацией) which means "Code for Information Interchange".

A particular feature of the KOI code pages is that the text remains human-readable when the leftmost (eighth) bit is stripped, should it inadvertently pass through equipment or software that can only handle 7-bit characters. This is due to the characters being placed in a special order (128 code points apart from the Latin letter each sounds most similar to), which, however, does not correspond to alphabetic order in any language written in Cyrillic and necessitates the use of lookup tables to perform sorting.

These encodings are derived from ASCII on the basis of a (nearly phonetic) correspondence between Latin and Cyrillic letters, which was already used in the Russian version of Morse code and in the MTK-2 telegraph code. The first 26 characters from А (0xE1) in KOI8-R are А, Б, Ц, Д, Е, Ф, Г, Х, И, Й, К, Л, М, Н, О, П, Я, Р, С, Т, У, Ж, В, Ь, Ы, З.
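The seventh-bit property is easy to demonstrate with the KOI8-R codec; note the stripped text comes out transliterated with letter case flipped, since Cyrillic capitals sit 128 positions above Latin lower case:

```python
# Strip the eighth bit from KOI8-R-encoded text and read the
# remaining 7-bit bytes as ASCII:
text = "РУС"                                     # Cyrillic capitals
koi = text.encode("koi8_r")                      # b'\xf2\xf5\xf3'
stripped = bytes(b & 0x7F for b in koi).decode("ascii")
print(stripped)                                  # "rus": still readable
```

Mangled but legible output like this is exactly what the KOI designers intended for 7-bit-only transmission paths.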

List of Intel microprocessors

This generational list of Intel processors attempts to present all of Intel's processors from the pioneering 4-bit 4004 (1971) to the present high-end offerings, which include the 64-bit Itanium 2 (2002), Intel Core i9, and Xeon E3 and E5 series processors (2015). Concise technical data is given for each product.

List of maze video games

Maze game is a video game genre description first used by journalists during the 1980s to describe any game in which the entire playing field is a maze. Quick player action is required to escape monsters, outrace an opponent, or navigate the maze within a time limit. After the release of Namco's Pac-Man in 1980, many maze games followed its conventions of completing a level by traversing all paths and a way of temporarily turning the tables on pursuers.

Master System

The Sega Master System (SMS) is a third-generation home video game console manufactured by Sega. It was originally a remodeled export version of the Sega Mark III, the third iteration of the SG-1000 series of consoles, which was released in Japan in 1985 and featured enhanced graphical capabilities over its predecessors. The Master System launched in North America in 1986, followed by Europe in 1987, and Brazil in 1989. A Japanese version of the Master System was also launched in 1987, which has additional features over the Mark III and other regional variants of the console, namely a built-in FM audio chip, a rapid-fire switch and a dedicated port for the 3D glasses. A cost-reduced model known as the Master System II was released in 1990 in North America and Europe.

The original Master System models used both cartridges and a credit card-sized format known as Sega Cards. Accessories for the consoles were also released, such as a light gun and 3D glasses designed to work with a range of specially coded games, which were sold separately or available in certain bundles. The later Master System II redesign removed the card slot, turning it into a strictly cartridge-only system that was, as a result, incompatible with the 3D glasses.

The Master System was released in competition with the Nintendo Entertainment System (NES). It had fewer well-reviewed games than the NES, and a smaller library, due to Nintendo licensing policies requiring platform exclusivity. Despite the Master System's newer hardware, it failed to overturn Nintendo's significant market share advantage in Japan and North America. However, it attained significantly more success in Europe and Brazil.

The Master System is estimated to have sold 13 million units, excluding recent Brazil sales. Retrospective criticism has recognized its role in the development of the Sega Genesis, and a number of well-received games, particularly in PAL regions, but is critical of its limited library in the NTSC regions, which were mainly dominated by Nintendo's NES. As of 2015, the Master System was still in production in Brazil by Tectoy, making it the world's longest-lived console.


A microcontroller (MCU for microcontroller unit, sometimes abbreviated μC) is a small computer on a single integrated circuit. In modern terminology, it is similar to, but less sophisticated than, a system on a chip (SoC); an SoC may include a microcontroller as one of its components. A microcontroller contains one or more CPUs (processor cores) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash or OTP ROM is also often included on chip, as well as a small amount of RAM. Microcontrollers are designed for embedded applications, in contrast to the microprocessors used in personal computers or other general purpose applications consisting of various discrete chips.

Microcontrollers are used in automatically controlled products and devices, such as automobile engine control systems, implantable medical devices, remote controls, office machines, appliances, power tools, toys and other embedded systems. By reducing the size and cost compared to a design that uses a separate microprocessor, memory, and input/output devices, microcontrollers make it economical to digitally control even more devices and processes. Mixed signal microcontrollers are common, integrating analog components needed to control non-digital electronic systems. In the context of the internet of things, microcontrollers are an economical and popular means of data collection, sensing and actuating the physical world as edge devices.

Some microcontrollers may use four-bit words and operate at frequencies as low as 4 kHz, for low power consumption (single-digit milliwatts or microwatts). They generally have the ability to retain functionality while waiting for an event such as a button press or other interrupt; power consumption while sleeping (CPU clock and most peripherals off) may be just nanowatts, making many of them well suited for long lasting battery applications. Other microcontrollers may serve performance-critical roles, where they may need to act more like a digital signal processor (DSP), with higher clock speeds and power consumption.

Sonic the Hedgehog (8-bit video game)

Sonic the Hedgehog is a 1991 side-scrolling platform game and companion to the 16-bit Sega Genesis game of the same name for the 8-bit Game Gear and Master System consoles. Ancient—a studio founded by composer Yuzo Koshiro for the project—developed the game and Sega published it to promote the handheld Game Gear. The 8-bit Sonic is similar in style to its Genesis predecessor, but reduced in complexity to fit the 8-bit systems. It was released for the Game Gear on December 28, 1991, and for the Master System around the same time. It was later released through Sonic game compilations and Nintendo's Virtual Console.

The premise and story of the 8-bit Sonic are identical to that of the Genesis game: as the anthropomorphic hedgehog Sonic, the player must race through levels to rescue the imprisoned animals Doctor Robotnik plots to turn into robots. Gameplay is similar too: Sonic collects rings while avoiding obstacles, but is paced slightly slower as the 8-bit version focuses more on exploration. While some level themes, such as Green Hill Zone, are borrowed from the Genesis game, others are original. It also features a different soundtrack from Koshiro, which consists of rearranged versions of Masato Nakamura's tracks for the Genesis game and new material.

Reviewers acclaimed the 8-bit Sonic for its level variety, visuals, gameplay, and audio. Many believed it was just as good as the original, although some criticism was directed at its low difficulty and short length. Game journalists retrospectively considered it one of the best Game Gear and Master System games. The game spawned several sequels, beginning with Sonic the Hedgehog 2 in 1992. It was also Ancient's first game and the only Sonic game they would develop.

Third generation of video game consoles

In the history of computer and video games, the third generation (sometimes referred to as the 8-bit era) began on July 15, 1983, with the Japanese release of two systems: the Nintendo Family Computer (referred to in Japan in the abbreviated form Famicom, and later known as the Nintendo Entertainment System, or NES, to the rest of the world) and Sega SG-1000. This generation marked the end of the North American video game crash, and a shift in the dominance of home video games from the United States to Japan. Handheld consoles were not a major part of this generation, although the Game & Watch line from Nintendo had started in 1980 and the Milton Bradley Microvision came out in 1979 (both considered second generation hardware).

Some features that distinguish third generation consoles from most second generation consoles include:

D-pad game controllers.

Screen modes with resolutions up to 256×240 or 320×200.

Tile-based playfields with smooth multi-directional hardware scrolling.

Advanced hardware scrolling, including per-pixel scrolling, multi-directional scrolling, diagonal scrolling, and line-scrolling.

25–32 colors on screen, from a palette of 53–256 colors.

64–100 sprites on screen, each with 4–16 colors and 8×8 to 16×16 pixel sizes.

Up to five channels of (primarily square wave) mono PSG audio.

The best-selling console of this generation was the NES/Famicom from Nintendo, followed by the Sega Master System and then the Atari 7800. Although the previous generation of consoles had also used 8-bit processors, it was at the end of this generation that home consoles were first labeled, and marketed, by their "bits". This came into fashion as next-generation 16-bit systems like the Sega Mega Drive/Genesis were marketed to differentiate between console generations. In Japan and North America, this generation was primarily dominated by the Famicom/NES, while the Master System dominated the European and Brazilian markets. The end of the third generation of video games was marked by 8-bit consoles becoming obsolete in terms of their graphics and processing power compared to 16-bit consoles.

Word (computer architecture)

In computing, a word is the natural unit of data used by a particular processor design. A word is a fixed-sized piece of data handled as a unit by the instruction set or the hardware of the processor. The number of bits in a word (the word size, word width, or word length) is an important characteristic of any specific processor design or computer architecture.

The size of a word is reflected in many aspects of a computer's structure and operation; the majority of the registers in a processor are usually word sized and the largest piece of data that can be transferred to and from the working memory in a single operation is a word in many (not all) architectures. The largest possible address size, used to designate a location in memory, is typically a hardware word (here, "hardware word" means the full-sized natural word of the processor, as opposed to any other definition used).

Modern processors, including those in embedded systems, usually have a word size of 8, 16, 24, 32, or 64 bits; those in modern general-purpose computers in particular usually use 32 or 64 bits. Special-purpose digital processors, such as DSPs for instance, may use other sizes, and many other sizes have been used historically, including 9, 12, 18, 24, 26, 36, 39, 40, 48, and 60 bits. Several of the earliest computers (and a few modern as well) used binary-coded decimal rather than plain binary, typically having a word size of 10 or 12 decimal digits, and some early decimal computers had no fixed word length at all.
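As a rough illustration, the native pointer width of the machine running an interpreter can be probed; this reports the build's pointer size, which on mainstream desktop platforms coincides with the natural word size (an assumption, not a guarantee):

```python
import struct

# Width of a C pointer in this Python build, in bits; on common
# desktop platforms this matches the processor's natural word size.
word_bits = struct.calcsize("P") * 8
print(word_bits)        # e.g. 64 on a 64-bit build, 32 on a 32-bit build
```

Special-purpose processors such as DSPs, where odd word sizes survive, are exactly where this kind of probe stops being reliable.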

The size of a word can sometimes differ from the expected due to backward compatibility with earlier computers. If multiple compatible variations or a family of processors share a common architecture and instruction set but differ in their word sizes, their documentation and software may become notationally complex to accommodate the difference (see Size families below).


This page is based on a Wikipedia article written by authors (here).
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.