A cryptographic hash function is a special class of hash function that has certain properties which make it suitable for use in cryptography. It is a mathematical algorithm that maps data of arbitrary size to a bit string of a fixed size (a hash) and is designed to be a one-way function, that is, a function which is infeasible to invert.^{[1]} The only way to recreate the input data from an ideal cryptographic hash function's output is to attempt a brute-force search of possible inputs to see if they produce a match, or to use a rainbow table of matched hashes. Bruce Schneier has called one-way hash functions "the workhorses of modern cryptography".^{[2]} The input data is often called the message, and the output (the hash value or hash) is often called the message digest or simply the digest.
The ideal cryptographic hash function has five main properties:
it is deterministic, so the same message always results in the same hash;
it is quick to compute the hash value for any given message;
it is infeasible to generate a message from its hash value, except by trying all possible messages;
a small change to a message changes the hash value so extensively that the new hash value appears uncorrelated with the old one;
it is infeasible to find two different messages with the same hash value.
Cryptographic hash functions have many information-security applications, notably in digital signatures, message authentication codes (MACs), and other forms of authentication. They can also be used as ordinary hash functions, to index data in hash tables, for fingerprinting, to detect duplicate data or uniquely identify files, and as checksums to detect accidental data corruption. Indeed, in information-security contexts, cryptographic hash values are sometimes called (digital) fingerprints, checksums, or just hash values, even though all these terms stand for more general functions with rather different properties and purposes.
Most cryptographic hash functions are designed to take a string of any length as input and produce a fixed-length hash value.
A cryptographic hash function must be able to withstand all known types of cryptanalytic attack. In theoretical cryptography, the security level of a cryptographic hash function has been defined using the following properties:
preimage resistance: given a hash value h, it should be difficult to find any message m such that h = hash(m);
second preimage resistance: given an input m1, it should be difficult to find a different input m2 such that hash(m1) = hash(m2);
collision resistance: it should be difficult to find two different messages m1 and m2 such that hash(m1) = hash(m2).
Collision resistance implies second preimage resistance, but does not imply preimage resistance.^{[4]} The weaker assumption is always preferred in theoretical cryptography, but in practice, a hash function which is only second preimage resistant is considered insecure and is therefore not recommended for real applications.
Informally, these properties mean that a malicious adversary cannot replace or modify the input data without changing its digest. Thus, if two strings have the same digest, one can be very confident that they are identical. Second preimage resistance prevents an attacker from crafting a document with the same hash as a document the attacker cannot control. Collision resistance prevents an attacker from creating two distinct documents with the same hash.
A function meeting these criteria may still have undesirable properties. Currently popular cryptographic hash functions are vulnerable to length-extension attacks: given hash(m) and len(m) but not m, by choosing a suitable m′ an attacker can calculate hash(m ∥ m′), where ∥ denotes concatenation.^{[5]} This property can be used to break naive authentication schemes based on hash functions. The HMAC construction works around these problems.
Collision resistance is insufficient for many practical uses. In addition to collision resistance, it should be impossible for an adversary to find two messages with substantially similar digests, or to infer any useful information about the data given only its digest. In particular, a hash function should behave as much as possible like a random function (often called a random oracle in proofs of security) while still being deterministic and efficiently computable. This rules out functions like the SWIFFT function, which can be rigorously proven to be collision-resistant assuming that certain problems on ideal lattices are computationally difficult, but which, as a linear function, does not satisfy these additional properties.^{[6]}
Checksum algorithms, such as CRC32 and other cyclic redundancy checks, are designed to meet much weaker requirements, and are generally unsuitable as cryptographic hash functions. For example, a CRC was used for message integrity in the WEP encryption standard, but an attack was readily discovered which exploited the linearity of the checksum.
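The linearity exploited in the WEP attack can be demonstrated directly. The sketch below uses Python's `zlib` (the message contents are arbitrary placeholders) to show the affine identity that CRC-32 satisfies over equal-length inputs, and which no secure cryptographic hash would exhibit:

```python
import zlib

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

a = b"original packet "        # arbitrary 16-byte example message
b = b"attacker tweak!!"        # another message of the same length
zeros = b"\x00" * len(a)

# CRC-32 is affine over GF(2): for equal-length messages,
# crc(a XOR b) == crc(a) XOR crc(b) XOR crc(0...0).
lhs = zlib.crc32(xor_bytes(a, b))
rhs = zlib.crc32(a) ^ zlib.crc32(b) ^ zlib.crc32(zeros)
assert lhs == rhs
```

Because of this structure, an attacker who flips known bits in a CRC-protected message can compute exactly how the checksum changes, without knowing the rest of the message.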
In cryptographic practice, "difficult" generally means "almost certainly beyond the reach of any adversary who must be prevented from breaking the system for as long as the security of the system is deemed important". The meaning of the term is therefore somewhat dependent on the application, since the effort that a malicious agent may put into the task is usually proportional to their expected gain. However, since the needed effort usually grows exponentially with the digest length, even a thousand-fold advantage in processing power can be neutralized by adding a few dozen bits to the latter.
For messages selected from a limited set of messages, for example passwords or other short messages, it can be feasible to invert a hash by trying all possible messages in the set. Because cryptographic hash functions are typically designed to be computed quickly, special key derivation functions that require greater computing resources have been developed to make such brute-force attacks more difficult.
In some theoretical analyses "difficult" has a specific mathematical meaning, such as "not solvable in asymptotic polynomial time". Such interpretations of difficulty are important in the study of provably secure cryptographic hash functions but do not usually have a strong connection to practical security. For example, an exponential time algorithm can sometimes still be fast enough to make a feasible attack. Conversely, a polynomial time algorithm (e.g., one that requires n^{20} steps for n-digit keys) may be too slow for any practical use.
An illustration of the potential use of a cryptographic hash is as follows: Alice poses a tough math problem to Bob and claims she has solved it. Bob would like to try it himself, but would first like to be sure that Alice is not bluffing. Therefore, Alice writes down her solution, computes its hash and tells Bob the hash value (whilst keeping the solution secret). Then, when Bob comes up with the solution himself a few days later, Alice can prove that she had the solution earlier by revealing it and having Bob hash it and check that it matches the hash value given to him before. (This is an example of a simple commitment scheme; in actual practice, Alice and Bob will often be computer programs, and the secret would be something less easily spoofed than a claimed puzzle solution).
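The exchange above can be sketched as a minimal hash commitment. This is illustrative only: the solution text is made up, and the random nonce is an addition (it prevents Bob from simply guessing low-entropy solutions and hashing them):

```python
import hashlib
import secrets

# Commit phase: Alice hashes her solution together with a random nonce.
solution = b"x = 42"                 # hypothetical puzzle solution
nonce = secrets.token_bytes(16)      # blinds the commitment against guessing
commitment = hashlib.sha256(nonce + solution).hexdigest()
# Alice sends `commitment` to Bob; `solution` and `nonce` stay secret.

# Reveal phase: Alice discloses both values; Bob recomputes and compares.
assert hashlib.sha256(nonce + solution).hexdigest() == commitment
```

Hiding relies on the hash being preimage-resistant; binding relies on it being collision-resistant, since a collision would let Alice open the commitment to two different solutions.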
An important application of secure hashes is verification of message integrity. Comparing message digests (hash digests over the message) calculated before, and after, transmission can determine whether any changes have been made to the message or file.
MD5, SHA-1, or SHA-2 hash digests are sometimes published on websites or forums to allow verification of integrity for downloaded files,^{[7]} including files retrieved using file sharing such as mirroring. This practice establishes a chain of trust as long as the hashes are posted on a site authenticated by HTTPS. Using a cryptographic hash and a chain of trust prevents malicious changes to the file from going undetected. Other error-detecting codes, such as cyclic redundancy checks, only protect against non-malicious alterations of the file.
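Verifying such a published digest is a one-liner in most environments. A minimal sketch with Python's `hashlib` (the file name and published digest shown in the comments are placeholders) reads the file in chunks so that large downloads need not fit in memory:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (hypothetical values):
#   published = "...digest copied from the HTTPS download page..."
#   assert sha256_of_file("archive.tar.gz") == published
```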
Almost all digital signature schemes require a cryptographic hash to be calculated over the message. This allows the signature calculation to be performed on the relatively small, statically sized hash digest. The message is considered authentic if the signature verification succeeds given the signature and recalculated hash digest over the message. So the message integrity property of the cryptographic hash is used to create secure and efficient digital signature schemes.
Password verification commonly relies on cryptographic hashes. Storing all user passwords as cleartext can result in a massive security breach if the password file is compromised. One way to reduce this danger is to only store the hash digest of each password. To authenticate a user, the password presented by the user is hashed and compared with the stored hash. A password reset method is required when password hashing is performed; original passwords cannot be recalculated from the stored hash value.
Standard cryptographic hash functions are designed to be computed quickly, and, as a result, it is possible to try guessed passwords at high rates. Common graphics processing units can try billions of possible passwords each second. Password hash functions that perform key stretching, such as PBKDF2, scrypt or Argon2, commonly use repeated invocations of a cryptographic hash to increase the time (and in some cases computer memory) required to perform brute-force attacks on stored password hash digests. A password hash requires the use of a large random, non-secret salt value which can be stored with the password hash. The salt randomizes the output of the password hash, making it impossible for an adversary to store tables of passwords and precomputed hash values to which the password hash digest can be compared.
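The salt-plus-stretching pattern can be sketched with PBKDF2 from Python's standard library. The iteration count here is an illustrative assumption; deployments should pick it to match their hardware budget:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Return (salt, digest). The salt is random and non-secret; store both."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 600_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(candidate, digest)
```

Each guess now costs the attacker hundreds of thousands of hash invocations, and the per-user salt defeats precomputed tables.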
The output of a password hash function can also be used as a cryptographic key. Password hashes are therefore also known as password-based key derivation functions (PBKDFs).
A proof-of-work system (or protocol, or function) is an economic measure to deter denial-of-service attacks and other service abuses such as spam on a network by requiring some work from the service requester, usually meaning processing time by a computer. A key feature of these schemes is their asymmetry: the work must be moderately hard (but feasible) on the requester side but easy to check for the service provider. One popular system – used in Bitcoin mining and Hashcash – uses partial hash inversions to prove that work was done, to unlock a mining reward in Bitcoin and as a goodwill token to send an email in Hashcash. The sender is required to find a message whose hash value begins with a number of zero bits. The average work that the sender needs to perform in order to find a valid message is exponential in the number of zero bits required in the hash value, while the recipient can verify the validity of the message by executing a single hash function. For instance, in Hashcash, a sender is asked to generate a header whose 160-bit SHA-1 hash value has the first 20 bits as zeros. The sender will on average have to try 2^{19} times to find a valid header.
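The partial hash inversion can be sketched as follows. This is a toy illustration, not Hashcash itself: it uses SHA-256 instead of SHA-1, a made-up header, and only 16 zero bits so that it runs in a fraction of a second:

```python
import hashlib
from itertools import count

def solve_pow(header: bytes, zero_bits: int) -> int:
    """Find a nonce so SHA-256(header || nonce) starts with `zero_bits` zero bits."""
    target = 1 << (256 - zero_bits)
    for nonce in count():
        digest = hashlib.sha256(header + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_pow(header: bytes, nonce: int, zero_bits: int) -> bool:
    digest = hashlib.sha256(header + str(nonce).encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - zero_bits))

nonce = solve_pow(b"mail@example.com:", 16)   # ~2^16 hashes for the requester
assert verify_pow(b"mail@example.com:", nonce, 16)  # one hash for the verifier
```

The asymmetry is visible in the code: `solve_pow` loops over many candidate nonces, while `verify_pow` performs a single hash.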
A message digest can also serve as a means of reliably identifying a file; several source code management systems, including Git, Mercurial and Monotone, use the sha1sum of various types of content (file content, directory trees, ancestry information, etc.) to uniquely identify them. Hashes are used to identify files on peer-to-peer file-sharing networks. For example, in an ed2k link, an MD4-variant hash is combined with the file size, providing sufficient information for locating file sources, downloading the file and verifying its contents. Magnet links are another example. Such file hashes are often the top hash of a hash list or a hash tree, which allows for additional benefits.
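Git's content identifiers can be reproduced directly: a blob's ID is the SHA-1 of a small header (`blob`, the content length, a NUL byte) followed by the content itself. A minimal sketch:

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Reproduce Git's object ID for a blob: SHA-1 over 'blob <size>\\0' + bytes."""
    return hashlib.sha1(b"blob %d\x00" % len(content) + content).hexdigest()

# The well-known ID of an empty file, identical in every Git repository:
assert git_blob_sha1(b"") == "e69de29bb2d1d6434b8b29ae775ad8c2e48c5391"
```

Because the ID depends only on content, identical files stored anywhere in any repository deduplicate to a single object.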
One of the main applications of a hash function is to allow the fast lookup of data in a hash table. Being hash functions of a particular kind, cryptographic hash functions lend themselves well to this application too.
However, compared with standard hash functions, cryptographic hash functions tend to be much more expensive computationally. For this reason, they tend to be used in contexts where it is necessary for users to protect themselves against the possibility of forgery (the creation of data with the same digest as the expected data) by potentially malicious participants.
There are several methods to use a block cipher to build a cryptographic hash function, specifically a one-way compression function.
The methods resemble the block cipher modes of operation usually used for encryption. Many well-known hash functions, including MD4, MD5, SHA-1 and SHA-2, are built from block-cipher-like components designed for the purpose, with feedback to ensure that the resulting function is not invertible. SHA-3 finalists included functions with block-cipher-like components (e.g., Skein, BLAKE), though the function finally selected, Keccak, was built on a cryptographic sponge instead.
A standard block cipher such as AES can be used in place of these custom block ciphers; that might be useful when an embedded system needs to implement both encryption and hashing with minimal code size or hardware area. However, that approach can have costs in efficiency and security. The ciphers in hash functions are built for hashing: they use large keys and blocks, can efficiently change keys every block, and have been designed and vetted for resistance to relatedkey attacks. Generalpurpose ciphers tend to have different design goals. In particular, AES has key and block sizes that make it nontrivial to use to generate long hash values; AES encryption becomes less efficient when the key changes each block; and relatedkey attacks make it potentially less secure for use in a hash function than for encryption.
A hash function must be able to process an arbitrary-length message into a fixed-length output. This can be achieved by breaking the input up into a series of equal-sized blocks, and operating on them in sequence using a one-way compression function. The compression function can either be specially designed for hashing or be built from a block cipher. A hash function built with the Merkle–Damgård construction is as resistant to collisions as its compression function; any collision for the full hash function can be traced back to a collision in the compression function.
The last block processed should also be unambiguously length-padded; this is crucial to the security of this construction. This construction is called the Merkle–Damgård construction. Most common classical hash functions, including SHA-1 and MD5, take this form.
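The block-splitting, chaining and length padding described above can be sketched as a toy Merkle–Damgård construction. This is for illustration only, not a usable hash design: the compression function here is just SHA-256 applied to state-plus-block, and the block/state sizes are arbitrary choices:

```python
import hashlib

BLOCK = 64   # block size in bytes (illustrative choice)
STATE = 32   # internal state size in bytes

def compress(state: bytes, block: bytes) -> bytes:
    # Stand-in one-way compression function; a real design defines its own.
    return hashlib.sha256(state + block).digest()

def md_hash(message: bytes, iv: bytes = b"\x00" * STATE) -> bytes:
    # Unambiguous length padding ("MD strengthening"): append 0x80, then
    # zeros, then the message length in bits as a 64-bit big-endian integer.
    bit_len = 8 * len(message)
    padded = message + b"\x80"
    padded += b"\x00" * (-(len(padded) + 8) % BLOCK)
    padded += bit_len.to_bytes(8, "big")
    # Chain the compression function over equal-sized blocks.
    state = iv
    for i in range(0, len(padded), BLOCK):
        state = compress(state, padded[i:i + BLOCK])
    return state
```

Note how the final state is emitted directly as the digest; this is exactly what makes narrow-pipe designs of this shape susceptible to length extension.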
A straightforward application of the Merkle–Damgård construction, where the size of the hash output is equal to the internal state size (between each compression step), results in a narrow-pipe hash design. This design causes many inherent flaws, including length-extension, multicollisions,^{[8]} long-message attacks,^{[9]} generate-and-paste attacks, and also cannot be parallelized. As a result, modern hash functions are built on wide-pipe constructions that have a larger internal state size, which range from tweaks of the Merkle–Damgård construction^{[8]} to new constructions such as the sponge construction and the HAIFA construction.^{[10]} None of the entrants in the NIST hash function competition used a classical Merkle–Damgård construction.^{[11]}
Meanwhile, truncating the output of a longer hash, as is done in SHA-512/256, also defeats many of these attacks.^{[12]}
Hash functions can be used to build other cryptographic primitives. For these other primitives to be cryptographically secure, care must be taken to build them correctly.
Message authentication codes (MACs) (also called keyed hash functions) are often built from hash functions. HMAC is such a MAC.
Just as block ciphers can be used to build hash functions, hash functions can be used to build block ciphers. Luby–Rackoff constructions using hash functions can be provably secure if the underlying hash function is secure. Also, many hash functions (including SHA-1 and SHA-2) are built by using a special-purpose block cipher in a Davies–Meyer or other construction. That cipher can also be used in a conventional mode of operation, without the same security guarantees. See SHACAL, BEAR and LION.
Pseudorandom number generators (PRNGs) can be built using hash functions. This is done by combining a (secret) random seed with a counter and hashing it.
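The seed-plus-counter idea can be sketched in a few lines. This is a toy illustration of the principle only; real systems should use a vetted generator (e.g., the DRBGs of NIST SP 800-90A or the operating system's CSPRNG):

```python
import hashlib

class HashDRBG:
    """Toy PRNG: hash a secret seed concatenated with an incrementing counter."""

    def __init__(self, seed: bytes):
        self.seed = seed
        self.counter = 0

    def next_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            # Each counter value yields a fresh 32-byte block of output.
            block = hashlib.sha256(
                self.seed + self.counter.to_bytes(8, "big")).digest()
            out += block
            self.counter += 1
        return out[:n]
```

As long as the seed stays secret and the hash behaves like a random function, predicting future output requires inverting the hash.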
Some hash functions, such as Skein, Keccak, and RadioGatún, output an arbitrarily long stream and can be used as a stream cipher, and stream ciphers can also be built from fixed-length digest hash functions. Often this is done by first building a cryptographically secure pseudorandom number generator and then using its stream of random bytes as keystream. SEAL is a stream cipher that uses SHA-1 to generate internal tables, which are then used in a keystream generator more or less unrelated to the hash algorithm. SEAL is not guaranteed to be as strong (or weak) as SHA-1. Similarly, the key expansion of the HC-128 and HC-256 stream ciphers makes heavy use of the SHA-256 hash function.
Concatenating outputs from multiple hash functions provides collision resistance as good as the strongest of the algorithms included in the concatenated result. For example, older versions of Transport Layer Security (TLS) and Secure Sockets Layer (SSL) used concatenated MD5 and SHA-1 sums.^{[13]}^{[14]} This ensures that a method to find collisions in one of the hash functions does not defeat data protected by both hash functions.
For Merkle–Damgård construction hash functions, the concatenated function is as collision-resistant as its strongest component, but not more collision-resistant. Antoine Joux observed that 2-collisions lead to n-collisions: if it is feasible for an attacker to find two messages with the same MD5 hash, the attacker can find as many messages as the attacker desires with identical MD5 hashes with no greater difficulty.^{[15]} Among the n messages with the same MD5 hash, there is likely to be a collision in SHA-1. The additional work needed to find the SHA-1 collision (beyond the exponential birthday search) requires only polynomial time.^{[16]}^{[17]}
There are many cryptographic hash algorithms; this section lists a few algorithms that are referenced relatively often. A more extensive list can be found on the page containing a comparison of cryptographic hash functions.
MD5 was designed by Ronald Rivest in 1991 to replace an earlier hash function MD4, and was specified in 1992 as RFC 1321. Collisions against MD5 can be calculated within seconds which makes the algorithm unsuitable for most use cases where a cryptographic hash is required. MD5 produces a digest of 128 bits (16 bytes).
SHA-1 was developed as part of the U.S. Government's Capstone project. The original specification of the algorithm, now commonly called SHA-0, was published in 1993 under the title Secure Hash Standard, FIPS PUB 180, by U.S. government standards agency NIST (National Institute of Standards and Technology). It was withdrawn by the NSA shortly after publication and was superseded by the revised version, published in 1995 in FIPS PUB 180-1 and commonly designated SHA-1. Collisions against the full SHA-1 algorithm can be produced using the SHAttered attack, and the hash function should be considered broken. SHA-1 produces a hash digest of 160 bits (20 bytes).
Documents may refer to SHA-1 as just "SHA", even though this may conflict with the other Secure Hash Algorithms such as SHA-0, SHA-2 and SHA-3.
RIPEMD (RACE Integrity Primitives Evaluation Message Digest) is a family of cryptographic hash functions developed in Leuven, Belgium, by Hans Dobbertin, Antoon Bosselaers and Bart Preneel at the COSIC research group at the Katholieke Universiteit Leuven, and first published in 1996. RIPEMD was based upon the design principles used in MD4, and is similar in performance to the more popular SHA-1. RIPEMD-160 has, however, not been broken. As the name implies, RIPEMD-160 produces a hash digest of 160 bits (20 bytes).
In computer science and cryptography, Whirlpool is a cryptographic hash function. It was designed by Vincent Rijmen and Paulo S. L. M. Barreto, who first described it in 2000. Whirlpool is based on a substantially modified version of the Advanced Encryption Standard (AES). Whirlpool produces a hash digest of 512 bits (64 bytes).
SHA-2 (Secure Hash Algorithm 2) is a set of cryptographic hash functions designed by the United States National Security Agency (NSA), first published in 2001.^{[3]} They are built using the Merkle–Damgård structure, from a one-way compression function itself built using the Davies–Meyer structure from a (classified) specialized block cipher.
SHA-2 consists essentially of two hash algorithms: SHA-256 and SHA-512. SHA-224 is a variant of SHA-256 with different starting values and truncated output. SHA-384 and the lesser-known SHA-512/224 and SHA-512/256 are all variants of SHA-512. SHA-512 is more secure than SHA-256 and is commonly faster than SHA-256 on 64-bit machines such as AMD64.
The output size in bits is given by the extension to the "SHA" name, so SHA-224 has an output size of 224 bits (28 bytes), SHA-256 produces 32 bytes, SHA-384 produces 48 bytes and SHA-512 produces 64 bytes.
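This naming convention is easy to confirm with any hashlib-style library; a quick check in Python:

```python
import hashlib

# The suffix of each SHA-2 name gives its output size in bits (8 bits/byte).
for name, nbytes in [("sha224", 28), ("sha256", 32),
                     ("sha384", 48), ("sha512", 64)]:
    assert hashlib.new(name).digest_size == nbytes
```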
SHA-3 (Secure Hash Algorithm 3) was released by NIST on August 5, 2015. SHA-3 is a subset of the broader cryptographic primitive family Keccak. The Keccak algorithm is the work of Guido Bertoni, Joan Daemen, Michael Peeters, and Gilles Van Assche. Keccak is based on a sponge construction, which can also be used to build other cryptographic primitives such as a stream cipher. SHA-3 provides the same output sizes as SHA-2: 224, 256, 384 and 512 bits.
Configurable output sizes can also be obtained using the SHAKE128 and SHAKE256 functions. Here the 128 and 256 extensions to the name imply the security strength of the function rather than the output size in bits.
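The extendable-output behavior is visible in Python's hashlib, where the SHAKE digest methods take the desired output length as a parameter. A short sketch (the input bytes are arbitrary):

```python
import hashlib

# "128" names the security level, not the output size; the caller picks
# the output length at digest time.
xof = hashlib.shake_128(b"some input")
assert len(xof.digest(16)) == 16     # 128-bit output
assert len(xof.digest(64)) == 64     # 512-bit output from the same function
# As an extendable-output function, the shorter output is a prefix of the longer:
assert xof.digest(64)[:16] == xof.digest(16)
```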
An improved version of BLAKE called BLAKE2 was announced on December 21, 2012. It was created by Jean-Philippe Aumasson, Samuel Neves, Zooko Wilcox-O'Hearn, and Christian Winnerlein with the goal of replacing the widely used but broken MD5 and SHA-1 algorithms. When run on 64-bit x64 and ARM architectures, BLAKE2b is faster than SHA-3, SHA-2, SHA-1, and MD5. Although neither BLAKE nor BLAKE2 has been standardized as SHA-3, they have been used in many protocols, including the Argon2 password hash, for the high efficiency they offer on modern CPUs. As BLAKE was a candidate for SHA-3, BLAKE and BLAKE2 both offer the same output sizes as SHA-3, including a configurable output size.
There is a long list of cryptographic hash functions, but many have been found to be vulnerable and should not be used. For instance, NIST selected 51 hash functions^{[18]} as candidates for round 1 of the SHA-3 hash competition, of which 10 were considered broken and 16 showed significant weaknesses and therefore did not make it to the next round; more information can be found on the main article about the NIST hash function competition.
Even if a hash function has never been broken, a successful attack against a weakened variant may undermine the experts' confidence. For instance, in August 2004 collisions were found in several then-popular hash functions, including MD5.^{[19]} These weaknesses called into question the security of stronger algorithms derived from the weak hash functions, in particular SHA-1 (a strengthened version of SHA-0), RIPEMD-128, and RIPEMD-160 (both strengthened versions of RIPEMD).
On 12 August 2004, Joux, Carribault, Lemuet, and Jalby announced a collision for the full SHA-0 algorithm. Joux et al. accomplished this using a generalization of the Chabaud and Joux attack. They found that the collision had complexity 2^{51} and took about 80,000 CPU hours on a supercomputer with 256 Itanium 2 processors, equivalent to 13 days of full-time use of the supercomputer.
In February 2005, an attack on SHA-1 was reported that would find collisions in about 2^{69} hashing operations, rather than the 2^{80} expected for a 160-bit hash function. In August 2005, another attack on SHA-1 was reported that would find collisions in 2^{63} operations. Other theoretical weaknesses of SHA-1 have been known,^{[20]}^{[21]} and in February 2017 Google announced a collision in SHA-1.^{[22]} Security researchers recommend that new applications avoid these problems by using later members of the SHA family, such as SHA-2, or by using techniques such as randomized hashing^{[23]}^{[1]} that do not require collision resistance.
A successful, practical attack broke MD5 used within certificates for Transport Layer Security in 2008.^{[24]}
Many cryptographic hashes are based on the Merkle–Damgård construction. All cryptographic hashes that directly use the full output of a Merkle–Damgård construction are vulnerable to length extension attacks. This makes the MD5, SHA-1, RIPEMD-160, Whirlpool and the SHA-256 / SHA-512 hash algorithms all vulnerable to this specific attack. SHA-3, BLAKE2 and the truncated SHA-2 variants are not vulnerable to this type of attack.

The BEAR and LION block ciphers were invented by Ross Anderson and Eli Biham by combining a stream cipher and a cryptographic hash function. The algorithms use a very large variable block size, on the order of 2^{13} to 2^{23} bits or more. Both are 3-round generalized (alternating) Feistel ciphers, using the hash function and the stream cipher as round functions. BEAR uses the hash function twice with independent keys, and the stream cipher once. LION uses the stream cipher twice and the hash function once. The inventors proved that an attack on either BEAR or LION that recovers the key would break both the stream cipher and the hash.
HAS-160
HAS-160 is a cryptographic hash function designed for use with the Korean KCDSA digital signature algorithm. It is derived from SHA-1, with assorted changes intended to increase its security. It produces a 160-bit output.
HAS-160 is used in the same way as SHA-1. First it divides the input into blocks of 512 bits each and pads the final block. A digest function updates the intermediate hash value by processing the input blocks in turn.
The message digest algorithm consists of 80 rounds.
HAVAL
HAVAL is a cryptographic hash function. Unlike MD5, but like most modern cryptographic hash functions, HAVAL can produce hashes of different lengths – 128 bits, 160 bits, 192 bits, 224 bits, and 256 bits. HAVAL also allows users to specify the number of rounds (3, 4, or 5) to be used to generate the hash. HAVAL was invented by Yuliang Zheng, Josef Pieprzyk, and Jennifer Seberry in 1992. It was broken in 2004.
HMAC
In cryptography, an HMAC (sometimes expanded as either keyed-hash message authentication code or hash-based message authentication code) is a specific type of message authentication code (MAC) involving a cryptographic hash function and a secret cryptographic key. It may be used to simultaneously verify both the data integrity and the authentication of a message, as with any MAC. Any cryptographic hash function, such as SHA-256 or SHA-3, may be used in the calculation of an HMAC; the resulting MAC algorithm is termed HMAC-X, where X is the hash function used (e.g. HMAC-SHA256 or HMAC-SHA3). The cryptographic strength of the HMAC depends upon the cryptographic strength of the underlying hash function, the size of its hash output, and the size and quality of the key.
HMAC uses two passes of hash computation. The secret key is first used to derive two keys – inner and outer. The first pass of the algorithm produces an internal hash derived from the message and the inner key. The second pass produces the final HMAC code derived from the inner hash result and the outer key. Thus the algorithm provides better immunity against length extension attacks.
An iterative hash function breaks up a message into blocks of a fixed size and iterates over them with a compression function. For example, SHA-256 operates on 512-bit blocks. The size of the output of HMAC is the same as that of the underlying hash function (e.g., 256 and 512 bits in the case of SHA-256 and SHA3-512, respectively), although it can be truncated if desired.
HMAC does not encrypt the message. Instead, the message (encrypted or not) must be sent alongside the HMAC hash. Parties with the secret key will hash the message again themselves, and if it is authentic, the received and computed hashes will match.
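This verification flow can be sketched with Python's `hmac` module (key and message here are placeholders):

```python
import hashlib
import hmac

key = b"shared secret key"                  # known to sender and receiver
message = b"Amount: 100; To: Bob"           # sent in the clear alongside the tag

# Sender computes the tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes and compares in constant time.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```

Using `compare_digest` rather than `==` avoids leaking, via timing, how many leading characters of a forged tag were correct.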
The definition and analysis of the HMAC construction was first published in 1996 in a paper by Mihir Bellare, Ran Canetti, and Hugo Krawczyk, and they also wrote RFC 2104 in 1997. The 1996 paper also defined a variant called NMAC. FIPS PUB 198 generalizes and standardizes the use of HMACs. HMAC is used within the IPsec and TLS protocols and for JSON Web Tokens.
Hash chain
A hash chain is the successive application of a cryptographic hash function to a piece of data. In computer security, a hash chain is a method to produce many one-time keys from a single key or password. For non-repudiation, a hash function can be applied successively to additional pieces of data in order to record the chronology of the data's existence.
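A minimal sketch of the one-time-key idea (the seed value is a placeholder; Lamport's scheme is the classical example of this usage):

```python
import hashlib

def hash_chain(seed: bytes, n: int) -> list[bytes]:
    """Build chain[0..n] with chain[i] = SHA-256(chain[i-1])."""
    chain = [seed]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

# One-time passwords: the server stores only chain[n]; the user reveals
# chain[n-1], then chain[n-2], and so on. Each revealed value hashes to the
# previously accepted one, but cannot be derived from it (preimage resistance).
chain = hash_chain(b"initial secret", 10)
assert hashlib.sha256(chain[9]).digest() == chain[10]
```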
JH (hash function)
JH is a cryptographic hash function submitted to the NIST hash function competition by Hongjun Wu. Though chosen as one of the five finalists of the competition, JH ultimately lost to NIST hash candidate Keccak. JH has a 1024-bit state, and works on 512-bit input blocks. Processing an input block consists of three steps:
XOR the input block into the left half of the state.
Apply a 42-round unkeyed permutation (encryption function) to the state. This consists of 42 repetitions of:
Break the input into 256 4-bit blocks, and map each through one of two 4-bit S-boxes, the choice being made by a 256-bit round-dependent key schedule. Equivalently, combine each input block with a key bit, and map the result through a 5→4-bit S-box.
Mix adjacent 4-bit blocks using a maximum distance separable code over GF(2^4).
Permute 4-bit blocks so that they will be adjacent to different blocks in following rounds.
XOR the input block into the right half of the state.
The resulting digest is the first 224, 256, 384 or 512 bits from the 1024-bit final value.
It is well suited to a bit slicing implementation using the SSE2 instruction set, giving speeds of 16.8 cycles per byte.
Kupyna
Kupyna is a cryptographic hash function defined in the Ukrainian national standard DSTU 7564:2014. It was created to replace an obsolete GOST hash function defined in the old standard GOST 34.311-95, similar to the Streebog hash function standardized in Russia.
In addition to the hash function, the standard also describes message authentication code generation using Kupyna with digest sizes 256, 384 and 512 bits.
MD2 (hash function)
The MD2 Message-Digest Algorithm is a cryptographic hash function developed by Ronald Rivest in 1989. The algorithm is optimized for 8-bit computers. MD2 is specified in RFC 1319. Although MD2 is no longer considered secure, even as of 2014, it remains in use in public key infrastructures as part of certificates generated with MD2 and RSA. The "MD" in MD2 stands for "Message Digest".
MD4
The MD4 Message-Digest Algorithm is a cryptographic hash function developed by Ronald Rivest in 1990. The digest length is 128 bits. The algorithm has influenced later designs, such as the MD5, SHA-1 and RIPEMD algorithms. The initialism "MD" stands for "Message Digest".
The security of MD4 has been severely compromised. The first full collision attack against MD4 was published in 1995 and several newer attacks have been published since then. As of 2007, an attack can generate collisions in less than 2 MD4 hash operations. A theoretical preimage attack also exists.
A variant of MD4 is used in the ed2k URI scheme to provide a unique identifier for a file in the popular eDonkey2000 / eMule P2P networks. MD4 was also used by the rsync protocol (prior to version 3.0.0).
MD4 is used to compute NTLM password-derived key digests on Microsoft Windows NT, XP, Vista, 7, 8, and 10.
MD6
The MD6 Message-Digest Algorithm is a cryptographic hash function. It uses a Merkle-tree-like structure to allow for immense parallel computation of hashes for very long inputs. The authors claim a performance of 28 cycles per byte for MD6-256 on an Intel Core 2 Duo, and provable resistance against differential cryptanalysis. The source code of the reference implementation was released under the MIT license. Speeds in excess of 1 GB/s have been reported to be possible for long messages on a 16-core CPU architecture.
In December 2008, Douglas Held of Fortify Software discovered a buffer overflow in the original MD6 hash algorithm's reference implementation. This error was later made public by Ron Rivest on 19 February 2009, with a release of a corrected reference implementation in advance of the Fortify report.
MD6 was submitted to the NIST SHA-3 competition. However, on July 1, 2009, Rivest posted a comment at NIST that MD6 was not yet ready to be a candidate for SHA-3 because of speed issues, a "gap in the proof that the submitted version of MD6 is resistant to differential attacks", and an inability to supply such a proof for a faster reduced-round version, although Rivest also stated at the MD6 website that it had not been withdrawn formally. MD6 did not advance to the second round of the SHA-3 competition. In September 2011, a paper presenting an improved proof that MD6 and faster reduced-round versions are resistant to differential attacks was posted to the MD6 website.
N-Hash
In cryptography, N-Hash is a cryptographic hash function based on the FEAL round function, and is now considered insecure. It was proposed in 1990 by Miyaguchi et al.; weaknesses were published the following year.
N-Hash has a 128-bit hash size. A message is divided into 128-bit blocks, and each block is combined with the hash value computed so far using the g compression function. g contains eight rounds, each of which uses an F function similar to the one used by FEAL.
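The block-by-block structure described above can be sketched generically. The placeholder `g` below is an assumption for illustration only: N-Hash's real compression function is built from eight rounds of a FEAL-like F function, not from SHA-256, and its padding rule is not shown in this text.

```python
import hashlib

BLOCK = 16  # 128-bit blocks, as in N-Hash


def g(chain: bytes, block: bytes) -> bytes:
    # Placeholder compression function (hypothetical): N-Hash's real g uses
    # eight FEAL-like rounds; SHA-256 stands in here purely for illustration.
    return hashlib.sha256(chain + block).digest()[:BLOCK]


def iterated_hash(message: bytes, iv: bytes = b"\x00" * BLOCK) -> bytes:
    # Zero-pad the final partial block (illustrative padding only).
    if len(message) % BLOCK:
        message += b"\x00" * (BLOCK - len(message) % BLOCK)
    h = iv
    for i in range(0, len(message), BLOCK):
        # Fold each 128-bit block into the running hash value.
        h = g(h, message[i:i + BLOCK])
    return h


digest = iterated_hash(b"hello world")  # 128-bit (16-byte) digest
```

The key point is the chaining: the hash of block i depends on every earlier block through the evolving chaining value `h`.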
Eli Biham and Adi Shamir (1991) applied the technique of differential cryptanalysis to N-Hash and showed that collisions could be generated faster than by a birthday attack for N-Hash variants with up to 12 rounds.
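The birthday attack that Biham and Shamir improved upon is the generic baseline: for an n-bit hash, collecting roughly 2^(n/2) digests is expected to yield a collision. A sketch on a toy 24-bit hash (a truncated SHA-256, used here only so the search finishes instantly) shows the idea; about 2^12 ≈ 4096 inputs typically suffice.

```python
import hashlib


def toy_hash(data: bytes) -> bytes:
    # 24-bit toy hash: first 3 bytes of SHA-256 (for demonstration only,
    # not a real hash function anyone should use).
    return hashlib.sha256(data).digest()[:3]


def birthday_collision():
    # Hash distinct messages until two of them share a digest; expected
    # cost is about sqrt(2^24) = 2^12 hash evaluations.
    seen = {}
    i = 0
    while True:
        msg = b"msg-%d" % i
        d = toy_hash(msg)
        if d in seen:
            return seen[d], msg  # two distinct messages, same digest
        seen[d] = msg
        i += 1


m1, m2 = birthday_collision()
```

Against the full 128-bit N-Hash a birthday attack would need about 2^64 work; the differential attacks cited above beat that bound for the reduced-round variants.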
Preimage attack
In cryptography, a preimage attack on a cryptographic hash function tries to find a message that has a specific hash value. A cryptographic hash function should resist attacks on its preimage.
In the context of attacks, there are two types of preimage resistance:
preimage resistance: for essentially all pre-specified outputs, it is computationally infeasible to find any input that hashes to that output, i.e., given y, it is difficult to find an x such that h(x) = y.
second-preimage resistance: it is computationally infeasible to find any second input which has the same output as a specified input, i.e., given x, it is difficult to find a second preimage x′ ≠ x such that h(x) = h(x′).
These can be compared with collision resistance, in which it is computationally infeasible to find any two distinct inputs x, x′ that hash to the same output, i.e., such that h(x) = h(x′).
Collision resistance implies second-preimage resistance but does not guarantee preimage resistance. Conversely, a second-preimage attack implies a collision attack (trivially, since, in addition to x′, x is already known from the start).
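For an ideal n-bit hash, a preimage or second-preimage search costs about 2^n evaluations, versus only about 2^(n/2) for a collision. A brute-force preimage search can be sketched on a toy 16-bit hash (a truncated SHA-256, an assumption made only so the search completes in well under a second):

```python
import hashlib


def toy_hash(data: bytes) -> bytes:
    # 16-bit toy hash (first 2 bytes of SHA-256), so 2^16 trials are cheap.
    return hashlib.sha256(data).digest()[:2]


def find_preimage(target: bytes) -> bytes:
    # Brute-force search: expected ~2^16 trials for a 16-bit output,
    # and ~2^n trials in general for an n-bit ideal hash.
    i = 0
    while True:
        candidate = b"guess-%d" % i
        if toy_hash(candidate) == target:
            return candidate
        i += 1


target = toy_hash(b"secret message")
x = find_preimage(target)
```

Note that the recovered `x` need not equal the original input; matching the definition above, any input that hashes to the target output counts as a preimage.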
Public key fingerprint
In public-key cryptography, a public key fingerprint is a short sequence of bytes used to identify a longer public key. Fingerprints are created by applying a cryptographic hash function to a public key. Since fingerprints are shorter than the keys they refer to, they can be used to simplify certain key management tasks. In Microsoft software, "thumbprint" is used instead of "fingerprint".
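Concretely, a fingerprint is just a hash of the key's encoded bytes, rendered in a human-comparable form. A sketch under stated assumptions: the key bytes below are hypothetical stand-in data (not a real DER or SSH key encoding), and real tools differ in which encoding of the key they hash, though the colon-separated hex and `SHA256:`-prefixed unpadded-base64 styles shown are both common in practice.

```python
import base64
import hashlib


def fingerprint_hex(key_bytes: bytes) -> str:
    # Colon-separated hex digest, in the style of traditional key fingerprints.
    digest = hashlib.sha256(key_bytes).digest()
    return ":".join("%02x" % b for b in digest)


def fingerprint_b64(key_bytes: bytes) -> str:
    # "SHA256:" label plus unpadded base64, in the style of newer tools.
    digest = hashlib.sha256(key_bytes).digest()
    return "SHA256:" + base64.b64encode(digest).rstrip(b"=").decode()


# Hypothetical stand-in for a real encoded public key.
key = b"\x30\x82\x01\x22" + b"\x00" * 32
fp_hex = fingerprint_hex(key)
fp_b64 = fingerprint_b64(key)
```

Because the hash is collision-resistant, two parties who compare fingerprints out of band get strong evidence they hold the same key, without transmitting the full key.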
SM3 (hash function)
SM3 is a cryptographic hash function used in the Chinese National Standard. It was published by the State Cryptography Administration (Chinese: 国家密码管理局) on 2010-12-17 as "GM/T 0004-2012: SM3 cryptographic hash algorithm".
SM3 is mainly used in digital signatures, message authentication codes, and pseudorandom number generators. The algorithm is public and is claimed by the China Internet Network Information Center to be comparable to SHA-256 in security and efficiency.
Security of cryptographic hash functions
In cryptography, cryptographic hash functions can be divided into two main categories. In the first category are functions whose designs are based on a mathematical problem, so that their security follows from rigorous mathematical proofs, complexity theory, and formal reduction. These functions are called provably secure cryptographic hash functions. However, this does not mean that such a function could not be broken. Constructing them is very difficult, and only a few examples have been introduced; their practical use is limited.
In the second category are functions that are not based on mathematical problems but are designed on an ad hoc basis, where the bits of the message are mixed to produce the hash. They are believed to be hard to break, but no formal proof is given. Almost all widely used hash functions fall into this category. Some of these functions have already been broken and are no longer in use.
Skein (hash function)
Skein is a cryptographic hash function and one of five finalists in the NIST hash function competition. Entered as a candidate to become the SHA-3 standard, the successor of SHA-1 and SHA-2, it ultimately lost to the NIST hash candidate Keccak.
The name Skein refers to how the Skein function intertwines the input, similar to a skein of yarn.
Snefru
Snefru is a cryptographic hash function invented by Ralph Merkle in 1990 while working at Xerox PARC. The function supports 128-bit and 256-bit output. It was named after the Egyptian Pharaoh Sneferu, continuing the tradition of the Khufu and Khafre block ciphers.
The original design of Snefru was shown to be insecure by Eli Biham and Adi Shamir, who were able to use differential cryptanalysis to find hash collisions. The design was then modified by increasing the number of iterations of the main pass of the algorithm from two to eight. Although differential cryptanalysis can break the revised version with less complexity than brute-force search (a certificational weakness), the attack requires a number of operations far beyond what is currently feasible in practice.
Tiger (hash function)
In cryptography, Tiger is a cryptographic hash function designed by Ross Anderson and Eli Biham in 1995 for efficiency on 64-bit platforms. The size of a Tiger hash value is 192 bits. Truncated versions (known as Tiger/128 and Tiger/160) can be used for compatibility with protocols assuming a particular hash size. Unlike the SHA-2 family, no distinguishing initialization values are defined; the truncated versions are simply prefixes of the full Tiger/192 hash value.
Tiger2 is a variant in which the message is padded by first appending a byte with the hexadecimal value 0x80, as in MD4, MD5, and SHA, rather than a byte with the hexadecimal value 0x01 as in the case of Tiger. The two variants are otherwise identical.
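The difference between the two variants is thus a single marker byte in otherwise-standard MD-style padding. The sketch below assumes Tiger follows the MD4 padding layout (marker byte, zero fill to 56 bytes mod 64, then the message bit length as a 64-bit little-endian integer); it illustrates the layout only and is not a Tiger implementation.

```python
import struct


def md_style_pad(message: bytes, marker: int) -> bytes:
    # Append the marker byte, zero-fill so the total length is 56 mod 64,
    # then append the 64-bit little-endian bit length, as in MD4/MD5.
    bit_len = 8 * len(message)
    padded = message + bytes([marker])
    padded += b"\x00" * ((56 - len(padded)) % 64)
    return padded + struct.pack("<Q", bit_len)


tiger_block = md_style_pad(b"abc", 0x01)   # Tiger pads with 0x01 first
tiger2_block = md_style_pad(b"abc", 0x80)  # Tiger2 pads with 0x80, like MD4/MD5/SHA
```

Since the padded input differs in that one byte, Tiger and Tiger2 produce unrelated digests for the same message even though the compression function is identical.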
Whirlpool (hash function)
In computer science and cryptography, Whirlpool (sometimes styled WHIRLPOOL) is a cryptographic hash function. It was designed by Vincent Rijmen (co-creator of the Advanced Encryption Standard) and Paulo S. L. M. Barreto, who first described it in 2000.
The hash has been recommended by the NESSIE project. It has also been adopted by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) as part of the joint ISO/IEC 10118-3 international standard.
This page is based on a Wikipedia article. Text is available under the CC BY-SA 3.0 license; additional terms may apply. Images, videos and audio are available under their respective licenses.