Thursday, February 23, 2017

Cryptography: Feistel Design Parameters and Considerations


Block size: Larger block sizes mean greater security (all other things being equal) but reduced encryption/decryption speed for a given algorithm. The greater security is achieved by greater diffusion. Traditionally, a block size of 64 bits has been considered a reasonable tradeoff and was nearly universal in block cipher design. However, the new AES uses a 128-bit block size.

Key size: Larger key size means greater security but may decrease encryption/decryption speed. The greater security is achieved by greater resistance to brute-force attacks and greater confusion. Key sizes of 64 bits or less are now widely considered to be inadequate, and 128 bits has become a common size.

Number of rounds: The essence of the Feistel cipher is that a single round offers inadequate security but that multiple rounds offer increasing security. A typical size is 16 rounds.

Subkey generation algorithm: Greater complexity in this algorithm should lead to greater difficulty of cryptanalysis.

Round function, F: Again, greater complexity generally means greater resistance to cryptanalysis.
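
To make these parameters concrete, here is a minimal sketch of a Feistel structure in Python. The 64-bit block size, 16 rounds, hash-based round function F, and subkey schedule are illustrative assumptions chosen for readability, not the design of any real cipher.

# Minimal sketch of a Feistel structure (toy parameters, not a real cipher).
# The round function F and the subkey schedule below are illustrative assumptions.

import hashlib

NUM_ROUNDS = 16          # a typical round count
MASK32 = 0xFFFFFFFF      # 64-bit block = two 32-bit halves

def round_function(right, subkey):
    # Toy F: hash the right half together with the subkey, keep 32 bits.
    data = right.to_bytes(4, "big") + subkey.to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest()[:4], "big")

def subkeys(key, rounds=NUM_ROUNDS):
    # Toy subkey generation: derive one 32-bit subkey per round from the key.
    return [int.from_bytes(hashlib.sha256(key + bytes([i])).digest()[:4], "big")
            for i in range(rounds)]

def feistel_encrypt(block, key):
    left, right = (block >> 32) & MASK32, block & MASK32
    for k in subkeys(key):
        # One Feistel round: new L = old R, new R = old L XOR F(old R, subkey).
        left, right = right, left ^ round_function(right, k)
    return (left << 32) | right

def feistel_decrypt(block, key):
    left, right = (block >> 32) & MASK32, block & MASK32
    for k in reversed(subkeys(key)):
        # Same structure run backwards with the subkeys in reverse order.
        right, left = left, right ^ round_function(left, k)
    return (left << 32) | right

if __name__ == "__main__":
    key = b"example key bytes"          # hypothetical key
    pt = 0x0123456789ABCDEF
    ct = feistel_encrypt(pt, key)
    assert feistel_decrypt(ct, key) == pt

Note how decryption simply runs the same structure with the subkeys in reverse order; F itself never needs to be inverted, which is the main practical attraction of the Feistel design.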

Cryptography: Diffusion and Confusion

Diffusion means that if we change a character of the plaintext, then several characters of the ciphertext should change, and similarly, if we change a character of the ciphertext, then several characters of the plaintext should change.
It dissipates the statistical structure of the plaintext over the bulk of the ciphertext.
Where did we see this?
We saw that the Hill cipher has this property.
This means that frequency statistics of letters, digraphs, etc. in the plaintext are diffused over several characters in the ciphertext, so much more ciphertext is needed to do a meaningful statistical attack.
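
As a small illustration of diffusion in the Hill cipher, here is a hypothetical 3×3 example in Python (the key matrix and the plaintext blocks are purely illustrative): changing a single plaintext letter can change every letter of the ciphertext block.

# Sketch of diffusion in a Hill cipher: changing one plaintext letter
# changes the entire corresponding ciphertext block.

KEY = [[ 6, 24,  1],
       [13, 16, 10],
       [20, 17, 15]]   # illustrative 3x3 key matrix, invertible mod 26

def hill_encrypt_block(block, key=KEY):
    # Encrypt one block of letters (0..25 integers) as C = K * P mod 26.
    return [sum(key[r][c] * block[c] for c in range(3)) % 26 for r in range(3)]

def to_nums(s):
    return [ord(ch) - ord('A') for ch in s]

def to_text(nums):
    return ''.join(chr(n + ord('A')) for n in nums)

if __name__ == "__main__":
    p1 = to_nums("ACT")
    p2 = to_nums("BCT")                        # differs in only the first letter
    print(to_text(hill_encrypt_block(p1)))     # the two ciphertext blocks
    print(to_text(hill_encrypt_block(p2)))     # differ in every position

Running this, the two three-letter plaintexts, which differ in a single letter, encrypt to ciphertext blocks that differ throughout.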
Confusion means that the key does not relate in a simple way to the ciphertext: each character of the ciphertext should depend on several parts of the key.
Confusion makes the relationship between the ciphertext and the key as complex as possible.
For example, suppose we have a Hill cipher with an n × n matrix, and suppose we have a plaintext-ciphertext pair of length n² with which we are able to solve for the encryption matrix.
If we change one character of the ciphertext, one column of the matrix can change completely.
When a situation like that happens, the cryptanalyst would probably need to solve for the entire key simultaneously, rather than piece by piece.
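
Here is a short sketch of that known-plaintext idea for a hypothetical 2×2 Hill cipher (row-vector convention, c = pK mod 26, with made-up values): given n² known plaintext/ciphertext letters arranged into matrices, the key is recovered in a single step. That simple, linear ciphertext-key relationship is exactly what confusion is meant to prevent.

# Known-plaintext attack on a 2x2 Hill cipher: K = P^{-1} * C mod 26.
# The plaintext blocks and the "secret" key below are illustrative.

def inv_mod26(a):
    # Multiplicative inverse of a modulo 26 (a must be coprime to 26).
    return pow(a, -1, 26)

def inverse_2x2_mod26(m):
    # Inverse of a 2x2 matrix modulo 26 via the adjugate.
    det = (m[0][0] * m[1][1] - m[0][1] * m[1][0]) % 26
    d = inv_mod26(det)
    return [[( m[1][1] * d) % 26, (-m[0][1] * d) % 26],
            [(-m[1][0] * d) % 26, ( m[0][0] * d) % 26]]

def matmul_mod26(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) % 26 for j in range(2)]
            for i in range(2)]

if __name__ == "__main__":
    P = [[7, 4], [11, 11]]        # two known plaintext blocks "HE", "LL" as rows
    K = [[3, 3], [2, 5]]          # the secret key the attacker wants to recover
    C = matmul_mod26(P, K)        # the corresponding ciphertext the attacker sees
    recovered = matmul_mod26(inverse_2x2_mod26(P), C)
    print(recovered)              # equals K whenever P is invertible mod 26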

Cryptography: Ideal Block Cipher


In an ideal block cipher, the relationship between the input blocks and the output blocks is completely random, but it must be invertible for decryption to work.

Therefore, it has to be one-to-one, meaning that each input block is mapped to a unique output block.

The mapping from the input bit blocks to the output bit blocks can also be construed as a mapping from the integers corresponding to the input bit blocks to the integers corresponding to the output bit blocks.

The encryption key for the ideal block cipher is the codebook itself, meaning the table that shows the relationship between the input blocks and the output blocks.
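
A tiny sketch of this idea for n = 4, where the "key" really is the whole codebook. The mapping below is just a seeded random permutation used for illustration.

# Sketch of an "ideal" block cipher for a toy block size n = 4:
# the key is the codebook itself, a random permutation of the 2^n input blocks.

import random

N = 4                                   # block size in bits (toy value)

def generate_codebook(seed=None):
    # The key: a random one-to-one mapping from n-bit inputs to n-bit outputs.
    rng = random.Random(seed)
    outputs = list(range(2 ** N))
    rng.shuffle(outputs)
    return outputs                      # encryption table: index = plaintext block

def invert(codebook):
    # Decryption table: reverse the mapping so every ciphertext maps back.
    table = [0] * len(codebook)
    for plain, cipher in enumerate(codebook):
        table[cipher] = plain
    return table

if __name__ == "__main__":
    enc = generate_codebook(seed=1)     # seed only so the sketch is repeatable
    dec = invert(enc)
    block = 0b1011
    assert dec[enc[block]] == block     # invertible: decryption recovers the block

The decryption table exists precisely because the mapping is one-to-one.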
Problems with the Ideal Block Cipher

There is a practical problem with the ideal block cipher.
If a small block size, such as n = 4, is used, then the system is equivalent to a classical substitution cipher.
Such systems are vulnerable to a statistical analysis of the plaintext.
This weakness is not inherent in the use of a substitution cipher but rather results from the use of a small block size.
If n is sufficiently large and an arbitrary reversible substitution between plaintext and ciphertext is allowed, then the statistical characteristics of the source plaintext are masked to such an extent that this type of cryptanalysis is infeasible.
However, an arbitrary reversible substitution cipher (the ideal block cipher) for a large block size is not practical from an implementation and performance point of view.
Why?

For n = 4, the codebook has 2⁴ = 16 rows of 4 bits each, so the required key length is (4 bits) × (16 rows) = 64 bits.
In general, for an n-bit ideal block cipher, the length of the key defined in this fashion is n × 2ⁿ bits.
For a 64-bit block, which is a desirable length to thwart statistical attacks, the required key length is 64 × 2⁶⁴ = 2⁷⁰ ≈ 10²¹ bits.
The size of the encryption key would make the ideal block cipher an impractical idea.
Think of the logistical issues related to the transmission, storage, and processing of such large keys.
Considering these difficulties, what we need is an approximation to the ideal block cipher that remains easily realizable for large values of n.
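
A quick check of the arithmetic above (the function name is just for illustration):

# Key length of an ideal (codebook) block cipher: n * 2^n bits,
# since each of the 2^n table rows stores an n-bit output block.

def ideal_key_bits(n):
    return n * 2 ** n

print(ideal_key_bits(4))     # 64 bits for n = 4: easy to handle
print(ideal_key_bits(64))    # 2^70 bits, roughly 10^21: hopelessly impractical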
 

Cryptography: History of Data Encryption Standard (DES)


In the late 1960s, IBM set up a research project in computer cryptography led by Horst Feistel.

The project concluded in 1971 with the development of an algorithm with the designation LUCIFER, which was sold to Lloyd’s of London for use in a cash-dispensing system, also developed by IBM.

LUCIFER is a Feistel block cipher that operates on blocks of 64 bits, using a key size of 128 bits.

Because of the promising results produced by the LUCIFER project, IBM embarked on an effort to develop a marketable commercial encryption product that ideally could be implemented on a single chip.

The effort was headed by Walter Tuchman and Carl Meyer, and it involved not only IBM researchers but also outside consultants and technical advice from the National Security Agency (NSA).

The outcome of this effort was a refined version of LUCIFER that was more resistant to cryptanalysis but that had a reduced key size of 56 bits, in order to fit on a single chip.

In 1973, the National Bureau of Standards (NBS) issued a request for proposals for a national cipher standard. IBM submitted the results of its Tuchman–Meyer project. This was by far the best algorithm proposed and was adopted in 1977 as the Data Encryption Standard.

Cryptography: The Controversy Surrounding the Data Encryption Standard (DES)

Before its adoption as a standard, the proposed DES was subjected to intense criticism, which has not subsided to this day.

Two areas drew the critics’ fire.

1. First, the key length in IBM's original LUCIFER algorithm was 128 bits, but that of the proposed system was only 56 bits, an enormous reduction of 72 bits in key size.

Critics feared that this key length was too short to withstand brute-force attacks.

2. The second area of concern was that the design criteria for the internal structure of DES, the S-boxes, were classified.

Thus, users could not be sure that the internal structure of DES was free of any hidden weak points that would enable NSA to decipher messages without benefit of the key.

Subsequent events, particularly the recent work on differential cryptanalysis, seem to indicate that DES has a very strong internal structure.

Furthermore, according to IBM participants, the only changes that were made to the proposal were changes to the S-boxes, suggested by NSA, that removed vulnerabilities identified in the course of the evaluation process.