Saturday, May 27, 2017

Cryptography: Why Do We Study Finite Fields


It is almost impossible to fully understand practically any facet of modern cryptography and several important aspects of general computer security if you do not know what is meant by a finite field.

For example, without understanding the notion of a finite field, you will not be able to understand AES.

The substitution step in AES is based on the concept of a multiplicative inverse in a finite field (studied in the last chapter).
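To make that concrete, here is a minimal sketch (not the table-driven code real AES implementations use) of computing a multiplicative inverse in GF(2^8) with the reduction polynomial x^8 + x^4 + x^3 + x + 1 (0x11B) that AES specifies; the brute-force search is for illustration only:

```python
def gf_mul(a, b):
    """Multiply two bytes as polynomials over GF(2), reducing by the
    AES polynomial x^8 + x^4 + x^3 + x + 1 (0x11B)."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # addition in GF(2^8) is XOR
        b >>= 1
        a <<= 1
        if a & 0x100:            # degree reached 8: reduce
            a ^= 0x11B
    return result

def gf_inverse(a):
    """Multiplicative inverse in GF(2^8) by exhaustive search.
    AES maps the byte 0 to 0 by convention."""
    if a == 0:
        return 0
    for candidate in range(1, 256):
        if gf_mul(a, candidate) == 1:
            return candidate

# For instance, the inverse of the byte 0x53 is 0xCA:
assert gf_mul(0x53, gf_inverse(0x53)) == 1
```

The AES S-box applies exactly this inverse map to each byte, followed by a fixed affine transformation.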

For another example, without understanding finite fields, you will not be able to understand the derivation of the RSA algorithm for public-key cryptography that we will study in this chapter.
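To give a taste of what is coming, here is a toy RSA round trip with deliberately tiny primes (real keys use primes hundreds of digits long); the modular-inverse call `pow(e, -1, phi)` assumes Python 3.8 or later:

```python
# Toy RSA key generation, encryption, and decryption.
p, q = 61, 53
n = p * q                 # public modulus: 3233
phi = (p - 1) * (q - 1)   # Euler's totient of n: 3120
e = 17                    # public exponent, chosen coprime to phi
d = pow(e, -1, phi)       # private exponent: inverse of e mod phi

m = 65                    # a message, encoded as a number < n
c = pow(m, e, n)          # encryption: m^e mod n
assert pow(c, d, n) == m  # decryption: c^d mod n recovers m
```

The reason decryption works at all — why exponentiating by d undoes exponentiating by e — is precisely the number theory this chapter develops.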

And if you do not understand the basics of public-key cryptography, you will not be able to understand the workings of several modern protocols (like the SSH protocol you use every day for logging into other computers) for secure communications over networks.

You will also not be able to understand what has become so important in computer security: user and document authentication with certificates.

Another modern concept that will befuddle you if you do not understand public-key cryptography is that of digital rights management.

(Digital rights management (DRM) is a systematic approach to copyright protection for digital media. The purpose of DRM is to prevent unauthorized redistribution of digital media and to restrict the ways consumers can copy content they have purchased.)

And, as I mentioned earlier, you cannot understand public-key cryptography without coming to terms with finite fields.

For yet another example, without understanding finite fields, you will never understand the up-and-coming ECC algorithm (ECC stands for Elliptic Curve Cryptography) that is already in wide use and that many consider to be a replacement for RSA for public-key cryptography.
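As a glimpse of why ECC leans on finite fields, here is point doubling on the toy curve y^2 = x^3 + 2x + 2 over GF(17) — a common textbook curve, far too small for real use; every step of the slope computation, including the division, is finite-field arithmetic:

```python
# Parameters of the toy curve y^2 = x^3 + 2x + 2 over GF(17).
p, a = 17, 2

def point_double(x1, y1):
    """Double an affine point using the tangent-line slope; the
    division by 2*y1 is done via a modular inverse in GF(17)."""
    lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    x3 = (lam * lam - 2 * x1) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return x3, y3

assert point_double(5, 1) == (6, 3)  # 2 * (5, 1) = (6, 3) on this curve
```

Repeated point addition and doubling of this kind is the workhorse operation behind ECC key exchange and signatures.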

To put it very simply, a finite field is a finite set of numbers in which you can carry out the operations of addition, subtraction, multiplication, and division without error. In ordinary computing, division in particular is error-prone: what you see is a high-precision approximation to the true result. Such high-precision approximations do not suffice for cryptographic work, where all arithmetic operations must be exact.
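The smallest examples are the prime fields GF(p). A sketch of exact division in GF(17) (the choice of prime is arbitrary), where dividing means multiplying by a modular inverse; `pow(b, -1, p)` assumes Python 3.8 or later:

```python
p = 17  # any prime gives a field GF(p)

def fdiv(a, b):
    """Divide a by b in GF(p) by multiplying with b's modular inverse."""
    return a * pow(b, -1, p) % p

# 3/5 in GF(17): the inverse of 5 is 7 (since 5*7 = 35 = 2*17 + 1),
# so 3/5 = 3*7 = 21 = 4 (mod 17) -- exactly, with no rounding.
assert fdiv(3, 5) == 4
assert fdiv(3, 5) * 5 % p == 3  # multiplying back recovers 3 exactly
```

Contrast this with floating-point division, where 3/5 is only an approximation of the true quotient; in GF(p) the answer is an exact field element.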
