It is important to pay our respects to the one subject that the universe seems to be built upon. If you haven’t put two and two together yet, don’t fret.
Instead, join me as you get a tad mathematically inclined, by about 45 degrees let's say. Stay wonderstruck as you realise that even the most basic mathematics permeates our lives in countless ways.
We get so caught up in our live—but not so lively—streams, online shopping, and social networking that we forget that absolutely nothing on a computer happens without numbers. Be it posting a kitten video, tweeting our political leanings, or showing the world what we had for breakfast—it all boils down to binary code: the numbers 0 and 1. Maybe one day they'll figure out how to encrypt email using icons and emojis, but till then, we have to surrender to mathematics.
Carl Friedrich Gauss, one of the greatest mathematicians of all time, proclaimed, "Mathematics is the queen of the sciences and number theory is the queen of mathematics," and I couldn’t agree more.
Mathematics is ever-present in the glory of Mother Nature, in every leaf and every petal. It, therefore, comes as no surprise that her royal highness would form the basis of information security—or if I may get ahead of myself, all of computer science itself.
In this article, I will briefly touch upon a few of the topics perceived to be the most important.
Number Theory
Number theory plays an important role in encryption algorithms. Cryptography, the practice of hiding information by converting secrets into unreadable text, depends entirely on numbers. Many tools of number theory—primes, divisors, congruences, and Euler's totient function—play important roles in cryptography. For example, they are used in simple Caesar ciphers as well as in complex RSA public-key cryptography. Together, they frame cryptosystems in the language of algebra and number theory.
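To see how even the humblest cipher rests on number theory, here is a minimal sketch of a Caesar cipher in Python. Each letter is shifted by a fixed key using arithmetic modulo 26; the function name and sample text are my own choices for illustration.

```python
def caesar(text, key):
    """Shift each letter of `text` by `key` positions, modulo 26."""
    result = []
    for ch in text:
        if ch.isalpha():
            # Work relative to 'A' or 'a' so case is preserved.
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

ciphertext = caesar("attack at dawn", 3)
print(ciphertext)              # dwwdfn dw gdzq
print(caesar(ciphertext, -3))  # decrypting with the negated key recovers the message
```

Notice that decryption is just encryption with the key negated—a direct consequence of the modular arithmetic underneath.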
Information Theory
Information theory studies the quantification, storage, and communication of information. A key quantity in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process.
For example, identifying the outcome of tossing a fair coin (with two equally likely outcomes) has lower entropy than specifying the outcome from a roll of a die (with six equally likely outcomes).
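The coin-versus-die comparison can be made concrete with Shannon's formula, H = −Σ p·log₂(p). Below is a small sketch that computes the entropy of each distribution; the function name is my own.

```python
import math

def entropy(probabilities):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

coin = [0.5, 0.5]     # fair coin: two equally likely outcomes
die = [1 / 6] * 6     # fair die: six equally likely outcomes

print(entropy(coin))  # 1.0 bit
print(entropy(die))   # log2(6) ≈ 2.585 bits
```

The die carries more entropy than the coin, exactly as the text says: more equally likely outcomes means more uncertainty to resolve.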
Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
Computational Complexity Theory
Computational complexity theory focuses on classifying computational problems according to their resource usage and relating these classes to each other. A computational problem is a task solved by a computer—one that can be solved by the mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage.
Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity), and the number of processors (used in parallel computing). One of the roles of computational complexity theory is to determine the practical limits on what computers can and cannot do.
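"Resource usage" can be made tangible by counting the basic operations an algorithm performs. The sketch below (with hypothetical helper names of my own) counts the comparisons made by linear search versus binary search on the same sorted list—two algorithms for the same problem with very different costs.

```python
def linear_search(items, target):
    """Scan left to right; return the number of comparisons made."""
    comparisons = 0
    for x in items:
        comparisons += 1
        if x == target:
            break
    return comparisons

def binary_search(items, target):
    """Halve the sorted search range each step; return comparisons made."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return comparisons

data = list(range(1_000_000))
print(linear_search(data, 999_999))  # 1,000,000 comparisons: linear time
print(binary_search(data, 999_999))  # about 20 comparisons: logarithmic time
```

Same problem, same answer—but one algorithm's cost grows in step with the input while the other's barely grows at all. Complexity theory classifies problems by the best such cost any algorithm can achieve.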
What can computers not do? The P versus NP problem is a major unsolved problem in computer science. It asks: if a solution to a problem is easy to check, is the solution also easy to find?
That sounds simple, but it really isn’t.
An apt example would be this:
P: finding the shortest path in a graph.
NP: the travelling salesman problem—given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city exactly once and returns to the origin city?
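The shortest-path side of that contrast really is easy for computers: Dijkstra's algorithm solves it in polynomial time. Here is a minimal sketch (graph data and names are my own illustrative choices); no comparably fast algorithm is known for the travelling salesman problem.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a weighted graph (a problem in P)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Adjacency list: node -> [(neighbour, edge weight), ...]
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

For the travelling salesman problem, by contrast, every known exact algorithm takes time that explodes as the number of cities grows—which is precisely what makes P versus NP so tantalizing.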
It is one of the seven Millennium Prize Problems shortlisted by the Clay Mathematics Institute, each of which carries a US$1,000,000 prize for the first correct solution.
Does that sound exciting? I certainly think so. As interesting as cybersecurity and information security sound, there is a decent level of complexity in the mathematics that keeps them ticking behind the scenes. I hope you enjoyed reading about a rather small but important bit of the mathematics involved in the domain of information security. And, if you are interested in taking up infosec or cybersec, make sure you have your math figured out!
Written By: Adithi Bhat