Shannon Entropy and Primitive Roots

Shannon Entropy

Shannon entropy is a measure of the unpredictability or randomness in a probability distribution. It quantifies the amount of information or uncertainty associated with a random variable. Mathematically, it is defined as:

        H(X) = -Σ P(x) log(P(x))
    

Here:

- H(X) is the entropy of the random variable X,
- P(x) is the probability of outcome x, and
- the logarithm is conventionally taken to base 2, so entropy is measured in bits.

Shannon entropy is widely used in information theory, cryptography, and data compression to analyze and optimize systems involving uncertainty or variability.
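To make the formula concrete, here is a minimal Python sketch (an illustration, not part of the original notes) that computes the entropy, in bits, of the empirical distribution of a sequence of symbols:

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Entropy (in bits) of the empirical distribution of the symbols in `data`."""
    counts = Counter(data)
    total = len(data)
    # H = -sum over symbols of p * log2(p), with p the empirical frequency
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(shannon_entropy("aabb"))  # 1.0 bit: two equally likely symbols
print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols
```

A uniform distribution over n symbols attains the maximum entropy log2(n); any bias lowers it.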

Other Diversity Measures of Distributions

Shannon entropy is one of a family of diversity measures; closely related examples include Rényi entropy, which generalizes Shannon entropy with a tunable order parameter, and the Gini-Simpson index, which weights common outcomes more heavily.

Primitive Roots

A primitive root modulo p (where p is a prime number) is an integer g such that for every integer a coprime to p, there exists an integer k satisfying the equation:

        g^k ≡ a (mod p)
    

In simpler terms, the powers of g modulo p generate all integers coprime to p. For example, if p = 7, the primitive root g = 3 produces the sequence {3, 2, 6, 4, 5, 1} (the successive powers 3^1 through 3^6 reduced modulo 7), covering all numbers coprime to 7.
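The definition suggests a direct brute-force test, adequate for small primes; the helper below is a sketch, not an optimized algorithm:

```python
def is_primitive_root(g, p):
    """Return True if the powers of g modulo prime p hit every nonzero residue."""
    seen = set()
    x = 1
    for _ in range(p - 1):
        x = (x * g) % p  # next power of g modulo p
        seen.add(x)
    # g is a primitive root exactly when all p - 1 nonzero residues appear
    return len(seen) == p - 1

# The example from the text: powers of 3 modulo 7.
print([pow(3, k, 7) for k in range(1, 7)])  # [3, 2, 6, 4, 5, 1]
print(is_primitive_root(3, 7))              # True
print(is_primitive_root(2, 7))              # False: 2^3 = 8 = 1 (mod 7), so 2 has order 3
```

For large primes one would instead check that g^((p-1)/q) is not 1 (mod p) for each prime factor q of p - 1, avoiding the full enumeration.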

Properties of Primitive Roots

- g is a primitive root modulo p if and only if the multiplicative order of g modulo p equals p - 1.
- Every prime p has a primitive root; when one exists, there are exactly φ(p - 1) of them.
- More generally, primitive roots exist modulo n precisely for n = 1, 2, 4, p^k, and 2p^k with p an odd prime.

Applications

Primitive roots are used in number theory, cryptography (e.g., Diffie-Hellman key exchange), and algorithms that rely on modular arithmetic and cyclic groups.
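As an illustration of the Diffie-Hellman mention above, here is a toy exchange over a tiny prime; p = 23, the primitive root g = 5, and the private keys are illustrative values only, while real deployments use primes of 2048 bits or more:

```python
# Public parameters: prime modulus p and primitive root g of p.
p, g = 23, 5
a, b = 6, 15          # private keys chosen secretly by the two parties

A = pow(g, a, p)      # Alice publishes A = g^a mod p
B = pow(g, b, p)      # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)     # Bob computes (g^a)^b mod p
assert shared_alice == shared_bob  # both derive the same shared secret
print(shared_alice)
```

Because g is a primitive root, g^a mod p ranges over the whole group as a varies, which is what makes recovering a from A (the discrete logarithm problem) hard at cryptographic sizes.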

Exercise

Assignment:

Compile a sufficiently large body of text from several web pages and build its letter frequency distribution. Choose a random shift value (1-25, with wrap-around) and apply the Caesar cipher to encrypt the text: E = (L + shift) mod 26 for each letter L of the message. Then use frequency analysis, or any other efficient and effective strategy, to recover the shift and decrypt the message.

Solution
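One possible approach, sketched below: encrypt with a modular shift, then recover the shift by scoring each of the 26 candidate decryptions against a reference table of English letter frequencies using a chi-squared statistic. The frequency table and sample message are illustrative assumptions, not part of the assignment.

```python
from collections import Counter
import string

# Approximate English letter frequencies in percent (a common reference table).
ENGLISH_FREQ = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
    'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
    'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
    'q': 0.10, 'z': 0.07,
}

def caesar(text, shift):
    """Shift each letter by `shift` (mod 26); decrypt with the negated shift."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('a') if ch.islower() else ord('A')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation unchanged
    return ''.join(out)

def crack_shift(ciphertext):
    """Recover the shift via chi-squared distance to English letter frequencies."""
    best_shift, best_score = 0, float('inf')
    for shift in range(26):
        candidate = caesar(ciphertext, -shift).lower()
        counts = Counter(c for c in candidate if c in string.ascii_lowercase)
        n = sum(counts.values())
        # Chi-squared: sum of (observed - expected)^2 / expected per letter.
        score = sum((counts[l] - n * f / 100) ** 2 / (n * f / 100)
                    for l, f in ENGLISH_FREQ.items())
        if score < best_score:
            best_shift, best_score = shift, score
    return best_shift

msg = ("frequency analysis works best when the ciphertext is long enough "
       "to reflect typical english letter statistics")
enc = caesar(msg, 5)
print(crack_shift(enc))  # recovers the shift, 5, for this sample
```

The chi-squared score is lowest when the candidate decryption's letter distribution matches English; longer ciphertexts make the minimum more pronounced, which is why the assignment asks for a sufficiently large text.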