
Entropy + Algebra + Topology = ?


Today I'd like to share a bit of math involving ideas from information theory, algebra, and topology. It's all in a new paper I've recently uploaded to the arXiv. The paper is short — just 11 pages! Even so, I thought it'd be nice to stroll through some of the surrounding mathematics here.

To introduce those ideas, let's start by thinking about the function $d\colon[0,1]\to\mathbb{R}$ defined by $d(x)=-x\log x$ when $x>0$ and $d(x)=0$ when $x=0$. Perhaps after getting out pencil and paper, it's easy to check that this function satisfies an equation that looks a lot like the product rule from Calculus:

$$d(xy) = x\,d(y) + d(x)\,y.$$
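Unwinding the definition shows why: for $x,y>0$, the logarithm turns the product into a sum,

$$d(xy) = -xy\log(xy) = y(-x\log x) + x(-y\log y) = y\,d(x) + x\,d(y),$$

and the boundary cases where $x$ or $y$ equals $0$ hold trivially, since both sides vanish.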

Functions that satisfy an equation reminiscent of the "Leibniz rule," like this one, are called derivations, which invokes the familiar idea of a derivative. The nonzero term $-x\log x$ above may also look familiar to some of you. It's an expression that appears in the Shannon entropy of a probability distribution. A probability distribution on a finite set $\{1,\ldots,n\}$ for $n\geq 1$ is a sequence $p=(p_1,\ldots,p_n)$ of nonnegative real numbers satisfying $\sum_{i=1}^np_i=1$, and the Shannon entropy of $p$ is defined to be

$$H(p) = -\sum_{i=1}^n p_i\log p_i = \sum_{i=1}^n d(p_i).$$
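If you'd like to play with these quantities, here's a minimal numerical sketch in Python (the names `d` and `shannon_entropy` are my own, not from the paper; I use the natural logarithm, so entropy comes out in nats):

```python
import math

def d(x: float) -> float:
    """d(x) = -x log x for x > 0, and d(0) = 0."""
    return -x * math.log(x) if x > 0 else 0.0

def shannon_entropy(p: list[float]) -> float:
    """Shannon entropy H(p) = sum_i d(p_i) = -sum_i p_i log p_i."""
    return sum(d(pi) for pi in p)

# d satisfies the Leibniz-like product rule d(xy) = x d(y) + d(x) y:
x, y = 0.3, 0.6
assert math.isclose(d(x * y), x * d(y) + d(x) * y)

# Entropy of a fair coin: log 2 ≈ 0.6931 nats.
print(shannon_entropy([0.5, 0.5]))
```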

Now it turns out that the function $d$ is nonlinear, which means we can't exchange it with the summation. In other words, $H(p)\neq d\left(\sum_i p_i\right)$: since the $p_i$ sum to $1$, the right-hand side is just $d(1)=0$, while $H(p)$ is positive whenever $p$ is not concentrated on a single point. Even so, curiosity might cause us to wonder about settings in which Shannon entropy is itself a derivation. One such setting is described in the paper above, which shows a correspondence between Shannon entropy and derivations of (wait for it...) topological simplices!
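Continuing the Python sketch above (reusing `d` and `shannon_entropy`), the failure of linearity is easy to see numerically:

```python
p = [0.2, 0.3, 0.5]
print(shannon_entropy(p))  # sum_i d(p_i) ≈ 1.0297
print(d(sum(p)))           # d(1.0) = 0.0
```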
