Is the largest root of a random polynomial more likely to be real than complex?

This question may be hard: it received $35$ upvotes on MSE and carried a $200$-point bounty from Jyrki Lahtonen, yet it went unanswered. So I am posting it on MO.

The number of real roots of a random polynomial with real coefficients is much smaller than the number of complex roots. Assume that the coefficients are independent and uniformly distributed in $(-1,1)$; if they are not, we can divide every coefficient by the coefficient of largest absolute value to scale them all into $(-1,1)$. By a classical result of Kac, the expected number of real roots of a polynomial of degree $n$ is asymptotic to $\displaystyle \frac{2\log n}{\pi}$, so the expected number of complex roots is approximately $\displaystyle n - \frac{2\log n}{\pi}$. Similar asymptotics hold for other distributions of the coefficients.
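As a sanity check on this asymptotic, one can run a small Monte Carlo experiment. The sketch below (my own illustration, not from the original question) estimates the expected number of real roots using `numpy.roots`; the function name `expected_real_roots` and the tolerance `tol` used to decide when a numerically computed root counts as real are assumptions of this sketch, not part of the question.

```python
import numpy as np

def expected_real_roots(n, trials=500, tol=1e-8, seed=0):
    """Estimate the expected number of real roots of a degree-n polynomial
    whose n+1 coefficients are i.i.d. Uniform(-1, 1)."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(trials):
        coeffs = rng.uniform(-1.0, 1.0, size=n + 1)
        roots = np.roots(coeffs)
        # Treat a numerically computed root as real when its imaginary
        # part is tiny; the threshold tol is a heuristic choice.
        total += int(np.sum(np.abs(roots.imag) < tol))
    return total / trials

for n in (10, 50, 100, 200):
    print(f"n={n:4d}  simulated={expected_real_roots(n):5.2f}  "
          f"2*log(n)/pi={2 * np.log(n) / np.pi:5.2f}")
```

Agreement at these moderate degrees is only rough, since the lower-order terms in the asymptotic decay slowly.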

We can ask: is the largest (or the smallest) root more likely to be real or complex? Since complex roots vastly outnumber real roots, the degree $n$ being exponential in the $\displaystyle \frac{2\log n}{\pi}$ expected real roots, my naive guess was that the largest (or the smallest) root is more likely to be complex. However, experimental data proved to be quite counterintuitive.
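To make the experiment concrete, here is one way to gather such data, again as a sketch of my own rather than the original experiment: it estimates the probability that the root of largest modulus is real, interpreting "largest" as largest in absolute value (an assumption on my part) and reusing the numerical-tolerance heuristic from above. The function name `prob_largest_root_real` is hypothetical.

```python
import numpy as np

def prob_largest_root_real(n, trials=5000, tol=1e-8, seed=1):
    """Estimate the probability that the root of largest modulus of a
    random degree-n polynomial (i.i.d. Uniform(-1, 1) coefficients) is real."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        coeffs = rng.uniform(-1.0, 1.0, size=n + 1)
        roots = np.roots(coeffs)
        # Pick the root of largest modulus; complex roots come in conjugate
        # pairs, so a complex "largest" root is correctly counted as complex.
        largest = roots[np.argmax(np.abs(roots))]
        hits += abs(largest.imag) < tol
    return hits / trials

for n in (10, 20, 50, 100):
    print(f"n={n:3d}  P(largest-modulus root is real) ~= "
          f"{prob_largest_root_real(n):.3f}")
```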
