The space of possible evolved biological minds is far smaller than the space of possible ASI minds
Achkshually, Yudkowskian Orthodoxy says any truly superintelligent mind will converge on Expected Value Maximization, Instrumental Goals, and Timeless Decision Theory (as invented by Eliezer), so clearly the ASI mind space is actually quite narrow.
Wow, that blows past Dunning-Kruger overestimation into straight-up Time Cube tier crankery.