Scott Alexander argues against several popular Great Filter explanations, contending that no common x-risk or alien intervention is consistent and thorough enough to be the true Filter.
Longer summary
Scott Alexander critiques several popular candidates for the Great Filter, the hypothesized barrier invoked to explain the Fermi Paradox. He argues that common x-risks like global warming, nuclear war, or unfriendly AI are unlikely to be the Great Filter, since none of them would consistently stop 999,999,999 out of a billion civilizations from becoming spacefaring. He likewise dismisses transcendence and alien exterminators as candidate Filters. Scott emphasizes that whatever the Great Filter is, it must be extremely thorough and consistent to account for the complete absence of observable alien civilizations.