With more than 1,500 blog posts, it's easy to get lost reading Scott Alexander. This unaffiliated fan website lets you sort and search the whole codex. Enjoy!

See also Top Posts and All Tags.

1 post found
Mar 01, 2023
acx
35 min · 4,471 words · 621 comments · 202 likes · podcast
Scott Alexander critically examines OpenAI's 'Planning For AGI And Beyond' statement, discussing its implications for AI safety and development.
Scott Alexander analyzes OpenAI's recent statement 'Planning For AGI And Beyond', comparing it to a hypothetical ExxonMobil statement on climate change. He discusses why AI doomers are critical of OpenAI's research, explores potential arguments for OpenAI's approach, and considers cynical interpretations of its motives. Despite his skepticism, Scott acknowledges that OpenAI's statement represents a step in the right direction for AI safety, but urges more concrete commitments and follow-through.