How do you avoid getting lost in Scott Alexander's 1,500+ blog posts? This unaffiliated fan website lets you sort and search the whole codex. Enjoy!

See also Top Posts and All Tags.

4 posts found
Mar 05, 2024
acx
23 min · 2,893 words · 176 comments · 135 likes · podcast
Scott Alexander analyzes the results of his 2023 forecasting contest, comparing various prediction methods and individual forecasters.
Scott Alexander reviews the results of his 2023 annual forecasting contest, where participants predicted 50 questions about the upcoming year. He discusses the winners in both 'Blind Mode' (relying on personal knowledge) and 'Full Mode' (using aggregation algorithms). The post analyzes the performance of various forecasting methods, including individual forecasters, prediction markets, superforecasters, and aggregation techniques. Scott concludes that Metaculus, a forecasting platform, outperformed other methods, though some individual forecasters showed exceptional skill. He also examines which 2023 events were most surprising to forecasters and shares his main takeaways from the contest.
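To make the comparison concrete, here is a minimal sketch of the Brier score, a standard way to rank probabilistic forecasters like those in the contest. The forecaster names and numbers below are invented for illustration, and the contest's exact scoring rule isn't specified in this summary.

```python
# Minimal Brier-score sketch for comparing probabilistic forecasters.
# All names and numbers are invented; this is not the contest's data or
# necessarily its exact scoring rule.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and 0/1 outcomes.
    Lower is better; always answering 0.5 scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(outcomes)

outcomes = [1, 0, 1, 1, 0]            # what actually happened (1 = yes)
alice    = [0.9, 0.2, 0.7, 0.6, 0.1]  # confident, well-calibrated guesses
bob      = [0.6, 0.5, 0.5, 0.5, 0.4]  # hedges everything near 50%

print(f"Alice: {brier_score(alice, outcomes):.3f}")  # 0.062
print(f"Bob:   {brier_score(bob, outcomes):.3f}")    # 0.214
```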
Feb 06, 2023
acx
17 min · 2,128 words · 284 comments · 122 likes · podcast
Scott Alexander investigates the 'wisdom of crowds' hypothesis using survey data, exploring its effectiveness and potential applications.
Scott Alexander discusses the 'wisdom of crowds' hypothesis, which claims that the average of many guesses is better than a single guess. He tests this concept using data from his ACX Survey, focusing on a question about the distance between Moscow and Paris. The post explores how error rates change with crowd size, whether individuals can benefit from averaging multiple guesses, and compares his findings to a larger study by Van Dolder and Van den Assem. Scott also ponders why wisdom of crowds isn't more widely used in decision-making and speculates on its potential applications and limitations.
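As a rough illustration of the averaging effect the post tests, here is a small simulation. The 'true' Moscow–Paris distance of about 2,500 km is an approximation used as ground truth, and the noise model is an assumption, not the ACX survey data.

```python
# Rough wisdom-of-crowds simulation: the error of a crowd's mean guess
# shrinks as the crowd grows. The distance value and noise model are
# assumptions for illustration, not the ACX survey data.
import random
import statistics

random.seed(0)
TRUE_DISTANCE_KM = 2500  # approximate Moscow-Paris distance, used as ground truth

def random_guess():
    # Each simulated respondent is unbiased but noisy (std. dev. 800 km).
    # Real guesses are messier and may share a common bias, which
    # averaging cannot remove.
    return random.gauss(TRUE_DISTANCE_KM, 800)

def typical_crowd_error(crowd_size, trials=2000):
    """Median absolute error of the crowd's mean guess over many simulated crowds."""
    errors = []
    for _ in range(trials):
        crowd_mean = statistics.fmean(random_guess() for _ in range(crowd_size))
        errors.append(abs(crowd_mean - TRUE_DISTANCE_KM))
    return statistics.median(errors)

for n in (1, 5, 25, 100):
    print(f"crowd of {n:>3}: typical error ~{typical_crowd_error(n):.0f} km")
```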
Jan 24, 2023
acx
30 min · 3,809 words · 300 comments · 102 likes · podcast
Scott Alexander analyzes results from a 2022 prediction contest, discussing top performers and methods for improving forecast accuracy.
Scott Alexander reviews the results of a 2022 prediction contest where 508 participants assigned probabilities to 71 yes-or-no questions about future events. The post discusses the performance of individual forecasters, aggregation methods, and prediction markets. It highlights the success of superforecasters, the wisdom of crowds, and prediction markets. The article also announces winners, discusses demographic factors in forecasting ability, and introduces a new contest for 2023, emphasizing the potential for improving forecasting accuracy through various methods.
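The specific aggregation formulas the contest compared aren't given in this summary, but one common family of methods averages forecasters' probabilities in log-odds space and then "extremizes" the result (pushes it away from 50%). A minimal sketch, with an extremizing parameter chosen purely for illustration:

```python
# Sketch of log-odds averaging with optional extremizing, one common way to
# aggregate many forecasters' probabilities on a yes/no question. The numbers
# and the extremizing parameter are illustrative assumptions, not the
# contest's actual method or data.
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def aggregate(probabilities, extremize_d=1.0):
    """Mean of log-odds, scaled by extremize_d, mapped back to a probability."""
    mean_log_odds = sum(logit(p) for p in probabilities) / len(probabilities)
    return inv_logit(extremize_d * mean_log_odds)

forecasts = [0.65, 0.7, 0.8, 0.6, 0.75]    # five forecasters, one question
print(round(aggregate(forecasts), 2))       # plain log-odds average -> ~0.71
print(round(aggregate(forecasts, 1.5), 2))  # extremized -> ~0.79
```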
Nov 27, 2016
ssc
14 min · 1,729 words · 154 comments · podcast
A study on expert prediction of behavioral economics experiments finds that experts have only a slight advantage over non-experts, suggesting that a separate 'rationality' skill may be more important than specific expertise.
This post discusses a study by DellaVigna & Pope on expert prediction of behavioral economics experiments. The study found that knowledgeable academics had only a slight advantage over random individuals in predicting experimental results. Prestigious academics did not outperform less prestigious ones, and field of expertise did not matter. The expert advantage was small and easily overwhelmed by wisdom of crowds effects. The author suggests that these results indicate that experts' expertise may not be helping them much in this context, and proposes that a separate 'rationality' skill, somewhat predicted by high IQ and scientific training but not identical to either, might explain the results. The post also discusses the implications of these findings for real-world issues like election predictions, noting important caveats about the nature of the predictive task in the study.
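As a toy illustration of how a small per-individual expert edge can be swamped by averaging, here is a minimal sketch. All the error magnitudes are invented; they are not taken from the DellaVigna & Pope study.

```python
# Toy model of "small expert edge vs. wisdom of crowds". Each prediction is
# truth plus Gaussian noise, with experts only slightly less noisy than
# non-experts. All numbers are invented for illustration, not study data.
import random
import statistics

random.seed(1)
TRUTH = 100.0
EXPERT_SD, NONEXPERT_SD = 20.0, 25.0   # experts are only a bit more accurate
TRIALS = 5000

def mean_abs_error_single(sd):
    return statistics.fmean(abs(random.gauss(TRUTH, sd) - TRUTH) for _ in range(TRIALS))

def mean_abs_error_crowd(sd, n):
    return statistics.fmean(
        abs(statistics.fmean(random.gauss(TRUTH, sd) for _ in range(n)) - TRUTH)
        for _ in range(TRIALS)
    )

print(f"single expert:          ~{mean_abs_error_single(EXPERT_SD):.1f}")
print(f"single non-expert:      ~{mean_abs_error_single(NONEXPERT_SD):.1f}")
print(f"mean of 20 non-experts: ~{mean_abs_error_crowd(NONEXPERT_SD, 20):.1f}")
```

Under these assumptions the averaged non-experts easily beat the lone expert, which is the sense in which a small expert advantage can be 'overwhelmed' by crowd effects; shared biases that averaging cannot remove would narrow that gap.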