Interesting read, thanks for writing it up. FYI the link “The report on the 2022 results is now available” leads to a private Google Drive file.
ChristianWilliams
Metaculus Launches Chinese AI Chips Tournament, Supporting Institute for AI Policy and Strategy Research
Metaculus Introduces New Forecast Scores, New Leaderboard & Medals
Metaculus Introduces AI-Powered Community Insights to Reveal Factors Driving User Forecasts
Metaculus’s New Sidebar Helps You Find Forecasts Faster
Metaculus Launches Conditional Cup to Explore Linked Forecasts
Metaculus Announces Forecasting Tournament to Evaluate Focused Research Organizations, in Partnership With the Federation of American Scientists
Metaculus Launches 2023/2024 FluSight Challenge Supporting CDC, $5K in Prizes
Gulf Breeze, Florida, USA – ACX Meetups Everywhere Fall 2023
Metaculus Event: Forecast Friday, April 28th at 12pm ET — Speed Forecasting Session!
Fair point. Noted for future event posts.
[Event] Join Metaculus Tomorrow, March 31st, for Forecast Friday!
[Event] Join Metaculus for Forecast Friday on March 24th!
Metaculus is conducting its first user survey in nearly three years. If you have read analyses, consumed forecasts, or made predictions on Metaculus, we want to hear from you! Your feedback helps us better meet the needs of the forecasting community and is incredibly important to us.
Take the short survey here — we truly appreciate it! (We’ll be sure to share what we learn.)
ChristianWilliams’s Shortform
Metaculus Introduces New ‘Conditional Pair’ Forecast Questions for Making Conditional Predictions
Metaculus Year in Review: 2022
Metaculus Announces The Million Predictions Hackathon
Prediction market Metaculus launches a “Forecasting Our World In Data” tournament (via @metaculus)
Hi, this link should be https://www.metaculus.com/tournament/forecasting-Our-World-in-Data/ instead; it's missing the www.
Hi @Odd anon, thanks for the feedback and questions.
1. To your point about copying the Community Prediction: it's true that if you copied the CP at all times, you would indeed receive a high Baseline Accuracy score; the CP is generally a great forecast! CP hidden periods do mitigate this issue somewhat, and we are monitoring user behavior on this front and will address it if it becomes a problem. Our scoring trade-offs doc also lists some further ideas for addressing CP copying.
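For intuition on why copying works: under a Metaculus-style binary baseline score, your score depends only on the probability you assigned to the realized outcome, so a CP copier inherits the CP's (typically strong) score. A minimal sketch, assuming the standard log-score form relative to a uniform 50% prior (the function name is mine, not Metaculus code):

```python
import math

def baseline_score(p_outcome: float) -> float:
    """Binary baseline score: 100 * log2(p / 0.5), where p_outcome is
    the probability the forecaster assigned to the outcome that
    actually happened. A 50% forecast scores exactly zero."""
    return 100 * math.log2(p_outcome / 0.5)

# A forecaster who always copies the community prediction earns
# exactly the CP's baseline score on every question.
cp = 0.8                       # community prediction on the eventual outcome
print(baseline_score(cp))      # ≈ 67.8
print(baseline_score(0.5))     # 0.0: an uninformative forecast scores nothing
```

Because the score is the same function of the submitted probability for everyone, copying the CP costs nothing under this metric unless the hidden period forces the copier to forecast blind.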
Have a look here and let us know what you think! (We're also tinkering with some ideas not listed in that doc, like accuracy metrics that exclude forecasts sitting on the CP, or within some delta of it.)
2. On indicating confidence: you'll see in the trade-offs doc that we're also considering letting users exclude a particular forecast from their peer score (Idea #3), which could partly address this. (Interestingly, indicating confidence was tried at the Good Judgment Project, but it ultimately didn't work and was abandoned.)
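Mechanically, that exclusion idea could look like this: compute a peer score per question (your log score on the realized outcome minus the mean log score of the other forecasters), then average only over the questions the user didn't opt out of. A hypothetical sketch under those assumptions; the function names and the exclusion mechanism are mine, not Metaculus's actual implementation:

```python
import math

def peer_score(my_p: float, others_p: list[float]) -> float:
    """Per-question peer score: your log score on the realized outcome
    minus the mean log score of the other forecasters."""
    return math.log(my_p) - sum(math.log(q) for q in others_p) / len(others_p)

def mean_peer_score(per_question: dict[str, float], excluded: set[str]) -> float:
    """Tournament-level average that honors a user's opt-outs:
    questions in `excluded` simply don't count toward the mean."""
    kept = [s for qid, s in per_question.items() if qid not in excluded]
    return sum(kept) / len(kept)

scores = {"q1": peer_score(0.9, [0.6, 0.7]),   # confident and right: positive
          "q2": peer_score(0.2, [0.6, 0.7]),   # a miss the user might exclude
          "q3": peer_score(0.7, [0.6, 0.7])}
print(mean_peer_score(scores, excluded={"q2"}))
```

This also makes the incentive problem visible: if exclusions are free and chosen after resolution, everyone drops their worst questions, which is presumably why any real version needs limits on when and how often a forecast can be excluded.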
We’re continuing to develop ideas on the above, and we’d definitely welcome further feedback!