
PapersToAGI

Karma: −1

Concentration Risk Is Probably More Important Than Alignment Risk, And It's Heading to a Doom Scenario

PapersToAGI · 20 Mar 2026 14:27 UTC
1 point
0 comments · 9 min read · LW link

PapersToAGI's Shortform

PapersToAGI · 11 Apr 2025 19:54 UTC
1 point
1 comment · 1 min read · LW link

Why Bigger Models Generalize Better

PapersToAGI · 11 Apr 2025 19:54 UTC
1 point
0 comments · 2 min read · LW link