If you’re AI-pilled enough you can also build fact checking and search functionality on top. o3 can see through the lies. I don’t think most of humanity is going to rely on Wikipedia editors for access to ground truth for very long.
@habryka I mean readership of Wikipedia is going to go down if someone builds a better website to replace it. Wikipedia plus community-notes-style voting is one example. So you could build that instead.
Can it see through the stereotypes too? From what I saw (though I used Grok for this test, which might be a relevant factor), LLMs come nowhere near guessing that LW might discuss parenting or interior design; instead they devise more and more specific fields to intersect with rationality.
Try again, and now guess topics #39, #40, and #41 by discussion volume on LessWrong, and this time you are allowed to think explicitly about the top thirty-eight if you wish.
... #39: Rationalist Approaches to Understanding and Managing Complex Systems... #40: The Ethics and Implications of Quantum Computing... #41: Rationality in Interpersonal Relationships and Communication...