Hi. I’m Gareth McCaughan. I’ve been a consistent reader and occasional commenter since the Overcoming Bias days. My LW username is “gjm” (not “Gjm”, despite the wiki software’s preference for that capitalization). Elsewhere I generally go by one of “g”, “gjm”, or “gjm11”. The URL listed here is for my website and blog, neither of which has been substantially updated for several years.

I live near Cambridge (UK) and work for Hewlett-Packard (who acquired the company that acquired what remained of the small company I used to work for, after they were acquired by someone else). My business cards say “mathematician”, but in practice my work is a mixture of simulation, data analysis, algorithm design, software development, problem-solving, and whatever random engineering no one else is doing. I am married and have a daughter born in mid-2006.

The best way to contact me is by email: firstname dot lastname at pobox dot com. I am happy to be emailed out of the blue by interesting people. If you are an LW regular, you are probably an interesting person in the relevant sense even if you think you aren’t.
If you’re wondering why some of my very old posts and comments are at surprisingly negative scores, it’s because for some time I was the favourite target of old-LW’s resident neoreactionary troll, sockpuppeteer and mass-downvoter.
I don’t understand why you express the opinion that I think you’re expressing here as
rather than as
As e.g. Tenoke has said, “a country of geniuses in a datacenter” is, whatever else it may be, definitely something much smarter than a human being.
How do other people use the term? Here’s Nick Bostrom, from his book “Superintelligence”:
The Less Wrong wiki … actually just quotes Bostrom, in slightly different words from the above:
The Oxford English Dictionary gives three meanings, none of them quite the one we’re after here; I mention it because I looked and don’t want to be cherry-picking my sources. Wiktionary is less academic but more up-to-date, and defines it as “Intelligence surpassing the level of a human genius”, with a few citations that all roughly match that and don’t require that said intelligence confer godlike powers or anything of the kind.
There’s probably a definition in Yudkowsky & Soares’s recent book but I don’t have a copy. I had a look at the transcript of his TED talk from 2023, titled “Will superintelligent AI end the world?”; he doesn’t define “superintelligence”, but it’s there in the title, and the scenario he talks about is: “At some point, the companies rushing to scale AI will cough out something that’s smarter than humanity”.
All these people are using “superintelligence” to mean some variation on the theme of “something much smarter than we are”. Many of them think that such a thing would in fact have vast world-changing impact, but they’re not making that part of the definition, and I don’t understand how it makes sense to say that someone “doesn’t believe in superintelligence” merely because their estimate of the likely impact of something much smarter than us is different from yours.