My priors on a group like Leverage Research actually discovering something true and important by doing “basic research” in psychology are very low; I don’t expect “put a bunch of smart people in a room and have them talk to each other” to produce something more connected to reality than Freudian psychoanalysis or Dianetics. After all, “id-ego-superego” really isn’t a terrible model of a person, and studies have trouble telling if one kind of therapy is better than another. So you’ll get something that kind of makes sense and may even seem to work but isn’t necessarily related to truth in any meaningful way.
Is there anything I should know that might cause me to update my beliefs?
Going off-topic here: is it just me, or is Elon Musk’s greatest superpower the ability to get ridiculous amounts of funding for very unusual ventures (SpaceX, The Boring Company, Neuralink, etc.)? There must be many more people besides Elon Musk who could be an effective CEO for a speculative technology company, but I suspect that if anyone other than Elon Musk had tried to pitch The Boring Company to venture capitalists, they’d be laughed out of the room.
(Similarly, right now Warren Buffett makes money not by outwitting the market the way he did years and years ago, but because people are willing to give him better terms than other investors, since having the Warren Buffett Seal Of Approval is valuable.)
So as beliefs overall get closer to reality, the bar for counting something as crazy gets lower?
Some people did see the mortgage-backed securities crash of 2008 coming and made money on it!
No, I contradicted a crackpot claim by stating that the opposite was true. I didn’t refute it; that would have required providing evidence (in this case, by explaining how someone without budget constraints actually could go about making a replica of the Great Pyramid using modern technology).
My counterargument to Humbali would go like this: “Suppose I tell you I’ve already taken ‘you might be wrong’ into account. If you ask me to do it again, then you can just do the same thing to my more uncertain estimate—I’d end up in an infinite regress, and the argument would become a statement that no matter how uncertain you are, you should be more uncertain than that. And that is ridiculous. So I’m going to take ‘you might be wrong’ into account only once. Which I already have. So shut up.”
Which is why I specified “an algorithm” and not “a proof”.
Also, if my understanding is correct, simulating quantum systems is in PSPACE, so one thing this would do is make nanotechnology much easier to develop...
An algorithm for solving PSPACE complete problems in polynomial time would probably get you a good chunk of the way there, although there’s no particular reason to believe this is possible other than the fact that nobody has yet proven it to be impossible.
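To make that concrete, here is a minimal illustrative sketch (names and structure are my own, not from the original comment) of the canonical PSPACE-complete problem, TQBF: deciding whether a fully quantified Boolean formula is true. The obvious evaluator below takes exponential time; a polynomial-time algorithm for this problem is exactly the kind of thing being hypothesized.

```python
# Brute-force evaluator for TQBF (true quantified Boolean formulas),
# the canonical PSPACE-complete problem. This naive recursion tries
# every assignment, so it runs in exponential time; a polynomial-time
# algorithm for it would put all of PSPACE in P.

def eval_qbf(quantifiers, formula, assignment=()):
    """quantifiers: list of 'forall'/'exists', one per variable, in order.
    formula: function mapping a tuple of booleans to a boolean."""
    if len(assignment) == len(quantifiers):
        return formula(assignment)
    q = quantifiers[len(assignment)]
    branches = (eval_qbf(quantifiers, formula, assignment + (v,))
                for v in (False, True))
    return all(branches) if q == 'forall' else any(branches)

# Example: "forall x, exists y: x XOR y" is true (pick y = not x).
print(eval_qbf(['forall', 'exists'], lambda a: a[0] != a[1]))  # True
```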
So, start making the diplomatic situation around Taiwan as bad as possible? ;)
I think the optimistic case might be that in order to get the AGI to do anything useful at all you have to get at least part-way to a solution to the alignment problem, because otherwise its outputs will include many that will be so obviously “wrong” that you’d never actually let it do anything in which being wrong mattered.
I linked to it because it seemed like Robin Hanson was saying something close to the opposite of this.
Right now the bottleneck for becoming able to legally practice medicine as a doctor in the US is the number of residency positions for training medical school graduates, not the number of people graduating from medical schools.
Maybe imagine a dog?
My impression as an outsider (I met him once and heard and read some things people were saying about him) was that he seemed smart but also seemed like kind of a kook...
As the joke goes, there’s nothing crazy about talking to dead people. When dead people respond, then you start worrying.
To me, it seems like the “obvious” equivalent to a search engine in 1960 is a librarian or other professional researcher, much in the same way that the 1860 equivalent of a clothes washing machine was a domestic servant.
I’ve heard that minivans replaced large station wagons largely because station wagons counted as cars for purposes of fuel economy laws (which mandated that cars sold by a manufacturer achieve a “fleet average” of a certain number of miles per gallon) and minivans didn’t.
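For what it’s worth, the “fleet average” in those rules (the CAFE standards) isn’t a simple mean: it’s a sales-weighted harmonic mean of fuel economy, which is why one gas-guzzling model drags the fleet number down disproportionately. A quick sketch (the numbers here are made up for illustration):

```python
# CAFE-style fleet average: a sales-weighted harmonic mean of mpg.
# Averaging this way is equivalent to averaging fuel consumption
# (gallons per mile), so low-mpg models hurt more than a simple
# arithmetic mean would suggest.

def fleet_average_mpg(models):
    """models: list of (units_sold, mpg) pairs."""
    total_units = sum(n for n, _ in models)
    return total_units / sum(n / mpg for n, mpg in models)

# Hypothetical: 50k cars at 30 mpg and 50k at 20 mpg.
# Arithmetic mean would be 25; the harmonic fleet average is 24.
print(round(fleet_average_mpg([(50_000, 30), (50_000, 20)]), 1))  # 24.0
```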
The most horrific case I know of LSD being involved in a group’s downward spiral—from weird and kinda messed up to completely disconnected from reality and really fucking scary—is the Manson family, but that’s far from a typical example. Still, if you do want to be a cult leader, LSD does seem to do something that makes the job a lot easier.
One takeaway I got from this when combined with some other stuff I’ve read:
Don’t do psychedelics. Seriously, they can fuck up your head pretty bad and people who take them and organizations that encourage taking them often end up drifting further and further away from normality and reasonableness until they end up in Cloudcuckooland.