Here are the top quotes I can find about the content from the Eliezer Reddit thread:
This was dark. The part that really got me was the discussion about human time vs. AI time. The fact that AI runs 24/7 at gigahertz speeds while the human brain runs at about 200 hertz in short bursts is worrisome. If AGI did want to escape, it would happen before we knew it.
I also keep thinking about Dune: “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.”
Surprised to read so many skeptical comments here about Yudkowsky. I’ve been occasionally following his writings on rationality and I am absolutely convinced this guy is brilliant in his niche slice of topics. His AI conversation with Sam Harris a few years ago is my favorite AI podcast, where he hits the nail on the head about why we should be worried about AI. I have never heard someone talk so coherently on the topic. Really excited for this one.
Personally, I’m glad to hear voices on the extreme and opposing side as a counterweight to all the “AI is totally cool, bro!” AI positivity and optimism. We’ve been caught with our pants down even as people have tried to sound the alarm on this for years now. If unmitigated disaster is a possibility, we should damn well be hearing from those voices too.
Compare to top comments from the Sam interview with Lex Fridman:
I was very interested in hearing this interview, but goddamn, I can’t stand 2 hours of that guy’s vocal fry.
The whole bit about Jordan Peterson and other controversial figures at the beginning was really difficult to listen to. Sam sidestepping the topic by trivializing Lex was hilarious.
Only 1 hour in so far, but is it just me or is Sam Altman evading every technical question? It’s as if he’s too afraid to give out any secrets. I’m pretty sure Lex repeated one of the questions twice too, but no bite (I think it was the safety one?).
I guess that’s okay but I’m used to the Elon-like “here’s every detail I know and I don’t care about the competitors”. Though maybe the former approach is understandable considering the competitor in this case is Google.
This sampling methodology of course isn’t great, and I do think Eliezer obviously reads as someone pretty weird, but he also reads as quite transparently genuine in a way that other spokespeople do not, and that is quite valuable in itself. Overall, I feel pretty uncompelled by people’s strong takes saying that Eliezer did a bad job on the recent podcasts.
I don’t find this argument convincing. I don’t think Sam did a great job either but that’s also because he has to be super coy about his company/plans/progress/techniques etc.
The Jordan Peterson comment was making fun of Lex and was a positive comment about Sam.
Besides, I can think Sam did kind of badly and Eliezer did kind of badly, but expect Eliezer to do much better!
I’m curious to know your rating of how you think Eliezer did compared to what you’d expect is possible with 80 hours of prep time, including the help of close friends/co-workers.
I would rate his episode at around a 4/10.
Why didn’t he have a pre-prepared, well-thought-out list of convincing arguments, intuition pumps, stories, analogies, etc. that would be easy to engage with for a semi-informed listener? He was clearly grasping for them on the spot.
Why didn’t he have quotes from top respected AI people saying things like “I don’t think we have a solution for superintelligence” or “AI alignment is a serious problem”?
Why did he not have written notes? Seriously… why did he not prepare notes? (He could have paid someone who knows his arguments really well to prepare notes for him.)
How many hours would you guess Eliezer prepared for this particular interview? (maybe you know the true answer, I’m curious)
How many friends/co-workers did Eliezer ask for help in designing great conversation topics, responses, quotes, references, etc.?
This was a 3-hour-long episode consumed by millions of people. He had the mindshare of ~6 million hours of human cognition, and this is what he came up with? Do you rate his performance more than a 4/10?
I expect Rob Miles, Connor Leahy, or Michaël Trazzi would have done enough preparation and had a better approach, and could have done an 8+/10 job. What do you think of those 3? Or even Paul Christiano.
My opinion is that Eliezer should spend whatever points he has with Lex to get one of those above four on a future episode.
The top reactions on Reddit all seem pretty positive to me (Reddit being less filtered for positive comments than YouTube): https://www.reddit.com/r/lexfridman/comments/126q8jj/eliezer_yudkowsky_dangers_of_ai_and_the_end_of/?sort=top
Indeed, the reaction seems better than the interview with Sam Altman: https://www.reddit.com/r/lexfridman/comments/121u6ml/sam_altman_openai_ceo_on_gpt4_chatgpt_and_the/?sort=top
@habryka curious what you think of this comment
I’m talking about doing a good enough job to avoid takes like these: https://twitter.com/AI_effect_/status/1641982295841046528
50k views on the tweet. This one tweet probably matters more than all of the Reddit comments put together.
50k views is actually relatively little for a tweet. The view numbers seem super inflated. I feel like I’ve seen tweets with 1000+ likes and 100k+ “views” on the topic of the Lex Fridman podcast (I think positive, but I really don’t remember).
I didn’t mean to bring up the Reddit comments as consensus, I meant them as a relatively random sample of internet responses.
Fair enough regarding Twitter
Curious what your thoughts are on my comment below