I think that’s a great answer—assuming that’s what you believe.
For me, I don’t believe point 3 on the AI timelines—I think AGI will probably be here by 2029, and could indeed arrive this year. And even if it goes well and humans maintain control and we don’t get concentration-of-power issues… the software development skills your students are learning will be obsolete, along with almost all skills.
Thanks for the reply to a first post.
While I still assign dominant probability to human civilization being broadly recognizable in 2034, I know that my confidence in that is proportional to my LW karma.
There’s a lot about communicating with kids as a teacher that pushes me towards Simulacra Level 2 or higher. If we’re on something closer to your 2029 timeline, my honest advice to students would be:
Get Out of School ASAP and look for something interesting on the Jagged Frontier (use AI in a way you find useful that few others understand), or dedicate time to building craft skills that would have been recognizable 100 years ago.
My estimate is that I could give that advice to 3-5 students before I had to look for another job.
Gotcha. A tough situation to be in.
What about “Keep studying and learning in the hopes that (a) I’m totally wrong about AGI timelines and/or (b) government steps in and prevents AGI from being built for another decade or so?”
What about “Get organized and start advocating to make (b) happen”?
I’m on the PauseAI Discord in part to expose my students to that level of coordinated planning and direct action.
My Simulacra Level 1 perspective is that most students generally benefit from being in school. While some of that benefit comes from way-downstream consequences (thankfully, I took an EE class in 1983…), the vast majority of the positive benefits happen in the immediate term.
“Keep studying and learning” is a Simulacra Level 2 admonition that helps maintain the benefits I truly believe are there. (Yes, there are lots of problems in lots of schools; I can only ever speak in aggregates here.)
Importantly, a significant number of adolescent problems come from antagonistic relationships between children and their parents/caregivers. If those adults are supportive of a student leaving school, then I would happily hand them a copy of Blake Boles’s “College Without High School” ( https://www.blakeboles.com/cwhs/ ). If the adults are insistent on normal “go every day” school, I think the negative consequences from that fight will most often dominate the positive changes.
What are your thoughts on skills that the government has too much control over? For example, if we get ASI in 2030, do you imagine that a doctor will be obsolete by 2032, or will the current regulatory environment still be relevant?
And how much of this is determined by “labs have now concentrated so much power that governments are obsolete”?
If we get ASI in 2030, all humans will be economically and militarily obsolete in 2030, and probably politically obsolete too (though if alignment was solved then the ASIs would be acting on behalf of the values and intentions of at least some humans). The current regulatory regime will be irrelevant. ASI is powerful.
Also agree on the timelines. If we don’t take some dramatic governance actions, then AGI looks probable in the next 5 years, and very probable in the next 10. And after that, the odds of the world/society being similar to the way it has been for the past 50 years seem vanishingly small. If you aren’t already highly educated in the technical skills needed to help with this, political action is probably your best bet for having a future that conforms to your desires.
You may have already qualified this prediction somewhere else, but I can’t find where. I’m interested in:
1. What do you mean by “AGI”? Superhuman at any task?
2. “probably be here” means >= 50%? 90%?
Yep. Or if we wanna nitpick and be precise, better than the best humans at X, for all cognitive tasks/skills/abilities/jobs/etc. X.
>50%.
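(If it helps to pin down exactly what I’m claiming: treating skill, CognitiveTasks, and Humans as my own informal shorthand rather than any standard notation, the nitpicked definition plus the probability claim come out roughly as

$$\text{AGI} \iff \forall X \in \text{CognitiveTasks}:\ \text{skill}_{\text{AI}}(X) > \max_{h \in \text{Humans}} \text{skill}_h(X), \qquad P(\text{AGI by 2029}) > 0.5$$

i.e. strictly better than the best human at every cognitive task, with better-than-even odds of that arriving by 2029.)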