Have you assessed your predictions? What have you learned?
skinks_basking
But if you aren’t religious, what even are morals if not feelings?
There may be some writing, on LW and elsewhere, that could help you with this question.
“We, of course, never target civilian targets.”
I’m mostly being silly, but one might claim this is a Freudian slip: Hegseth referred to “civilian targets”, as if this were one of the kinds of targets under discussion. Like that is a phrase he’s been using. He could have referred to them as civilians, but he referred to them as targets. Source
I get the feeling as I’m reading the Fun Theory sequence that I’m not supposed to be reading this, if I haven’t answered its questions already to my own satisfaction.
Lol I assumed it was Claude’s mischaracterisation!
skinks_basking’s Shortform
Enjoyed, thank you!
“I was as clever as you, once. Cleverer, perhaps. I looked at the wizarding world and I saw everything you see — the inefficiency, the waste, the tradition without reason, the power unused.
This seems like a mischaracterisation. From what I remember, Dumbledore in HPMOR seemed to have a perspective different enough from Harry’s such that he didn’t really understand him at times. Harry’s optimisation-lust was foreign to him, not a familiar feeling from his youth he’d grown out of.
Is anyone tuned into how the Australian government is thinking about AI? I listened to an interview with Australia’s chief scientist Tony Haymet on ABC Radio National. The primary concern he mentioned is AI’s effect on jobs and securing its benefits, which according to the economic analysis he cited would include 20% growth in national GDP. His secondary concern was superstimulus (citing social media as an example). Does Haymet consider x-risk as a possibility? Does anyone in government?
What would AI labs do differently if this were made law? Couldn’t they build datacenters outside the US?