Thank you for writing this up, as I’ve been thinking the same thing for a while.
Totally agree re “slow-burn realism”. Start with things exactly as they are today. Then move to things that will likely happen soon, so that when those things do happen, people will think directly back to the film they saw recently. Keep escalating until you reach whatever ending works—maybe something like AI 2027, maybe something like A Disneyland Without Children.
It doesn’t even have to be a scenario where the AI is intentionally evil. We’ve had a thousand of those films already. An AI that’s just trying to do what it’s been told, but is misaligned, might be even scarier. No one’s done a paperclip-maximiser film.
Whatever ends up destroying us in the script, if you must have a not-totally-bleak ending, maybe the main characters manage to escape into space. Maybe they look back and watch a grey mass visibly spreading across the green Earth.