Not necessarily. If humans don’t die or end up depowered in the first few weeks of it, it might instead be a continuous high-intensity stress state, because you’ll need to be paying attention 24/7 to constant world-upturning developments, frantically figuring out what process/trend/entity you should be hitching your wagon to in order not to be drowned by the ever-rising tide, with the correct choice dynamically changing at an ever-increasing pace.
“Not being depowered” would actually make the Singularity experience massively worse in the short term, precisely because you’ll be constantly getting access to new tools and opportunities, and it’d be on you to frantically figure out how to make good use of them.

The relevant reference class is probably something like “being a high-frequency trader”:
Crypto is the only market that trades 24/7, meaning there simply was no rest for the wicked. The game was less about brilliance and more about being awake when it counted. Resource management around attention and waking hours was a big part of the game. [...]
My cofounder and I developed a polyphasic sleeping routine so that we would be conscious during as many of these action periods as possible. It was rare to get uninterrupted sleep for more than 3 hours at a time. We took tactical naps whenever possible and had phone alarms to wake us up in case important headlines came out during off hours. I felt like I had experienced three days for every one that passed.
There was always something going on. Every day a new puzzle to solve. A new fire to put out. We would frequently work 18-hour days processing information, trading events, building infrastructure, and managing risk. We frequently moved around different parts of the world, built strong relationships with all sorts of people from around the globe, and experienced some of the highest highs and lowest lows of our lives.
Those three years felt like the longest stretch I’ve ever lived.
This is pretty close to how I expect a “slow” takeoff to feel, yep.
This comment has been tumbling around in my head for a few days now. It seems to be both true and bad. Is there any hope at all that the Singularity could be a pleasant event to live through?
Well, an aligned Singularity would probably be relatively pleasant, since the entities fueling it would consider causing this sort of vast distress a negative and try to avoid it. Indeed, if you trust them not to drown you, there would be no need for this sort of frantic grasping-at-straws.
An unaligned Singularity would probably also be more pleasant than this scenario, since the entities fueling it would likely try to make it look aligned, with the span of time between the treacherous turn and everyone dying likely being short.
This scenario covers a sort of “neutral-alignment/non-controlled” Singularity, where there’s no specific superintelligent actor (or coalition) in control of the whole process, and it’s instead guided by… market forces, I guess? With AGI labs continually releasing new models for private/corporate use, providing the tools/opportunities you can try to grasp to avoid drowning. I think this is roughly how things would go under “mainstream” models of AI progress (e.g., AI 2027). (I don’t expect it to actually go this way; I don’t think LLMs can power the Singularity.)