I would not have expected progress to have sped up. But I agree that a lot of recent progress could naively be interpreted this way. So it makes sense to keep in mind that the current deep learning paradigm might come to a halt. Though what worries me is that deep learning may already have enough momentum to get us to AGI even while slowing down.
Out of curiosity, what is your reasoning behind believing that DL has enough momentum to reach AGI?
Mostly abstract arguments that don't actually depend on DL in particular (or at least not strongly). E.g. stupid evolution was able to do it with human brains. This spreadsheet is nice for playing with the implications of different models (I couldn't find the Ajeya report it belongs to). Though I haven't taken the time to think this through thoroughly, because plugging in reasonable values gave distributions that seemed too broad to bother.
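To give a sense of why such models spit out broad distributions: here is a toy Monte Carlo sketch, not Ajeya's actual model. Every parameter value below (the log-compute requirement, today's compute level, the growth rate) is a made-up assumption purely for illustration; the point is only that modest uncertainty in orders of magnitude of required compute translates into decades of spread in the implied arrival year.

```python
import random
import statistics

random.seed(0)

def sample_agi_year(n=100_000):
    """Toy model (NOT Ajeya's): sample uncertain compute requirements
    and convert them into arrival years under steady compute growth."""
    years = []
    for _ in range(n):
        # Assumption: required training compute, in log10 FLOP, is
        # normally distributed around 30 with sigma 5 (made-up numbers).
        required_log10_flop = random.gauss(30, 5)
        # Assumption: ~1e25 FLOP is available now, growing by
        # ~0.5 orders of magnitude per year (also made-up).
        current_log10_flop = 25
        growth_per_year = 0.5
        gap = max(0.0, required_log10_flop - current_log10_flop)
        years.append(2024 + gap / growth_per_year)
    return years

years = sample_agi_year()
deciles = statistics.quantiles(years, n=10)
print("median:", round(statistics.median(years)))
print("10th-90th percentile spread:", round(deciles[-1] - deciles[0]), "years")
```

Even with these fairly tame assumptions, the 10th–90th percentile range spans a couple of decades, which matches the experience that "reasonable values" give distributions too broad to act on directly.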
The point I wanted to make is that you can believe things are slowing down (I am more sympathetic to the view that AI will not have a big/galactic impact until it is too late) and still be worried.