Very nice! I would’ve liked to have seen either a call to action (e.g., “ban all training of models larger than GPT-4”) or an exploration of the emotional implications (e.g., “don’t put your hope in the future, because there probably isn’t much future left,” which Eliezer said during his interview with Lex Fridman), but overall very helpful.