FWIW, my understanding is that Evo 2 is not a generic language model capable of producing innovations; it's a transformer model trained on a mountain of genomic data, which gave it the ability to generate new functional genomes. The distinction is important; see the very similar case of GPT-4b.
I don’t feel equipped to assess this.
This may help with the second one:
https://www.lesswrong.com/posts/k5JEA4yFyDzgffqaL/guess-i-was-wrong-about-aixbio-risks
How about this one?
https://scottaaronson.blog/?p=9183
That appears to be the same one I linked.
Though possibly you grabbed the link in a better form (pointing to the post itself rather than the comments).