Beijing Academy of Artificial Intelligence announces 1.75 trillion parameter model, Wu Dao 2.0

Link post

BAAI researchers demonstrated Wu Dao’s abilities to perform natural language processing, text generation, image recognition, and image generation tasks during the lab’s annual conference on Tuesday. The model can not only write essays, poems, and couplets in traditional Chinese, but also generate alt text from a static image and produce nearly photorealistic images from natural language descriptions. Wu Dao also showed off its ability to power virtual idols (with a little help from Microsoft spinoff XiaoIce) and to predict the 3D structures of proteins, as AlphaFold does.

How big of a deal is that? Seems huge. At 1.75 trillion parameters, it is larger than Google's Switch Transformer and roughly 10x the size of GPT-3 (175 billion parameters).