
China’s gigantic multi-modal AI is no one-trick pony

When OpenAI’s GPT-3 launched in May 2020, its performance was widely regarded as state of the art. Able to generate copy nearly indistinguishable from human-written text, GPT-3 set a new standard in deep learning. But what a difference a year makes. On Tuesday, researchers from the Beijing Academy of Artificial Intelligence (BAAI) announced the release of their own generative deep learning model, Wu Dao, a mammoth AI that appears capable of doing everything GPT-3 can do, and more.

First off, Wu Dao is flat-out enormous. It was trained on 1.75 trillion parameters (essentially, the model’s self-selected coefficients), ten times more than the 175 billion GPT-3 was trained on and 150 billion more than Google’s Switch Transformer.
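For a rough sense of scale, here is a minimal sketch of that comparison in Python. The 1.6-trillion-parameter figure for Google’s Switch Transformer is not stated in the article itself; it is the published size consistent with the 150-billion gap mentioned above, so treat it as an assumption for illustration.

```python
# Rough parameter-count comparison implied by the figures above.
# Switch Transformer size (1.6e12) is assumed from its published specs,
# consistent with the "150 billion more" gap cited in the article.

wu_dao_2 = 1.75e12            # Wu Dao 2.0
gpt_3 = 175e9                 # OpenAI GPT-3
switch_transformer = 1.6e12   # Google Switch Transformer (assumed)

print(f"Wu Dao vs GPT-3: {wu_dao_2 / gpt_3:.0f}x larger")
print(f"Wu Dao vs Switch Transformer: "
      f"{(wu_dao_2 - switch_transformer) / 1e9:.0f} billion more parameters")
# -> 10x larger, 150 billion more parameters
```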

To train a model of this size that quickly (Wu Dao 2.0 arrived just three months after version 1.0’s release in March), BAAI researchers first developed an open-source learning system akin to Google’s Mixture of Experts, dubbed FastMoE. The system, which runs on PyTorch, allowed the model to be trained both on clusters of supercomputers and on conventional GPUs. That gives FastMoE more flexibility than Google’s system: because it does not require proprietary hardware such as Google’s TPUs, it can run on off-the-shelf hardware, supercomputing clusters notwithstanding.
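The article does not describe FastMoE’s actual API, so the snippet below is only a minimal sketch of the Mixture-of-Experts idea it builds on, written in plain PyTorch: a small gating network routes each token to one of several expert feed-forward networks, so only a fraction of the model’s parameters is active for any given input. The layer sizes and top-1 routing here are illustrative assumptions, not FastMoE internals.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoELayer(nn.Module):
    """Toy Mixture-of-Experts layer: a gate picks one expert per token."""

    def __init__(self, d_model=512, d_hidden=2048, num_experts=4):
        super().__init__()
        self.gate = nn.Linear(d_model, num_experts)  # produces routing scores
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (num_tokens, d_model)
        scores = self.gate(x)                # (num_tokens, num_experts)
        top1 = scores.argmax(dim=-1)         # hard top-1 routing per token
        weights = F.softmax(scores, dim=-1)  # scale each expert's output
        out = torch.zeros_like(x)
        for idx, expert in enumerate(self.experts):
            mask = top1 == idx
            if mask.any():
                out[mask] = weights[mask, idx].unsqueeze(-1) * expert(x[mask])
        return out

# Example: route a batch of 8 token vectors through the layer.
tokens = torch.randn(8, 512)
layer = SimpleMoELayer()
print(layer(tokens).shape)  # torch.Size([8, 512])
```

In a production MoE system such as FastMoE, the experts are sharded across many devices and routing relies on optimized communication kernels; this sketch keeps everything on one device purely to show the data flow.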

With all that computing power comes a broad set of capabilities. Unlike most deep learning models, which perform a single task (writing copy, generating deepfakes, recognizing faces, winning at Go), Wu Dao is multi-modal, similar in concept to Facebook’s anti-hate-speech AI or Google’s recently announced MUM. BAAI researchers demonstrated Wu Dao’s abilities in natural language processing, text generation, image recognition, and image generation during the lab’s annual conference on Tuesday. The model can not only write essays, poems, and couplets in traditional Chinese, it can also generate alt text from static images and produce near-photorealistic images from natural language descriptions. Wu Dao also showed off its ability to power virtual idols (with a little help from Microsoft spinoff XiaoIce) and to predict the 3D structures of proteins, much like AlphaFold.

“The way to artificial general intelligence is big models and big computing,” said Dr. Zhang Hongjiang, chairman of BAAI, at the conference on Tuesday. “What we are building is a power plant for the future of AI: with mega data, mega computing power, and mega models, we can transform data to fuel the AI applications of the future.”



