AI May Soon Write Code Based on Ordinary Language
In recent years, researchers have used artificial intelligence to improve translation between programming languages or to automatically fix bugs. The AI system DrRepair, for example, has been shown to resolve most of the issues that produce error messages. But some researchers dream of a day when AI can write programs based on simple descriptions from non-experts.
On Tuesday, Microsoft and OpenAI shared plans to bring GPT-3, one of the world's most advanced text-generation models, to programming, using it to produce code from natural-language descriptions. It is the first commercial use of GPT-3 announced since Microsoft invested $1 billion in OpenAI and acquired exclusive rights to license the model.
“If you can describe what you want to do in natural language, GPT-3 will generate a list of the most relevant formulas for you to choose from,” Microsoft CEO Satya Nadella said during a keynote at the company’s Build developer conference. “The code writes itself.”
Microsoft VP Charles Lamanna told WIRED that the GPT-3-powered feature can help people tackle more complex problems and empower those with little coding experience. GPT-3 translates natural language into PowerFx, a fairly simple programming language similar to Excel formulas that Microsoft introduced in March.
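To make the idea concrete, here is a minimal Python sketch of how a natural-language request might be turned into a PowerFx-style formula suggestion by prompting a large language model. This is an illustration, not Microsoft's implementation: the `query_language_model` function is a hypothetical stand-in for a real GPT-3 API call, and the example formulas are invented for demonstration.

```python
def query_language_model(prompt: str) -> str:
    """Hypothetical stand-in for a GPT-3 completion request.

    In a real system this would send the prompt to the model's API;
    here it returns a canned PowerFx-like formula for demonstration.
    """
    return 'Filter(Customers, StartsWith(Name, "Contoso"))'


def suggest_formula(description: str) -> str:
    # Few-shot style prompt: pair example natural-language requests with
    # formulas, then append the user's description for the model to complete.
    prompt = (
        "Request: show orders placed in the last 7 days\n"
        "Formula: Filter(Orders, OrderDate >= Today() - 7)\n"
        f"Request: {description}\n"
        "Formula:"
    )
    return query_language_model(prompt)


if __name__ == "__main__":
    # The user types plain English; the model proposes a formula to pick from.
    print(suggest_formula('show customers whose name starts with "Contoso"'))
```

In the product described above, the user would be offered a short list of such candidate formulas rather than a single answer, and would choose or edit the one that fits.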
This is the latest demonstration of applying AI to writing code. Last year at Microsoft’s Build, OpenAI CEO Sam Altman demonstrated a language model fine-tuned on code from GitHub that automatically generates lines of Python. As WIRED reported last month, startups such as SourceAI are also using GPT-3 to generate code. And IBM last month showed how its Project CodeNet, with 14 million code samples from more than 50 programming languages, could cut the time needed to update a program with millions of lines of Java code for an automotive company from one year to one month.
Microsoft’s new feature is built on a neural network architecture known as the Transformer, used by major technology companies including Baidu, Google, Microsoft, Nvidia, and Salesforce to create large language models trained on text scraped from the web. These language models keep growing larger. The largest version of BERT, a language model Google released in 2018, had 340 million parameters, the adjustable building blocks of a neural network. GPT-3, released a year ago, has 175 billion parameters.
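As a rough illustration of that scale gap, the Python sketch below estimates Transformer parameter counts from the models' publicly reported layer counts and hidden sizes, using the common 12 × layers × d_model² back-of-envelope approximation. The formula and hyperparameters are assumptions for illustration, not figures from the article; the estimate ignores embeddings and biases, which is why BERT-Large comes out near 0.3 billion rather than the full 340 million.

```python
def approx_transformer_params(num_layers: int, d_model: int) -> int:
    # Per layer: ~4 * d_model^2 weights for attention (Q, K, V, output)
    # plus ~8 * d_model^2 for the feed-forward block (inner size 4 * d_model).
    return 12 * num_layers * d_model ** 2


# BERT-Large (2018): 24 layers, hidden size 1024 -> roughly 0.3 billion parameters
print(f"BERT-Large ~ {approx_transformer_params(24, 1024) / 1e9:.2f}B")

# GPT-3 (2020): 96 layers, hidden size 12288 -> roughly 174 billion parameters
print(f"GPT-3      ~ {approx_transformer_params(96, 12288) / 1e9:.0f}B")
```

The takeaway is that parameter counts grow with both depth and the square of the hidden width, which is how GPT-3 ends up roughly 500 times larger than BERT-Large.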
Such efforts still have a long way to go, however. In one recent test, the best model succeeded only 14 percent of the time on introductory programming challenges compiled by a group of AI researchers.