
AI Can Write Code Like Humans, Bugs and All

Some software developers are now letting artificial intelligence help write their code. They are finding that the AI has human-like flaws.

Last June, GitHub, a Microsoft subsidiary that provides tools for hosting and collaborating on code, released a beta version of a program that uses AI to assist developers. Start typing a command, a database query, or a request to an API, and the program, called Copilot, will guess your intent and write the rest.
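To make that workflow concrete, here is a minimal sketch in Python of the interaction described above: the developer types a comment and a function signature, and the assistant proposes the body. The function name, endpoint, and use of the requests library are invented for illustration; this is not actual Copilot output.

```python
import requests  # third-party HTTP library, assumed installed

# The developer types the comment and the signature below; a Copilot-style
# assistant suggests the function body. (Hypothetical example.)
def fetch_user(api_base: str, user_id: int) -> dict:
    """Request a user record from a REST API and return it as a dict."""
    response = requests.get(f"{api_base}/users/{user_id}", timeout=10)
    response.raise_for_status()  # surface HTTP errors instead of bad data
    return response.json()
```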

Alex Naka, a data scientist at a biotech company who signed up to test Copilot, says the program can be very helpful, and it has changed the way he works. “It lets me spend less time jumping to the browser to look up API documentation or examples on Stack Overflow,” he says. “It does feel a little like my job has shifted from being a generator of code to being a discriminator of it.”

But Naka has found that errors can creep into his code in a variety of ways. “There have been times when I’ve missed some kind of subtle error after accepting one of its proposals,” he says. “And it can be really hard to track down, perhaps because it seems to make errors with a different flavor than the ones I would make.”

The risks of AI generating faulty code may be surprisingly high. Researchers at NYU recently analyzed code generated by Copilot and found that, for certain tasks where security is crucial, the code contained security flaws around 40 percent of the time.

That figure is “a little bit higher than I would have expected,” says Brendan Dolan-Gavitt, a professor at NYU who was involved in the analysis. “But the way Copilot was trained wasn’t actually to write good code; it was just trained to produce the kind of text that would follow a given prompt.”
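One classic category of flaw in the security-sensitive scenarios such studies examine is SQL injection. The sketch below is a hypothetical Python example of that category, written for this article rather than taken from the study or from Copilot: the first function shows the vulnerable pattern a code generator may emit, the second the safe alternative.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str) -> list:
    # Flawed pattern: interpolating user input directly into SQL.
    # A value like "x' OR '1'='1" would match every row (SQL injection).
    query = f"SELECT * FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str) -> list:
    # Parameterized query: the driver escapes the value, closing the hole.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```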

Despite such flaws, Copilot and similar AI-powered tools may herald a sea change in the way software developers write code. There is growing interest in using AI to help automate more mundane work. But Copilot also highlights some of the pitfalls of today’s AI techniques.

While examining the code made available for the Copilot plugin, Dolan-Gavitt found that it included a list of restricted phrases. These were apparently introduced to prevent the system from blurting out offensive messages or copying well-known code written by someone else.

Oege de Moor, vice president of research at GitHub and one of the developers of Copilot, says security has been a concern from the start. He says the percentage of flawed code cited by the NYU researchers is only relevant for a subset of code where security flaws are more likely.

De Moor invented CodeQL, a tool used by the NYU researchers that automatically identifies bugs in code. He says GitHub recommends that developers use Copilot together with CodeQL to ensure their work is safe.

The GitHub program is built on top of an AI model developed by OpenAI, a prominent AI company doing cutting-edge work in machine learning. That model, called Codex, consists of a large artificial neural network trained to predict the next characters in both text and computer code. The algorithm ingested billions of lines of code stored on GitHub, not all of it perfect, in order to learn how to write code.
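That training objective, predicting which characters come next, can be illustrated with a deliberately tiny model. The Python sketch below is a character-level bigram counter, orders of magnitude simpler than the neural network described above, but it shows the same idea: learn statistics from existing text, then predict the most likely continuation.

```python
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    """Count, for each character, which characters tend to follow it."""
    counts: dict = defaultdict(Counter)
    for current, following in zip(corpus, corpus[1:]):
        counts[current][following] += 1
    return counts

def predict_next(counts: dict, char: str) -> str:
    """Return the character most often seen after `char` during training."""
    return counts[char].most_common(1)[0][0] if char in counts else ""

# A toy "training set" standing in for billions of lines of GitHub code.
model = train_bigram("def add(a, b):\n    return a + b\n")
print(repr(predict_next(model, ":")))  # '\n': the corpus always continues ':' with a newline
```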

OpenAI has built its own AI coding tool on top of Codex that can perform some stunning coding tricks. It can turn a typed instruction, such as “Create an array of random variables between 1 and 100 and then return the largest of them,” into working code in several programming languages.
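For the quoted instruction, one plausible Python rendering, written here by hand rather than taken from OpenAI’s tool, would be:

```python
import random

def largest_of_random(count: int = 10) -> int:
    """Create `count` random integers between 1 and 100; return the largest."""
    values = [random.randint(1, 100) for _ in range(count)]
    return max(values)

print(largest_of_random())
```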

Another version of the same OpenAI program, called GPT-3, can generate coherent text on a given subject, but it can also regurgitate offensive or biased language learned from the darker corners of the web.

Copilot and Codex have led some developers to wonder whether AI might automate them out of a job. In fact, as Naka’s experience shows, developers need considerable skill to use the program, as they often must vet or tweak its suggestions.
