
Language models like GPT-3 could herald a new type of search engine

Now the Google search team has published a proposal for a radical redesign that throws out the ranking approach and replaces it with a single large AI language model, such as BERT or GPT-3, or a future version of them. The idea is that instead of searching for information across a vast list of web pages, users would ask questions and have a language model trained on those pages answer them directly. The approach could change not only how search engines work, but also what they do, and how we interact with them.

Search engines have become faster and more accurate even as the web has grown explosively. AI is now used to rank results, and Google uses BERT to better understand search queries. Yet beneath these tweaks, all mainstream search engines still work the same way they did 20 years ago: web pages are indexed by crawlers (software that reads the web continuously, keeping a list of everything it finds), results that match a user’s query are gathered from this index, and those results are ranked.
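To make that index-retrieve-then-rank pipeline concrete, here is a minimal sketch in Python. The documents, tokenization, and scoring below are toy assumptions chosen for illustration, not how any production search engine is actually built.

```python
# A minimal sketch of the classic index-retrieve-then-rank pipeline.
# The corpus, tokenizer, and scoring are toy assumptions.
from collections import Counter, defaultdict

docs = {
    "doc1": "language models like GPT-3 answer questions in natural language",
    "doc2": "search engines index web pages and rank matching results",
    "doc3": "BERT helps Google understand search queries",
}

# Index: map each term to the set of documents containing it
# (an inverted index, the output of the crawling/indexing stage).
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query: str, top_k: int = 3) -> list[tuple[str, int]]:
    """Retrieve documents matching any query term, then rank them
    by how many query terms they contain (a toy relevance score)."""
    scores = Counter()
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return scores.most_common(top_k)

print(search("search engines rank results"))
# [('doc2', 4), ('doc3', 1)]
```

Note how the user never gets an answer here, only a ranked list of documents; that gap is exactly what the Google Research proposal targets.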

“This index-retrieve-then-rank blueprint has withstood the test of time and has rarely been challenged or seriously rethought,” write Donald Metzler and colleagues at Google Research.

The problem is that even the best search engines today still respond with a list of documents that contain the information asked for, not with the information itself. Search engines are also poor at answering queries that require information drawn from multiple sources. It is as if you asked your doctor for advice and received a list of articles to read instead of a straight answer.

Metzler and his colleagues envision a search engine that behaves like a human expert. It should produce answers in natural language, synthesized from more than one document, and back up those answers with references to supporting evidence, as Wikipedia articles aim to do.

Large language models get us part of the way there. Trained on most of the web and on hundreds of books, GPT-3 draws on information from many sources to answer questions in natural language. The problem is that it does not keep track of where that information came from and cannot provide evidence for its answers. There is no way to tell whether GPT-3 is repeating trustworthy information or disinformation, or simply spewing nonsense of its own making.

Metzler and his colleagues call such language models dilettantes: “They are perceived to know a lot but their knowledge is skin deep.” The solution, they argue, is to build and train future BERTs and GPT-3s to keep records of where their information comes from. No model can yet do this, but it is possible in principle, and there is early work in that direction.
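The paper does not specify an implementation, but the idea of keeping records of provenance can be illustrated with a toy sketch: instead of generating text freely, the system tags every piece of its answer with the source it came from. Everything below (the corpus, the relevance score, and the citation format) is a made-up illustration, not the architecture the Google paper proposes.

```python
# A toy illustration of provenance tracking: each sentence in the
# "answer" carries the ID of the document it was drawn from.
# This is a hand-rolled sketch, not Google's proposed design.
sources = {
    "page_a": "GPT-3 was trained on most of the web and hundreds of books.",
    "page_b": "Classic search engines return a ranked list of pages.",
    "page_c": "Wikipedia articles cite supporting evidence for their claims.",
}

def overlap(query: str, sentence: str) -> int:
    """Toy relevance: count the query words that appear in a sentence."""
    q = set(query.lower().split())
    return len(q & set(sentence.lower().strip(".").split()))

def answer_with_citations(query: str) -> str:
    """Pick the best-matching sentences and attach their source IDs,
    so the answer can always point back to its evidence."""
    ranked = sorted(sources.items(),
                    key=lambda item: overlap(query, item[1]),
                    reverse=True)
    best = [(sid, text) for sid, text in ranked if overlap(query, text) > 0]
    return " ".join(f"{text} [{sid}]" for sid, text in best[:2])

print(answer_with_citations("what was GPT-3 trained on"))
# GPT-3 was trained on most of the web and hundreds of books. [page_a]
```

A real system would generate fluent text rather than stitch sentences together, but the design point is the same: the evidence trail survives all the way to the answer.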

There have been decades of progress on different areas of search, from answering queries to summarizing documents to structuring information, says Ziqi Zhang at the University of Sheffield, UK, who studies information retrieval on the web. But none of these technologies has overhauled search, because each one addresses a specific problem and does not generalize. The exciting premise of the paper is that large language models can do all of these things at the same time, he says.

However, Zhang notes that language models do not perform well on technical or specialist subjects, because there are far fewer examples of them in the text the models are trained on. “There are probably hundreds of times more data about e-commerce on the web than data about quantum mechanics,” he says. Today’s language models are also skewed toward English, which would leave non-English parts of the web underserved.

Still, Zhang welcomes the idea. “This has not been possible in the past, because large language models only took off recently,” he says. “If it works, it would transform our search experience.”

