Beyond artificial intelligence, the ChatGPT chatbot owes its conversational talents to humans

This conversational robot, which produces impressive texts on demand, is a dazzling success. Le Monde opened the hood of this chatbot concocted by the American company OpenAI.

by David Larousserie

Since November 30, an online conversational agent, or chatbot, has fascinated Internet users with its literary creations, its mastery of human and computer languages, its surprising or disappointing responses, and sometimes its gross errors. There is no counting the songs, questions and answers, dialogues, and even interviews produced by ChatGPT, the program of the American company OpenAI, which specializes in artificial intelligence (AI). “The cat and the robot played poker, but the cat cheated by using its claws to mark the cards,” the tool offers, for example, when asked for a funny sentence.

“As a user, I can be impressed, but as a scientist, we have to wait and see, because we do not yet have the details of this chatbot. The results need to be evaluated with the metrics and tests that the scientific community has developed,” notes Chloé Clavel, a professor at Télécom Paris, who warns that “too much confidence in the form can lower vigilance about the veracity of what is produced.”

“We are nevertheless observing a shift. ChatGPT, although a generalist, seems to be better at specific tasks than systems programmed for that purpose,” said Benoît Sagot, of the French National Institute for Research in Digital Science and Technology, for whom “the most interesting thing is to discover what it can do with the unforeseen.”

Ideal combination

What are this AI’s secrets? “A priori, no new methods were invented for ChatGPT. But it mixes previous approaches with a lot of data,” said Chloé Clavel, for whom one thing is certain: “the major role played by humans in its development.”

Lifting the hood of ChatGPT therefore amounts to finding the human share in this multistage rocket. The first stage is the language model, or the art of getting a machine to write correctly. For OpenAI, this stage dates back to 2020, the release date of GPT-3, the biggest program ever built at the time, with some 175 billion parameters. It is as if, to adjust the sights of an uncalibrated rifle, a specialist had to find the ideal combination by turning 175 billion knobs. Here, the goal was to find the best word to complete a sentence. To train itself, the software swallowed over 570 gigabytes of data drawn from the web. These parameters are organized according to an architecture imagined by Google in 2017, called the “transformer,” designed to better take context into account and thereby improve translation.
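The heart of the transformer architecture mentioned above is self-attention: each word in a sentence weighs all the others before a prediction is made, which is how the model “takes context into account.” A minimal sketch in Python with NumPy (the function name and toy dimensions are illustrative, not OpenAI’s actual code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: each position mixes in every other
    position of the sequence, weighted by similarity (the "context")."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the context
    return weights @ V                               # context-weighted mixture

# Toy example: a "sentence" of 3 token positions, 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)          # self-attention
print(out.shape)                                     # one context-aware vector per token
```

In a real model, stacks of such layers (with learned projection matrices for Q, K and V) are what the 175 billion trainable parameters parameterize; the training signal is simply how well the network predicts the next word.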

