GitHub to boost productivity and security of Copilot in its next update

GitHub Copilot was released in June 2022 as an AI-powered programming assistant that uses OpenAI Codex to generate source code and function suggestions in real time in Visual Studio. The company says Copilot will soon become even more powerful and more secure.

According to GitHub representatives, the new artificial intelligence model, which will be rolled out to users this week, offers better suggestions in less time, further increasing the efficiency of software developers who use it.

Copilot will introduce a new capability called “fill-in-the-middle”, which uses a library of known code suffixes and leaves a gap for the AI tool to fill, ensuring greater relevance and consistency with the rest of the project’s code. In addition, GitHub has updated the Copilot client, reducing the number of unwanted suggestions by 4.5% and raising the overall code acceptance rate.
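To make the idea concrete, here is a minimal sketch of how a fill-in-the-middle prompt can be assembled: the model is given the code before and after the gap and asked to generate only the missing middle. The prompt layout, placeholder tags, and sample completion below are purely illustrative; GitHub has not published Copilot’s actual prompt format.

```python
# Sketch of the fill-in-the-middle (FIM) idea (illustrative only, not
# Copilot's real prompt format): the model sees the code before and after
# the cursor and generates only the missing part in between.

prefix = """def load_config(path):
    # Read a JSON config file and return it as a dict.
"""

suffix = """
    return config
"""

# A FIM-capable model is conditioned on both sides of the gap.
fim_prompt = "<prefix>" + prefix + "<suffix>" + suffix + "<middle>"

# A completion consistent with the suffix might look like this:
completion = """    with open(path) as f:
        config = json.load(f)"""

# Reassembled result as it would appear in the editor:
print(prefix + completion + suffix)
```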

“When we first launched GitHub Copilot for Individuals in June 2022, more than 27% of developers’ code files on average were generated by GitHub Copilot. Today, GitHub Copilot accounts for an average of 46% of a developer’s code across all programming languages, and for Java that figure reaches 61%,” said Shuyin Zhao, senior director of product at GitHub.

One of the main improvements in this Copilot update is the introduction of a new security vulnerability filtering system, which helps identify and block insecure suggestions such as hardcoded credentials, path injection, and SQL injection.

“The new system uses LLMs (large language models) to approximate the behavior of static analysis tools. And since GitHub Copilot runs advanced AI models on powerful compute resources, it is incredibly fast and can even detect vulnerable patterns in incomplete code fragments. This means insecure patterns are quickly blocked and replaced with safe alternatives,” Zhao said.
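For illustration, the kind of pattern such a filter targets looks like the following: the vulnerable variant builds a SQL statement by inserting user input directly into the query string, while the safer variant uses a parameterized query. This example was written for this article and is not taken from Copilot’s output.

```python
import sqlite3

# In-memory database with a minimal table, just for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def find_user_unsafe(username: str):
    # Vulnerable pattern: user input is concatenated into the SQL statement,
    # allowing SQL injection -- the kind of suggestion the new filter blocks.
    query = f"SELECT * FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(username: str):
    # Safer alternative: a parameterized query keeps the input as data.
    return conn.execute("SELECT * FROM users WHERE name = ?", (username,)).fetchall()

print(find_user_safe("alice"))           # [('alice',)]
print(find_user_unsafe("' OR '1'='1"))   # injection returns every row
```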

GitHub notes that Copilot can generate so-called “secrets”: keys, credentials, and passwords. However, these cannot be used in real projects, since they are entirely fictitious and will be blocked by the new filtering system.
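As a rough illustration of this “secrets” category (again, not an actual Copilot suggestion), a hardcoded credential versus reading the same value from the environment might look like this; the variable name MY_SERVICE_API_KEY is a placeholder invented for the sketch.

```python
import os

# Hardcoded credential: the pattern a secrets filter is meant to block.
# The value below is deliberately fake.
API_KEY_HARDCODED = "example-key-0000000000000000"

# Preferred pattern: read the secret from the environment at runtime.
# MY_SERVICE_API_KEY is a placeholder name used only for this example.
API_KEY = os.environ.get("MY_SERVICE_API_KEY")
if API_KEY is None:
    raise RuntimeError("MY_SERVICE_API_KEY is not set")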

An example of a key being blocked in real time

The appearance of “secrets” drew sharp criticism from the software development community. Many accused Microsoft of using large sets of publicly available data to train its AI models without paying much attention to security and copyright issues, including datasets that mistakenly contain “secrets”.


Based on the media reports cited above.