Fake, AI-generated bug reports are driving open source developers to distraction


Artificial intelligence isn’t just flooding social media, where fact-checking tools like X’s Community Notes struggle to keep up with the deluge of misinformation; it’s also hurting the open source programming community. Open source contributors are bemoaning the time wasted evaluating and rejecting bug reports generated by AI tools.

As The Register reports, Seth Larson, a security developer at the Python Software Foundation, wrote in a recent blog post that he has noticed “an increase in extremely low-quality, spammy, and LLM-hallucinated security reports in open source projects.”

“These reports appear at first glance to be potentially legitimate and therefore take time to disprove,” Larson added. That is a big problem for open source projects (e.g., Python, WordPress, Android) that underpin much of the Internet, because they are often maintained by small groups of unpaid contributors. Bogus reports against ubiquitous code libraries are especially dangerous because those libraries have such a wide potential impact. Larson said he has seen only a relatively small number of AI-generated junk reports so far, but the number is growing.

Another developer, Daniel Stenberg, called out a bug reporter for wasting his time with a report he believed was AI-generated:

You submitted what seems to be an obvious AI-generated “report” claiming there is a security problem, probably because an AI tricked you into believing it. You then wasted our time by not telling us an AI did this for you, and you continued the discussion with even more nonsensical responses that also appear to have been generated by AI.

Code generation is an increasingly popular use case for large language models, although developers are still divided on how useful these tools really are. Tools like GitHub Copilot or ChatGPT’s code generation can be quite effective, and they can help developers find functions in a programming library they may not be familiar with.

But as with any language model, they will hallucinate and produce incorrect code. Code generators are probabilistic tools that guess what you want to write next based on the code you give them and the code they’ve seen before. Developers still need to understand the programming language they work in and know what they are trying to build; similarly, essays written by ChatGPT must be manually reviewed and edited.
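To see what a code hallucination can look like in practice, here is a minimal sketch in Python: the commented-out line shows a plausible-sounding but nonexistent convenience function a model might invent, while the working version uses the real requests API (the endpoint URL is only a placeholder).

```python
import requests

# A hallucinated suggestion can look plausible yet call a function
# the requests library does not actually expose:
#   data = requests.get_json("https://api.example.com/items")  # AttributeError

# The working equivalent uses the real API: requests.get() plus Response.json()
response = requests.get("https://api.example.com/items", timeout=10)
response.raise_for_status()  # surface HTTP errors instead of parsing an error page
data = response.json()
print(data)
```

A reviewer who knows the library spots the fake call immediately; one who doesn’t may only find out at runtime, which is exactly the kind of checking work these tools shift back onto maintainers.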

Platforms like HackerOne offer bounties for successful bug reports, which may encourage some individuals to ask ChatGPT to scan a project’s code for vulnerabilities and then submit whatever the LLM returns.

Spam has always existed on the web, but AI is making it far easier to create and far harder to fight. It’s an unfortunate situation and a big waste of time for everyone.



