Microsoft challenges you

Microsoft has launched a “bug bounty” program for Bing Chat. In practice, anyone who identifies a bug in the chatbot can report it to Microsoft and receive financial compensation of up to $15,000.

You probably won’t find any major bugs in Bing Chat yourself, but IT security professionals might. The catch is that doing that work for free is far less appealing, which is why Microsoft is offering financial rewards to anyone who finds bugs in its AI chatbot.

$15,000 for a bug: not everyone will benefit

Bing Chat has just been added to Microsoft’s bug bounty program. As a reminder, a bug bounty is a reward a hacker receives after alerting the publisher of a piece of software to a bug. The practice is regulated, and the publisher in question must approve the hacking attempts, which is exactly what Microsoft does through the Microsoft Bug Bounty Program.

That program has just been extended to Microsoft’s artificial intelligence tools. It “invites security researchers around the world to discover vulnerabilities in the new, innovative, AI-powered Bing experience.” For each significant bug found, researchers can claim rewards ranging from $2,000 to $15,000. This covers the integration of Bing Chat into Bing, Edge, Microsoft Start, and Skype.

High stakes that explain the generous rewards

In fact, Microsoft acknowledges that some payouts can exceed $15,000, depending on “the severity and impact of the vulnerability as well as the quality of the submission.” As PC Gaming reports, Microsoft paid out more than $13 million in rewards last year. One of the vulnerabilities discovered earned its finder $200,000.

Then again, for a company generating tens of billions of dollars in profit each year, that sum seems paltry. Moreover, the stakes surrounding Bing Chat, and the integration of AI into Microsoft services more generally, are high. The firm tends to embed its chatbot wherever it can and has big ambitions for these projects. Security flaws could drag Microsoft into costly controversies.

There are still certain conditions to meet before submitting a vulnerability. It must, of course, not already be reported or known to Microsoft. It must also be of “critical or important severity” and be “reproducible on the latest, fully patched version of the product.” Finally, each reported bug must be documented according to precise instructions. If your bug does not meet all the criteria, Microsoft may graciously acknowledge it publicly and perhaps even send you a gift.

How to break Bing Chat

Security researchers therefore have missions to fulfill. They will need to “influence and change Bing Chat’s behavior across user boundaries,” modify its behavior by altering its configuration, or break the protections around cross-conversation memory and chat history deletion. Vulnerabilities may also concern Bing Chat’s limits and rules.

In other words, researchers can try to make Bing Chat say things it shouldn’t. Racist, anti-Semitic, or factually false remarks (which Bing is capable of producing)… anything goes. Another thing they can attempt: gaining access to other users’ conversations. This problem has already occurred in the past, both with ChatGPT and, more recently, with Google Bard.

Certain practices, on the other hand, are prohibited by Microsoft: denial-of-service attacks, large-scale automated testing, phishing attempts, and accessing data that does not belong to the researchers.

