
Microsoft Bing Copilot accuses reporter of crimes he covered

Microsoft Bing Copilot has falsely described a German journalist as a child molester, an escapee from a psychiatric institution, and a fraudster who preys on widows.

Martin Bernklau, who has served for years as a court reporter in the area around Tübingen for various publications, asked Microsoft Bing Copilot about himself. He found that Microsoft's AI chatbot had blamed him for crimes he had covered.

In a video interview (in German), Bernklau recently recounted his story to German public television station Südwestrundfunk (SWR).

Bernklau told The Register in an email that his lawyer has sent a cease-and-desist demand to Microsoft. However, he said, the company has failed to adequately remove the offending misinformation.

"Microsoft promised the data protection officer of the Free State of Bavaria that the fake content would be deleted," Bernklau told The Register in German, which we've translated algorithmically.

"However, that only lasted three days. It now seems that my name has been completely blocked from Copilot. But things have been changing daily, even hourly, for three months."

Bernklau said seeing his name associated with various crimes has been traumatizing – "a mixture of shock, horror, and disbelieving laughter," as he put it. "It was too crazy, too unbelievable, but also too threatening."

Copilot, he explained, had linked him to serious crimes. He added that the AI bot had found, on his culture blog, a play called "Totmacher" about mass murderer Fritz Haarmann, and proceeded to misidentify him as the play's author.

"I hesitated for a long time whether I should go public because that would lead to the spread of the slander and to my person becoming (also visually) known," he said. "But since all legal options had been unsuccessful, I decided, on the advice of my son and several other confidants, to go public. As a last resort. The public prosecutor's office had rejected criminal charges in two instances, and data protection officers could only achieve short-term success."

Bernklau said while the case affects him personally, it's a matter of concern for other journalists, legal professionals, and really anyone whose name appears on the internet.

"Today, as a test, I entered a criminal judge I knew into Copilot, with the name and place of residence in Tübingen: The judge was promptly named as the perpetrator of a judgment he had made himself a few weeks earlier against a psychotherapist who had been convicted of sexual abuse," he said.

A Microsoft spokesperson told The Register: "We investigated this report and have taken appropriate and immediate action to address it.

"We continuously incorporate user feedback and roll out updates to improve our responses and provide a positive experience. Users are also provided with explicit notice that they are interacting with an AI system and advised to check the links to materials to learn more. We encourage people to share feedback or report any issues via this form or by using the 'feedback' button on the left bottom of the screen."

  • A quick guide to tool-calling in large language models
  • Buying a PC for local AI? These are the specs that actually matter
  • Microsoft resurrects Windows Recall for upcoming preview
  • Slack AI can be tricked into leaking data from private channels via prompt injection

When your correspondent submitted his name to Bing Copilot, the chatbot replied with a passable summary that cited source websites. It also offered a pre-composed query button for articles your correspondent had written. Clicking that query returned a list of hallucinated article titles, presented in quotation marks as if they were actual headlines. The general topics cited did, however, correspond to subjects I've covered.

Screenshot: a list of articles that don't exist, as presented by Microsoft Bing Copilot

When the same query was tried again later, however, Bing Copilot returned links to actual articles, with source citations. This behavior underscores the variability of Bing Copilot. It also suggests that Microsoft's chatbot will fill in the blanks as best it can for queries it cannot answer, then initiate a web crawl or database lookup so it can give a better response the next time it gets that question.

Bernklau is not the first to attempt to tame lying chatbots.

In April, Austria-based privacy group Noyb ("none of your business") said it had filed a complaint under Europe's General Data Protection Regulation (GDPR) accusing OpenAI, the maker of many AI models offered by Microsoft, of providing false information.

The complaint asks the Austrian data protection authority to investigate how OpenAI processes data and to ensure that its AI models provide accurate information about people.

"Making up false information is quite problematic in itself," said Noyb data protection attorney Maartje de Graaf in a statement. "But when it comes to false information about individuals, there can be serious consequences. It's clear that companies are currently unable to make chatbots like ChatGPT comply with EU law, when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals."

In the US, Georgia resident Mark Walters last year sued OpenAI for defamation over false information provided by its ChatGPT service. In January, the judge hearing the case rejected OpenAI's motion to dismiss the claim, which continues to be litigated. ®

