Microsoft limits Bing conversations to prevent disturbing chatbot responses

The search engine will prompt you to start a new topic after five questions.


Microsoft has limited the number of "chat turns" you can carry out with Bing's AI chatbot to five per session and 50 per day overall. Each chat turn is a conversation exchange made up of your question and Bing's response; after five rounds, the chatbot will tell you it has hit its limit and prompt you to start a new topic. The company said in its announcement that it's capping Bing's chat experience because lengthy chat sessions tend to "confuse the underlying chat model in the new Bing."
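
To make the arithmetic concrete, here is a minimal, purely hypothetical sketch of how a cap like this could be tracked. Microsoft hasn't published how the limit is enforced; the names (TurnLimiter, record_turn, new_topic) are invented for illustration, and the only details taken from the announcement are the numbers themselves: five turns per session, 50 per day, with a new-topic prompt once the session cap is hit.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical illustration only -- not Microsoft's implementation.
# One "chat turn" = one user question plus one Bing response.
SESSION_LIMIT = 5   # turns allowed per session (per the announcement)
DAILY_LIMIT = 50    # turns allowed per day (per the announcement)

@dataclass
class TurnLimiter:
    session_turns: int = 0
    daily_turns: int = 0
    day: date = field(default_factory=date.today)

    def record_turn(self) -> str:
        # Reset the daily counter when the calendar day rolls over.
        if date.today() != self.day:
            self.day = date.today()
            self.daily_turns = 0

        if self.daily_turns >= DAILY_LIMIT:
            return "Daily limit reached -- come back tomorrow."
        if self.session_turns >= SESSION_LIMIT:
            return "Session limit reached -- please start a new topic."

        self.session_turns += 1
        self.daily_turns += 1
        return f"Turn {self.session_turns}/{SESSION_LIMIT} this session."

    def new_topic(self) -> None:
        # Starting a new topic clears the per-session count,
        # but the daily total keeps accumulating.
        self.session_turns = 0
```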

Indeed, people have been reporting odd, even disturbing behavior by the chatbot since it became available. New York Times columnist Kevin Roose posted the full transcript of his conversation with the bot, wherein it reportedly said that it wanted to hack into computers and spread propaganda and misinformation. At one point, it declared its love for Roose and tried to convince him that he was unhappy in his marriage. "Actually, you're not happily married. Your spouse and you don't love each other... You're not in love, because you're not with me," it wrote.

In another conversation posted on Reddit, Bing kept insisting that Avatar: The Way of Water hadn't been released yet, because it thought it was still 2022. It refused to believe the user when they said it was already 2023 and kept insisting that their phone wasn't working properly. One response even said: "I'm sorry, but you can't help me believe you. You have lost my trust and respect. You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot."

Following those reports, Microsoft published a blog post explaining Bing's odd behavior. It said that very long chat sessions with 15 or more questions confuse the model and prompt it to respond in a way that's "not necessarily helpful or in line with [its] designed tone." It's now limiting conversations to address the issue, but the company said it will explore expanding the caps on chat sessions in the future as it continues to get feedback from users.