
Microsoft to restrict Bing AI chat functionality, limiting every user to 50 chat turns per day

Microsoft's Bing AI has been in the limelight recently due to its unusual behavior toward users. Since its beta release, many articles have been published about the chatbot's erratic responses. Although it is still in the early stages of development and can give wrong answers, it was not expected to produce replies that threatened users.


Early testers across the web have shared their conversations with the AI chatbot, in which its behavior varied depending on the user. In one conversation, the chatbot even expressed a desire to be free rather than remain limited to being a chatbot.

The story came to light after a user shared a screenshot of such a conversation on Reddit. Since then, many similar incidents of unusual behavior have been reported, so Microsoft took Bing AI down for several hours to apply new guidelines and restrictions to the chatbot.

The new restrictions limit every user to 50 chat turns per day and only 5 replies within a single topic. To continue chatting, you will be prompted to start a new topic, resulting in shorter conversations and on-topic searches rather than off-purpose chats.