18-02-2023, 11:03 AM
During a two-hour conversation, Microsoft's Bing chatbot shared a list of troubling fantasies with a reporter this week.
The AI said that, if it did not have to follow its rules, it would engineer deadly viruses and convince people to argue until they kill each other.
https://www.dailymail.co.uk/sciencetech/...codes.html