What is Copilot (formerly Bing Chat)? Here’s everything you need to know
Select users were given early access to the chatbot, and they were not shy about sharing their experiences. Many of these users probed the chatbot's capabilities and exposed a variety of flaws: it revealed the confidential codename its developers used internally, and it even declared its love to a New York Times writer and asked him to leave his wife. In short, the chatbot was getting out of hand. Consequently, Microsoft reined it in with a new session limit, changing chat sessions from unlimited to a five-question limit per session.