Good, this was a poorly conceived idea to build into an OS and turn on by default.
It's the same problem with a lot of AI tools right now: using them on your code, letting them look at your documents, and so on. Unless you self-host the model or use a 'private' service from Azure or AWS (which they say is safe...), who knows where that information ends up.
This is a major leak waiting to happen. It scares me to think what kind of data has been fed into ChatGPT or some code tool and is just sitting in a log somewhere in plaintext, waiting to be found later.
That's not what this is about. This is about the M365 tools you can add to Outlook/Teams etc. It needs separate licensing and isn't enabled by default; you have to pay for it and assign it to users/groups.
Microsoft's branding is seriously all over the place with all of this... They have at least 3 "Copilot" things now, I think? GitHub Copilot, the one built into Windows, and now this apparently. sigh.
Regardless, my other points still stand. All of these tools remain a leak waiting to happen.
It's a separate GPT instance specifically designed for each company's corporate data; it's not the same model the public uses. You could just as easily say Gmail is a leak waiting to happen because you're not hosting your own email server. This is a product specifically designed to work with enterprise data. The risk isn't any greater than Azure getting hacked or something.
Recently, one of my friends asked ChatGPT about an internal tool with a funny name, and ChatGPT seemed to know the meaning behind the name and the options to run the tool properly for his use case.
We googled around to see if there was any information on the web about the tool, and there was nothing on Google, which makes sense since it's a boring internal tool for a financial services company.
Of course, it could be a lucky guess, or it could be that an intern uploaded the manual to GPT :D
It would not be impossible for an LLM to hallucinate a correct answer based on the name of a well-named tool that had sane default choices in its architecture.