
I'm genuinely curious how, realistically, they can reliably fix this.

My understanding is that the database/index Copilot uses had already crawled this file, so of course it didn't need to access the file itself to report the information in it.

But then, how do you fix that? Do you then tie audit reports to accessing parts of the database directly? Or are we instructing the LLM to do something like...

"If you are accessing knowledge pinky promise you are going to report it so we can add an audit log"

This really needs some communication from Microsoft on exactly what happened here and how it is being addressed. As of right now, it should raise alarm bells for any company using Copilot where people have access to sensitive data that needs to be strictly monitored.



It seems to me that the contents of the file cached in the index have to be dumped into the LLM's context at some point for them to show up in the result, so you can do the audit reporting at that point.
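Right — a rough sketch of what that could look like (hypothetical names, not Copilot's actual pipeline): the chunks pulled from the index get concatenated into the prompt somewhere, and that assembly step can emit one audit event per chunk it includes.

    def build_prompt(question, retrieved_chunks, log_access):
        """Assemble the LLM prompt and audit every cached chunk that enters it."""
        context_parts = []
        for chunk in retrieved_chunks:
            # Audit first, then include: content can't enter the context unlogged.
            log_access(doc_id=chunk["doc_id"], reason="prompt_context")
            context_parts.append(chunk["text"])
        return ("Context:\n" + "\n---\n".join(context_parts)
                + "\n\nQuestion: " + question)

The point is that the hook lives in the orchestration layer rather than in the system prompt, so a misbehaving or jailbroken model can't simply skip it.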



