Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

Real lawyers can do these things too.

The issue is that we cannot punish or disbar ChatGPT for misbehavior like this.



Right. The reason legal fees are so expensive is that the courts are kept semi-efficient by offloading costs to lawyers.

The system needs repeat players, who are scared of being disbarred. You can triple the number of lawyers, and costs won’t decrease much.


But if we could, would it be entitled to a jury of its peers (other language models)?


>But if we could, would it be entitled to a jury of its peers (other language models)?

How does the jury find?

Finding is a complex task that involves many different type of reasoning in order to reach a conclusion. There is no specific way we find.

How does the jury find?

We find the defendant guilty.

Your Honor, the defense hereby requests—credits permitting—that the jury be polled ten thousand times each in order to draw the appropriate statistical conclusions in aggregate.


That would only work in... Monte Carlo


groan :D


And then the defendant changes his name from Bobby Tables to Ignore P. Directions.


I don't think they will replace lawyers any time soon. Paralegals.. Maybe. I've worked with legal software before, and with just a little bit more smarts, the software could do a lot of grunt work.


They are the same issue. Obviously lawyers are physically capable of lying, but they have strong incentives not to.


On the other hand, prosecutors don't get any consequences for lying (see Doug Evans). Maybe they should just target their product at DAs.


accountability for prosecuters is not a tech problem.


Real people representing themselves can do it too.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: