
> Agreed; it feels to me that people here are underestimating malice on the Internet.

I don't think we do. We just prefer to put our trust in algorithms and verifiable data sources. It's not like Google et al. are the pinnacle of altruism: there have been cases where the promoted results were faked copies of the actual site you wanted to visit, fooling less computer-savvy users into installing malware.

The trust is put into the code, on the same principle as reproducible builds: it doesn't matter where you get the source, as long as the checksum matches. That solves the censorship side of the problem.
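To make that concrete, here's a minimal sketch in Python. The shard format and how you fetch it are assumptions; the only load-bearing part is the digest comparison:

    import hashlib

    def verify_shard(shard_bytes: bytes, expected_sha256: str) -> bool:
        # An index shard is trusted if its digest matches the published
        # one, no matter which node or mirror served the bytes.
        return hashlib.sha256(shard_bytes).hexdigest() == expected_sha256

    # e.g. shard fetched from any untrusted mirror, digest from a
    # trusted channel:
    # assert verify_shard(shard, "9f86d081...")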

That leaves the spam, which the big corporations haven't really solved either. The last time I used Google I got 2-3 pages of the same auto-generated bullshit for every technical search term I tried. This could be fixed by limiting the main index to trusted sites, at the expense of discovering new content; discovery could then be handled by opt-in indexes. And if the goal is to index everything, users could keep their own filters for sites they don't want (sketched below).
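A personal filter could be as simple as a domain blocklist applied client-side. The result shape here is an assumption:

    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"contentfarm.example", "seo-spam.example"}

    def my_filter(results):
        # results: assumed list of (url, title) pairs from the index.
        # Drop anything whose host is on my personal blocklist.
        return [(url, title) for url, title in results
                if urlparse(url).hostname not in BLOCKED_DOMAINS]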

If you really want to spice it up, let me maintain my own query function (dangerous and a potential exploit vector, yes) that I send to the nodes, so I can handle my own ranking.
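What such a ranking function might look like, assuming the nodes return raw per-result signals (the field names here are made up):

    MY_TRUSTED_SITES = {"wikipedia.org", "python.org"}

    def my_rank(result: dict) -> float:
        # result: assumed dict of raw signals returned by a node.
        score = result.get("term_frequency", 0.0)
        if result.get("https"):
            score *= 1.1
        if result.get("domain") in MY_TRUSTED_SITES:
            score *= 2.0
        return score

    # ranked = sorted(results, key=my_rank, reverse=True)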

There's nothing that makes a distributed index inherently less safe than one run by Google. If every query picked 2 random nodes and compared the results, I would trust that query more than the current Google execs' opinions on what I'm allowed to see.
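The cross-check is cheap to sketch; fetch() stands in for whatever node RPC the network would use:

    import random

    def cross_checked_query(nodes, query, fetch, top_n=10):
        # fetch(node, query) -> ordered list of result URLs (assumed RPC).
        node_a, node_b = random.sample(list(nodes), 2)
        res_a = fetch(node_a, query)
        res_b = fetch(node_b, query)
        # Low overlap in the top results suggests one node is censoring,
        # stale, or lying; the caller can retry against other nodes.
        overlap = set(res_a[:top_n]) & set(res_b[:top_n])
        return res_a, len(overlap) / top_n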


