mmathias's comments | Hacker News


Sonic Robo Blast 2 - https://www.srb2.org/ ... and especially ... SRB2Kart - https://wiki.srb2.org/wiki/SRB2Kart (!)


I made V.O.B.S., a system that lets you produce interactive TV shows using the internet as the transport medium, multiple webcams in different locations, and our server infrastructure.

https://vobs.io/


My project "UrlRoulette" was on the HN homepage for about 24 hours and received a huge traffic spike at the start. Since then, traffic has come from other sources such as Reddit, some blog posts and articles, and of course some search engines. After being on HN, UrlRoulette was featured in the German c't magazine and received a lot of traffic from their website and their print edition. Being featured on a few more sites also certainly helped push the site's PageRank on Google.

The project: https://urlroulette.net/

I actually wrote a post about being on the HN front page: https://hackernoon.com/urlroulette-24-hours-on-hacker-news-e...


Yes, sorry, didn't think about that. Changed the title.


Well, that could happen on any website that you visit. IMO it does not make a difference whether you know the link before you click it or not, because you still don't know what you will get. But I am thinking about implementing some sort of virus/malware scanning.
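In case it's useful, here is a rough sketch of what that check could look like against Google's Safe Browsing Lookup API (v4). To be clear, this is just one possible approach, not how UrlRoulette works today; the client name, threat types, and the environment variable for the API key are my own placeholder assumptions.

    import os
    import requests

    SAFE_BROWSING_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

    def looks_malicious(url: str) -> bool:
        # Ask Safe Browsing whether the URL appears on any known threat lists.
        payload = {
            "client": {"clientId": "urlroulette-example", "clientVersion": "0.1"},
            "threatInfo": {
                "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
                "platformTypes": ["ANY_PLATFORM"],
                "threatEntryTypes": ["URL"],
                "threatEntries": [{"url": url}],
            },
        }
        resp = requests.post(
            SAFE_BROWSING_ENDPOINT,
            params={"key": os.environ["SAFE_BROWSING_API_KEY"]},  # hypothetical env var
            json=payload,
            timeout=10,
        )
        resp.raise_for_status()
        # An empty JSON object in the response means no matches were found.
        return bool(resp.json().get("matches"))

A check like this only catches URLs already on Google's lists, so it would complement rather than replace the "use at your own risk" disclaimer.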


By the way, URLRoulette has a disclaimer that says

"Use at your own risk, the internet is a dangerous place!!"


Yes, it does, but I think there is almost no way around that. Many sites still use query parameters to show a specific blog entry, for example... :/


Their server could trivially query the URL and store the result hash. This would also give them a chance to scan for malicious content.


> Their server could trivially query the URL and store the result hash.

I'm not sure what you mean here? I've not seen a site tell you "We don't use this query parameter" if you stick an additional param on there.


I think the idea is to detect when different URLs contain the same content. That defends against duplicate entries like example.com/?foo and example.com/?bar (which are the same page).


Fetch the page, hash the contents, compare hashes.
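A minimal sketch of that in Python, assuming a SQLite table for submissions; the table name, schema, and exact-match dedup are my assumptions, not details from the site:

    import hashlib
    import sqlite3
    import urllib.request

    conn = sqlite3.connect("urlroulette.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS submissions (url TEXT, content_hash TEXT)"
    )

    def content_hash(url: str) -> str:
        # Fetch the page and hash its bytes; the timeout keeps slow hosts from blocking submission.
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
        return hashlib.sha256(body).hexdigest()

    def submit(url: str) -> bool:
        # Returns False if another URL already served identical content,
        # e.g. example.com/?foo and example.com/?bar.
        h = content_hash(url)
        if conn.execute(
            "SELECT 1 FROM submissions WHERE content_hash = ?", (h,)
        ).fetchone():
            return False
        conn.execute(
            "INSERT INTO submissions (url, content_hash) VALUES (?, ?)", (url, h)
        )
        conn.commit()
        return True

One caveat: pages with rotating ads, timestamps, or CSRF tokens hash differently on every fetch, so an exact content hash will still let some effective duplicates through.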


Wow, HN front page! About one visit every two seconds! Crazy! :)


They get removed.


No, you put them in a database. Don't lie to us.


Yes, but they are no longer distributed to users. I thought that was the original question. I'm keeping them in the database mainly to check for spam/multiple submissions.


It's fine. OP is being unreasonable.


Of course the author is collecting everything submitted and doing whatever he wants with it. Haven't you read the terms of service and privacy policy? Probably not, because there are none, which usually means you can safely assume that everything is collected, stored, and exploited (or will be later).


Thank you, that sounds like a good idea!

