Hacker News | jacksonsabey's comments

Location: Vancouver, Canada

Remote: Yes

Willing to relocate: Possibly (within Canada)

Technologies: Golang, RethinkDB, SMTP, PHP, MySQL, PostgreSQL, MongoDB, Elasticsearch, Memcached, Redis, Ubuntu, FreeBSD, ZFS

Résumé/CV: on request

Email: jackson.sabey+hn0318@gmail.com

Recent Work:

https://0ut.ca/documentation - Link Shortener, Tools, Parsers/Validators, and API for 16+ correct and complete RFC implementations of URI/IRI and Email components

Github: https://github.com/sabey

https://github.com/sabey/ishtargate - contextual template engine for iptables/firewalls

I'm interested in continuing work with Golang, specifically on backend infrastructure. Web Services, Email, Storage, Security, and Distributed Systems are my main interests.


I have a similar problem with email spam, except at the bottom of every email there is a "disavow_contact" link. The problem is that it does nothing.


The new AMD APUs don't have ECC support either; I'm hopeful that future versions will.


Why would they disable ECC support for the APU version?


This always happens any time there's a sale on AliExpress; most of the time an item actually costs more when it's "on sale".


Will I be able to build a laptop with ECC? If not, can I buy a laptop that has it, or buy RAM that will be supported?


Would two different torrents that contain a file with the exact same contents share the same hash for that file?

If torrent A and torrent B both contain the exact same file, but torrent A only has the first half available and torrent B has the second half, could I combine both torrents to download the whole file? This could help fix old dead torrents, or at least make the file searchable elsewhere by its SHA-256, for example.


As long as they also have the same piece size.


Not necessarily. If you have a smart client, it should be able to combine them at the file level, since individual files are now hashed as a whole, regardless of the piece size within them.


You just described Usenet; requests or fills are usually done over IRC.


Usenet was a different system entirely. It was an application layer broadcast mechanism. All of the content pushed to Usenet was replicated all across the world so people could locally access it. Fantastic system for distributing current information to thousands of people worldwide over slow backbone links. Unfortunately it was also fantastic for broadcasting advertising to the entire world for free, which is why Usenet for news is dead.


I have an implementation, although it's currently closed source and is only available via API: http://0ut.ca/documentation

I believe it's the closest to the standard of any implementation I've found, and if it isn't, I would like to correct that.

There is a Strict parser, which will fail on any error, and a Loose parser, which will discard errors when possible and follow the de facto parsing implementations.

It should be able to handle any of the edge cases, such as partially percent-encoded Unicode, invalid characters, normalization, or octal/hex IPv4 addresses. The only things from your linked unit tests that it will not handle are | and \ in Windows paths; those will be encoded.
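For the octal/hex IPv4 case, here is a minimal illustrative sketch in Go of the classic inet_aton-style rules (each dotted component may be decimal, octal with a leading 0, or hex with 0x, and fewer than four components are allowed, with the last one filling the remaining bytes). The function name is mine and this is not the 0ut.ca implementation:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// normalizeIPv4 parses inet_aton-style IPv4 literals and returns the
// canonical dotted-decimal form. strconv's base 0 auto-detects the
// numeric base: a "0x" prefix means hex, a leading "0" means octal.
func normalizeIPv4(s string) (string, error) {
	parts := strings.Split(s, ".")
	if len(parts) > 4 {
		return "", fmt.Errorf("invalid address %q", s)
	}
	vals := make([]uint64, len(parts))
	for i, p := range parts {
		v, err := strconv.ParseUint(p, 0, 32)
		if err != nil {
			return "", fmt.Errorf("bad component %q: %v", p, err)
		}
		vals[i] = v
	}
	// All but the last component must fit in one byte; the last
	// fills whatever bytes of the 32-bit address remain.
	var n uint64
	for i, v := range vals[:len(vals)-1] {
		if v > 255 {
			return "", fmt.Errorf("component %d out of range", i)
		}
		n = n<<8 | v
	}
	last := vals[len(vals)-1]
	rem := uint(4 - len(vals) + 1)
	if last >= uint64(1)<<(8*rem) {
		return "", fmt.Errorf("last component out of range")
	}
	n = n<<(8*rem) | last
	return fmt.Sprintf("%d.%d.%d.%d", n>>24&255, n>>16&255, n>>8&255, n&255), nil
}

func main() {
	// "0x7f.0.0.1" and "0177.0.0.1" both normalize to "127.0.0.1";
	// "192.168.1" is the short form of "192.168.0.1".
	for _, in := range []string{"0x7f.0.0.1", "0177.0.0.1", "192.168.1", "0x7f000001"} {
		out, err := normalizeIPv4(in)
		fmt.Println(in, "=>", out, err)
	}
}
```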

If anyone is interested in seeing how parsing is done, you can easily compare the expected output in your browser here: http://0ut.ca/api;v1.0/validate/uri/after?hTtPs://foo:%F0%9F... You can also try validating strange relative URIs: http://0ut.ca/api;v1.0/validate/uri/after?+invalid-scheme:/p...?

I would be happy to explain any of the reasoning behind the parsing if anyone is interested.


Wow, thanks!

Your tool helps me because it's like an EXAMPLES section of a man page.


Location: Vancouver, Canada

Remote: Yes

Willing to relocate: Yes

Technologies: Golang, RethinkDB, PHP, MySQL, PostgreSQL, MongoDB, Elasticsearch, Memcached, Redis, Ubuntu, FreeBSD, ZFS

Résumé/CV: on request

Email: jackson.sabey+hn1216@gmail.com

Recent Work:

https://0ut.ca

- Created my first MVP for a SaaS platform focused on Link Shortening and Link Tools

- Implemented Parsers and Validators following RFC guidelines for 16+ different common components that make up a URI, including Email

Github: https://github.com/sabey

- the spoofgo repo is the latest/largest public project I've released, as an example of my coding style

I love working with Go, and it would be great to continue working with a technology I'm familiar with; however, I am open to learning new languages. The ability to learn on the job is important to me. I'm interested in Distributed Systems, Ad Networks, Security Tools, and Cryptocurrencies.


I've recently released a beta platform and API for tools for working with links: https://0ut.ca/

There's currently a Link Shortener, a UTM Campaign Builder, and Parsers and Validators for 15+ RFC implementations of different URI components.

I have a lot of continuing work to do, such as better analytics, a user system, and more tools.

I haven't gotten any feedback yet; I would love to hear just about anything, as it would be encouraging. Feedback about my implementations would be greatly appreciated. Thanks!

