Hacker News | bogidon's comments

Chiming in with something I've posted in the past that I've found reduced about 80% of this spam mail and only takes 10 minutes to set up: https://news.ycombinator.com/item?id=31070730



In my experience lots of unwanted mail comes from credit card offers, insurance, etc. It turns out that unless you have an account with them, all these financial service companies get their prospective mailing data through the credit bureaus. However, a 2003 law required the bureaus to create an opt-out mechanism, which is available here: https://www.optoutprescreen.com/

Doing this has cut out 99% of my spam paper mail for the last year. Would highly recommend for sanity and as an easy way to cut down on environmental impact. You can opt out for up to 5 years through the website, or permanently by writing a paper letter. I did the latter through one of the "you write digitally, we'll send a physical letter" services you can find on Google, and it's been great.


One of the problems I have with websites for official interaction with US government stuff is that, a lot of times, it's some random .com website or a 3rd party with no direct integration with the government entity they're interfacing with. This site has all the potential hallmarks of being a scam, so if I just found this on my own I would have a hard time believing it would work after giving it my personal info.


For sure. Except I don't think this site provides "official interaction with US government stuff". My expectation is that the government was involved only to create the regulation, and the implementation was left to the credit bureaus. Which of course are incentivized to make the conversion rate of this site as low as possible. So looking off-puttingly sketchy is probably a good thing for them.


I just spent two months apartment searching in NYC. I talked to many brokers but not one was interested in actively searching on my behalf. The market is incredibly disadvantageous to renters right now. Maybe parent is sharing experience from a different time or has broker connections I did not.

As an aside I also tried to automate my apartment hunt. The main thing that matters in NYC is time to respond. Unfortunately Zillow, StreetEasy, etc are not very easy to automate on the messaging side due to bot countermeasures. It was an incredibly time consuming, manual process. Happily found a great place though.


I have a poor understanding of electrical engineering so I am not sure this would actually work. But I've wondered if one could make a fairly cheap gadget for turning a decent USB-PD supply into a somewhat useful hobbyist variable voltage DC power supply.

Especially with USB-PD 3.1 which has an adjustable voltage supply mode [1] so you are not constrained to only a handful of power profiles. I think in theory you could offer 5V, 9V, and 15-48V.

It would be cool because it would be much easier to carry (keep in your backpack) and I imagine cheaper than a benchtop power supply so long as you have access to a capable USB-C charger. But I'm not sure what you'd be missing out on versus a proper benchtop.

[1] https://www.usb.org/usb-charger-pd
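For a rough feel of what such a gadget would have to decide, here is an illustrative sketch. The profile list and helper names are my assumptions, not a real USB-PD stack; an actual sink negotiates all of this via PD protocol messages with the charger.

```javascript
// Common fixed PD profiles (volts) plus the PD 3.1 EPR adjustable range
// from the comment above. Values are illustrative, not from a real charger.
const fixedProfiles = [5, 9, 15, 20];
const avsRange = { min: 15, max: 48 };

// Decide how a hypothetical sink could reach a target voltage.
function pickSupply(targetVolts) {
  if (fixedProfiles.includes(targetVolts)) {
    return { mode: 'fixed', volts: targetVolts };
  }
  if (targetVolts >= avsRange.min && targetVolts <= avsRange.max) {
    return { mode: 'avs', volts: targetVolts };
  }
  return null; // not reachable without extra regulation on the sink side
}
```

So something like 12 V would fall in the gap and need additional regulation, which is part of why a pure pass-through gadget is limited.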


Like OP, I have also been tinkering with USB-C PD and found another hobbyist tool on Tindie that does exactly what you describe! You can specify the voltage and current needed, and it'll then make sure you get that power from a compatible charger.

This is the one I got a while ago and it works great: https://www.tindie.com/products/clarahobbs/pd-buddy-sink/ though there are several other options out there.


Oh that’s awesome. Thanks for sharing.


You might be better off getting a fixed power profile from the brick and regulating down to your desired voltage separately. While it's possible the brick will be well regulated, it's likely to have some noise and output voltage variability. Even for a hobbyist, the DC supply should be clean and well regulated, or you might be stuck debugging your hobby design only to realize it was a power issue all along.

A small board/box with a DC to DC regulator, knob, and display, could still be quite a bit smaller than a big bench supply, and provide reasonable performance. USB-C in (at a fixed power profile) to an output from a variable buck supply, for example.
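If you go the fixed-profile-plus-regulator route, the first-order math for the buck stage is simple: for an ideal (lossless) buck converter, Vout = D * Vin. A back-of-the-envelope sketch (illustrative only; a real design also has to account for efficiency, ripple, and minimum duty cycle):

```javascript
// Required duty cycle for an ideal buck converter: D = Vout / Vin.
function buckDutyCycle(vin, vout) {
  if (vout > vin) throw new Error('a buck converter can only step down');
  return vout / vin;
}

// e.g. a fixed 20 V PD profile regulated down to 3.3 V needs D = 0.165
const d = buckDutyCycle(20, 3.3);
```

The wide 5-20 V input range of fixed PD profiles is well within what common buck controller ICs accept, which is why this approach tends to be practical.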



I still have an iPhone for now (am waiting for mobile Linux to become just a little more mature) but I’ve turned off iMessage. When I get asked about how I can be reached by IM I tell people they can find me on Signal.

Been doing this for about two years and now ~80% of my personal conversations are over Signal. And most of these friends are not very technical.

I love my little successful rebellion against the walled garden. And it will make my eventual transition away from iOS much easier.

Just a somewhat related personal anecdote, sorry to deviate a bit from topic.

EDIT: Yes, Signal is centralized and I would have loved to use Matrix. But the clients are bad. Email me if you want to build a better one


Big plus one. In order to encourage (what I strongly believe to be much-needed) experimentation with personal messaging, we have to break away from having each. new. client. establish its own network. Otherwise competition in this field will always be limited. Matrix is, I believe, currently the most promising answer to this problem.


or the Internet Standard XMPP


I only started feeling a sense of actually understanding CSS (as opposed to working on it based on acquired intuition) when I started reading the official specs. I think that most other material is an abstraction over those that usually doesn’t explain the broader context surrounding a feature or its edge cases / interactions with other features. For example here’s the spec on positioning: https://www.w3.org/TR/css-position-3/

I’ve found directly useful information in specs that was not commonly pointed out in tutorials/stack overflow/etc when discussing those features (not remembering a specific example right now unfortunately). Can say the same for JavaScript.

I think in general when learning about an API it’s best for understanding to read primary sources. And thankfully the W3C specs are quite good documents and not too inaccessible in my opinion.



I was surprised recently to learn about the back/forward cache (bfcache) [1], a common browser optimization that sort of does what you're describing but stores a more complete snapshot of the page, including the JS heap. I'm not sure if it's used in this new FF feature (the article doesn't give details about the mechanism of unloading).

However the bfcache unsurprisingly causes edge cases in JS-heavy pages/SPAs, and a prevailing solution [2] is for the page to force a reload when it is restored from this cache rather than explicitly handling those edge cases, nullifying the optimization.

[1] https://web.dev/bfcache/

[2] https://stackoverflow.com/a/13123626/14665201
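The workaround from [2] usually boils down to a pageshow listener: the event's persisted flag is true when the page is being restored from the bfcache rather than loaded fresh. A minimal sketch, assuming that is the mechanism a given page uses:

```javascript
// A pageshow event with persisted === true means the page came back
// from the back/forward cache instead of a fresh load.
function restoredFromBfcache(event) {
  return event.persisted === true;
}

// Guarded so the snippet is also runnable outside a browser.
if (typeof window !== 'undefined') {
  window.addEventListener('pageshow', (event) => {
    if (restoredFromBfcache(event)) {
      window.location.reload(); // throw away the cached snapshot
    }
  });
}
```

Which, as noted, makes the restore strictly slower than if the page had never been cached at all.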


Only recently when I flew backwards through about 50 history steps (entirely different webpages) on my phone did I think more deeply about how impressive it was that it was so seamless.


Sadly lots of modern sites break it completely. I regularly have issues getting back to HN threads from websites, usually I end up at the HN home instead. Quite frustrating.


This should help, if it ever gets implemented: https://mozilla.crowdicity.com/post/728459


You can get this effect just by loading some document into the sidebar, e.g. Show all bookmarks -> check Load this bookmark in the sidebar -> click that bookmark. I'm not sure how to do this easily in newer Firefox versions - with the Browser Console, maybe?

If you want full Side by side browsing mode:

Some part of it is extremely easy - to display ALL tabs side by side:

  profile/chrome/userChrome.css :

  tabpanels {
    display: -moz-box !important;
  }

  tabpanels > notificationbox {
    -moz-box-flex: 1;
    border: 2px solid #888 !important;
  }
For the other part - to have only two panels - the sibling case is easy; otherwise you would need some JS to mark a tab as something like content-secondary and handle it (e.g. there was an extension, last_selected_tab, doing something like this AFAIR) - as you have only:

  tabpanels > notificationbox > browser[type=content-primary]
(Just tested in Firefox 60.4.0.esr - newer versions may need some minor adjustments in the CSS, the toolkit.legacyUserProfileCustomizations.stylesheets preference set to true, and probably some other fixes. In 2008 I had a Norton Commander 'clone' made in such an incredibly simple way: ALL file and media format previews (plugins), with a bit of custom code for nsIProtocolHandler and nsIURIFixup fixes :) , kudos to the archView extension by Pike/Solar Flare.)

R.I.P. Firefox.


I'm not convinced that you fully understood the point of this idea. The point is not just to have any two web pages sitting next to each other (this is what your modifications do, right?), but, more importantly, the interaction between those pages.

I can already have two pages next to each other using Windows 10, by snapping two windows to the two sides of the screen. However, while that provides the correct visual shape, the user experience is completely different, as clicking a link in the left window will not open that page in the right window, replacing what's already there.


That's exactly how Load this bookmark in the sidebar works: you can have any document loaded into the sidebar, and clicking any link in it will replace the document in the active tab (the only one visible).

https://mozilla.crowdicity.com/post/728459 describes a Side by side browsing mode, which I see as: with any two tabs, the content of the default active/content-primary tab will be replaced by a link clicked on the other visible tab.


Nice idea.

Again something that used to work in Firefox.

FWIW: I use Tree Style Tabs (or Sideberry) to see why I arrived at a page. Not exactly the thing specified here but it works today.


> Again something that used to work in Firefox.

Yeeep, I used it for a while: https://archive.is/ovRDX (addon no longer exists, it broke with Firefox Quantum)


Yet another instance of SPAs pointlessly reinventing problems we solved decades ago.


Can you seriously not think of a single use case for SPAs?


There are a few. But they're good for a vastly smaller range of use cases than they're applied to.

You're writing Google Docs? Good for you, you can use an SPA. You're writing a social networking site? That should be a regular HTML page, with a sprinkling of JS. Blog? HTML page. Issue tracker? HTML page.

Every site I've used that's been plain HTML and CSS, with maybe minor supplements in Javascript, is much nicer to use than any SPA. People claim they load faster, but I've never experienced that in practice - a plain HTML page loads in milliseconds, but downloading a giant gzip of Javascript and waiting for it to make a zillion API calls takes seconds.

GMail's HTML-only interface fully loads and renders in ~630ms for me. The SPA version takes 11.69 seconds.

lite.cnn.com loads in 287ms. CNN.com takes 4.25 seconds.

thin.npr.org loads in 112ms. NPR.org takes 1.47 seconds.

Need I go on? Even for the lightest one, NPR, if you assume loading an article on the SPA version is instantaneous, you'd need to load 14 articles to make up for the slowness of the initial SPA load. The others are all much worse. And this is with an adblocker enabled - this will only get worse with one disabled.

Using SPAs when they aren't strictly necessary is a great way to shoot your site's performance, usability, and accessibility in the foot. You can fix the usability and accessibility problems, but you need to devote substantial engineering resources to doing so, and those are free if you just use the damn plain HTML.


To me a SPA is fine and usually preferable for anything I keep open for long periods of time. Sure, Gmail takes 10+ seconds to load. But I only load it once every few days at most because it stays open in my browser.


Cool. I don't feel like permanently dedicating a couple hundred megs of RAM to every site I need to use that decided to go chasing the latest counterproductive trend.


Single-page apps are forgivable for media-oriented sites, so the video/music can keep playing as you navigate the site. In the past this was done with pop-up players and was generally messier, so I like SPAs for this.

Literally every other type of website on the planet should use traditional loading pages.


SPAs can easily handle routing via the History API; the problem is that companies haven’t migrated correctly.


It’s usually crappy though, like the URL and the history don’t accurately recreate the state of the app on refresh at all, or the history gets so clogged with every mutation of app state that the back button is effectively useless.
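The clogged-history failure mode is usually a pushState-vs-replaceState issue: every transient state change gets its own history entry. A minimal sketch (the helper and its names are mine, not from any framework):

```javascript
// Only real navigations should add a history entry; transient state
// changes should replace the current one, so the back button steps
// through pages rather than through every filter tweak or keystroke.
function recordState(history, url, isNavigation) {
  if (isNavigation) {
    history.pushState({}, '', url);     // back button will return here
  } else {
    history.replaceState({}, '', url);  // current entry updated in place
  }
}
```

Apps that call pushState for everything are the ones where hitting back twenty times never leaves the page.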


Twitter does it rather well, and that’s a site where it’s difficult and matters.


Again, that’s a development shortcoming, not a technical limitation. All of your concerns can show up in non-SPAs too.


Yes, I know that in some theoretical scenario where the developers do everything exactly right, none of these problems would exist. My argument is not that it's intrinsic to the technology; my argument is that my _experience_ is that the vast majority of (needless) SPAs are like this. Whether it's because frontend developers get all hot to re-implement standard browser features themselves using Redux, because the tooling is bad, or what, I can't tell you.


Aside from cost-savings for the business I can’t really think of one.


This is not an angle I’ve heard before - how does an SPA save money?


It’s cheaper to build single page apps using web developers than to build and maintain multiple platform-specific native apps.


Not everything needs to be a native application, nor should it be. The vast majority of SaaS and such apps are SPA/equivalent and would NOT benefit from being yet-another-installable app.

Could you provide examples (not exceptions) maybe to support your PoV?


> The vast majority of SaaS and such apps are SPA/equivalent and would NOT benefit from being yet-another-installable app.

Could you provide a few examples of such apps that would not benefit from being native?


So the smaller-scale correlate would be that SPAs can make more sense for proof-of-concept or hobby-developed applications that need some cross-platform access.


Offload computations and rendering to the client


Which computations and what rendering? Very little money is saved from SSR vs CSR; rather, the lower data transfer saves everyone money and time.


> Very little money is saved from SSR vs CSR

this seems counterintuitive.

SSR means most processing happens on big iron inside a data center, which has to be more efficient than each user buying a new machine every few years just to have the peak JS performance needed for the latest frameworks.


Opera (with the Presto engine) used to do this really well.

