
Probably the one type of PHEV that should survive is basically a BEV with a built-in backup generator: one that's not necessarily powerful enough to drive you directly at full speed, but enough to basically eliminate the range limitation of a (cheaper and smaller) battery by continuously charging it when needed. Maybe this 'backup generator' could even be made a removable option.

I'm thinking of a semi-rural use case, where the typical daily trip is 20-50 km but the charging infrastructure is poor, and occasionally you do need to drive 200-300 km in winter.


There are now quite a few options for wifi APs with cellular backup. I use TP-Link, and it's ok for the price, I guess, and supports adding OneMesh range extenders.

The problem with this setup for me is that it doesn't handle an uplink that sometimes becomes unstable while nominally staying up, and in general the LTE fallback triggers slowly.

Are there any prosumer-friendly options for connection bundling, which can balance uplinks continuously?


Assuming you're talking about running something like a UI router and doing multi-WAN uplinks from it, you can.

They support load balancing (e.g. 95% WAN1, 5% WAN2) and SLA monitoring (ping/packet loss/jitter) with some voting options on what triggers a swap.

I think pfsense has similar options for WAN balancing if you don't like UI for routing.
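The SLA-monitoring logic described above can be illustrated in a standalone sketch. This is not pfSense or Ubiquiti code, just a toy model of the decision loop; the gateway addresses and the loss threshold are made-up examples:

```python
import subprocess

# Hypothetical gateways for two uplinks. Real routers (pfSense, Ubiquiti)
# do the monitoring and switching in the forwarding plane; this only
# illustrates the probe-and-vote logic.
UPLINKS = {"wan1": "192.0.2.1", "wan2": "198.51.100.1"}
LOSS_THRESHOLD = 0.2  # fail over if more than 20% of probes are lost

def probe_loss(gateway: str, count: int = 5) -> float:
    """Return the packet-loss fraction for `count` single pings."""
    lost = 0
    for _ in range(count):
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", gateway],
            capture_output=True,
        )
        if result.returncode != 0:
            lost += 1
    return lost / count

def pick_uplink(losses: dict) -> str:
    """Prefer wan1 unless its loss exceeds the threshold; otherwise
    fall back to whichever uplink currently has the least loss."""
    if losses["wan1"] <= LOSS_THRESHOLD:
        return "wan1"
    return min(losses, key=losses.get)
```

A real implementation would also probe latency and jitter, as the comment above mentions, and would hold state to avoid flapping between uplinks.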


Probably has more to do with responsibility outsourcing: if the SaaS has a security breach AND the contract says they're secure, then you're not responsible. Sure, there may be reputational damage for you, but it's a gamble with good odds in most cases.

Storing lots of legal data doesn’t seem to be one of these cases though.


I see profits and outsourcing.

Selling an on-premise service requires customer support, engineering, and duplication of effort if you're pushing to the cloud as well. Then you get the temptations and lock-in of cloud-only tooling and an army of certified consultant drones whose resumes really really need time on AWS-doc-solution-2035, so the on-premise offering becomes a constant weight on management.

SaaS and the cloud are great for some things some of the time, but often you're just staring at the marketing playbook of MS or Amazon come to life like a golem.


For me the most annoying would be a technically correct solution that completely ignores the “higher-level style” of the surrounding code, at the same time defending the chosen solution by referencing some “best practices” that are not really applicable there for some higher-level reasons, or by insignificant performance concerns. Incidentally, LLMs often produce similar problems, only one doesn’t need to politely argue with them.

Compressed jsonlines with strong schema seems to cover most cases where you aren't severely constrained by CPU or small message size (i.e., mostly embedded stuff).
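As a minimal sketch of the idea: gzip-compressed JSON Lines with a schema check on write. The record fields and the required-keys check are illustrative assumptions; in practice you'd use something like jsonschema or a generated validator for the "strong schema" part:

```python
import gzip
import json

# Hypothetical schema: every record must carry these keys.
REQUIRED_KEYS = {"ts", "sensor", "value"}

def write_jsonl_gz(path, records):
    """Write records as gzip-compressed JSON Lines, one object per line."""
    with gzip.open(path, "wt", encoding="utf-8") as f:
        for rec in records:
            if not REQUIRED_KEYS <= rec.keys():
                raise ValueError(f"schema violation: {rec}")
            f.write(json.dumps(rec) + "\n")

def read_jsonl_gz(path):
    """Read a gzip-compressed JSON Lines file back into a list of dicts."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return [json.loads(line) for line in f]
```

Line-oriented records keep the format streamable and appendable, and gzip handles the redundancy of repeated keys well.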

There should be two dark modes: a simple dark mode, like most dark themes today, to work in dimmed lighting, and an actual night mode, designed to be legible but not mess with adaptation in total darkness. I don't know the research on this (and I'm sure military and aviation have lots of data here), but intuitively it should use mostly thin red and green lines.

Some cars (Honda Civic) had this feature where you could change the theme of the UI and it had 3-4 presets (blue, red, amber, green IIRC). That was before those screens were just a built-in Android tablet and sadly it didn't carry over.

But you're right, and it's the reason most German cars had red-illuminated gauge clusters until a few years ago.


Give me back the Saab Night Panel. All I want is my speedo.

https://i0.wp.com/saabblog.net/wp-content/uploads/2024/08/Ni...



Typical web image quality is what it is partly because of lack of support. It's literally more difficult to show a static HDR photo than a whole video!

PNG supports HDR with up to 16 bits per channel, see https://www.w3.org/TR/png-3/ and the cICP, mDCV and cLLI chunks.

With incredibly bad compression ratios.

HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.

I want JXL in web browsers, but without HDR support.


There's nothing stopping browsers from tone mapping[1] those HDR images using your tone mapping preference.

[1]: https://en.wikipedia.org/wiki/Tone_mapping


What does that achieve? Isn't it simpler to just not support HDR than to support HDR but tone map away the HDR effect?

Anyway, which web browsers have a setting to tone map HDR images such that they look like SDR images? (And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?)


> What does that achieve?

Because then a user who wants to see the HDR image in its full glory can do so. If the base image is not HDR, then there is nothing they can do about it.

> And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?

While I very much support more HDR in the online world, I fully agree with you here.

However, I suspect the reason will boil down to what it usually does: almost no users change the default settings ever. And so, any default which goes the other way will invariably lead to a ton of support cases of "why doesn't this work".

However, web browsers are dark-mode aware; they could be HDR-aware too and do what you prefer based on that.


What user wants the web to look like this? https://floss.social/@mort/115147174361502259

That video is clearly not encoded correctly. If it were, the levels would match the background, given there is no actual HDR content visible in that video frame.

Anyway, even if the video was of a lovely nature scene in proper HDR, you might still find it jarring compared to the surrounding non-HDR desktop elements. I might too, depending on the specifics.

However, like I said, it's up to the browser to handle this.

One suggestion I saw mentioned by some browser devs was to make the default to tone map HDR if the page is not viewed in fullscreen mode, and switch to full HDR range if it is fullscreen.

Even if that doesn't become the default, it could be a behavior the browser could let the user select.


> That video is clearly not encoded correctly.

Actually I forgot about auto-HDR conversion of SDR videos which some operating systems do. So it might not be the video itself, but rather the OS and video driver ruining things in this case.


> Ideally, browsers should just not support HDR.

Well I strongly disagree on that point.

Just because we're in the infancy of wide HDR adoption and thus experience some niggling issues while software folks work out the kinks isn't a good reason to just wholesale forego the feature in such a crucial piece of infrastructure.

Sure, if you don't want HDR in the browser I do think there should be a browser option to let you achieve that. I don't want to force it on everyone out there.

Keep in mind the screenshot you showed is how things looked on my Windows until I changed the auto-HDR option. It wasn't the browser that did it, it was completely innocent.

It was just so long ago I completely forgot I had changed that OS configuration.


If you want to avoid eye pain then you want caps on how much brightness can be in what percent of the image, not to throw the baby out with the bathwater and disable it entirely.

And if you're speaking from iphone experience, my understanding is the main problem there isn't extra bright things in the image, it's the renderer ignoring your brightness settings when HDR shows up, which is obviously stupid and not a problem with HDR in general.


If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR? As far as I can see, it's all bath water, no baby

> If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR?

If you set #ffffff to be a comfortable max, then that would be the brightness cap for HDR flares that fill the entire screen.

But filling the entire screen like that rarely happens. Smaller flares would have a higher cap.

For example, let's say an HDR scene has an average brightness that's 55% of #ffffff, but a tenth of the screen is up at 200% of #ffffff. That should give you a visually impressive boosted range without blinding you.
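To make the proposed rule concrete, here is a hypothetical cap curve where the allowed peak brightness (as a multiple of SDR #ffffff) grows as the bright region shrinks. The exponent and the 4x ceiling are illustrative assumptions matching the numbers in this thread, not part of any standard:

```python
# Hypothetical area-dependent brightness cap: a full-screen flash is
# capped at SDR white (1.0), smaller regions are allowed higher peaks.
# The curve and its constants are illustrative only.

def brightness_cap(area_fraction: float) -> float:
    """Peak allowed brightness (1.0 = SDR white) for a region covering
    `area_fraction` of the screen, 0 < area_fraction <= 1."""
    return min(4.0, area_fraction ** -0.3)
```

With these numbers, the whole screen is capped at 1.0x SDR white, a region covering a tenth of the screen at roughly 2x, and tiny highlights at up to 4x.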


Oh.

I don't want the ability for 10% of the screen to be so bright it hurts my eyes. That's the exact thing I want to avoid. I don't understand why you think your suggestion would help. I want SDR FFFFFF to be the brightest any part of my screen goes to, because that's what I've configured to be at a comfortable value using my OS brightness controls.


I strongly doubt that the brightness to hurt your eyes is the same for 10% of the screen and 100% of the screen.

I am not suggesting eye hurting. The opposite really, I'm suggesting a curve that stays similarly comfortable at all sizes.


I don't want any one part of my screen to be a stupidly bright point light. It's not just the total amount of photons that matters.

It is not just the total amount.

But it's not the brightest spot either.

It's in between.


I just don't want your "in between" "only hurt my eyes a little" solution. I don't see how that's so hard to understand. I set my brightness so that SDR FFFFFF is a comfortable max brightness. I don't understand why web content should be allowed to go brighter than that.

I'm suggesting something that WON'T hurt your eyes. I don't see how that's so hard to understand.

You set a comfortable max brightness for the entire screen.

Comfortable max brightness for small parts of the screen is a different brightness. Comfortable. NO eye hurting.


It's still uncomfortable to have 10% of the screen get ridiculously bright.

Yes, it's uncomfortable to have it get "ridiculously" bright.

But there's a level that is comfortable that is higher than what you set for FFFFFF.

And the comfortable level for 1% of the screen is even higher.

HDR could take advantage of that to make more realistic scenes without making you uncomfortable. If it was coded right to respect your limits. Which it probably isn't right now. But it could be.


I severely doubt that I could ever be comfortable with 10% of my screen getting much brighter than the value I set as max brightness.

But say you're right. Now you've achieved images looking completely out of place. You've achieved making the surrounding GUI look grey instead of white. And the screen looks broken when it suddenly dims after switching tabs away from one with an HDR video. What's the point? Even ignoring the painful aspects (which is a big thing to ignore, since my laptop currently physically hurts me at night with no setting to make it not hurt me, which I don't appreciate), you're just making the experience of browsing the web worse. Why?


In general, people report that HDR content looks more realistic and pretty. That's the point, if it can be done without hurting you.

Do they? Do people report that an HDR image on a web page that takes up roughly 10% of the screen looks more realistic? Do they report that an HDR YouTube video, which mostly consists of a screen recording with the recorded SDR FFF being mapped to the brightness of the sun, looks pretty? Do people like when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white? (see e.g https://floss.social/@mort/115147174361502259)

Because that's what HDR web content is.

HDR movies playing on a livingroom TV? Sure, nothing against that. I mean it's stupid that it tries to achieve some kind of absolute brightness, but in principle, some form of "brighter than SDR FFF" could make sense there. But for web content, surrounded by an SDR GUI?


> when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white

I don't know why you're asking me about examples that violate the rules I proposed. No I don't want that.

And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up. I am aware that HDR can be done wrong...

But for HDR videos where the HDR actually makes sense, yeah it's fine for highlights in the video to be a little brighter than the GUI around them, or for tiny little blips to be significantly brighter. Not enough to make it look gray like the misbehavior you linked.


> I don't know why you're asking me about examples that violate the rules I proposed. No I don't want that.

Other than the exaggerated 10x, I don't understand how it violates the rules you proposed. You proposed a scheme where part of the screen should be allowed to be significantly brighter than the surrounding SDR GUI's FFF. That makes the surrounding GUI look grey.

> And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up.

I'm bringing it up because that's how HDR looks on the web. Most web content isn't made by professional movie studios.

The example video I linked conforms with your suggested rules, FWIW: most of the image is near black, only a relatively small part of it is white. The average brightness probably isn't over SDR FFF. Yet it still hurts.


The whole chip in the middle is brighter than white. Half that video is super bright, making this example way more than I was suggesting in both area and average brightness.

> most of the image is near black, only a relatively smart part of it is white. The average brightness probably isn't over SDR FFF.

It's a lot more than I suggested, and I said average brightness half of FFF for my example.

Also if I knew you were going to hammer on the loose example numbers I would have said 2% or 1%.

> I'm bringing it up because that's how HDR looks on the web.

But I'm not defending how it looks. I'm defending how it could look, since you don't see why anyone would even want HDR on the web.


It actually is somewhat an HDR problem, because the HDR standards made some dumb choices. SDR standardizes relative brightness, but HDR uses absolute brightness even though that's an obviously dumb idea, and in practice no one with a brain actually implements it.

In a modern image chain, capture is more often than not HDR.

These images are then graded for HDR or SDR. I.e., sacrifices are made on the image data such that it is suitable for a display standard.

If you have an HDR image, it's relatively easy to tone-map that into SDR space, see e.g. BT.2408 for an approach in Video.

The underlying problem here is that the Web isn't ready for HDR at all, and I'm almost 100% confident browsers don't do the right things yet. HDR displays have enormous variance, from "slightly above SDR" to experimental displays at Dolby Labs. So to display an image correctly, you need to render it properly to the display's capabilities. Likewise if you want to display an HDR image on an SDR monitor. I.e., tone mapping is a required part of the solution.

A correctly graded HDR image taken of the real world will have like 95% of the pixel values falling within your typical SDR (Rec.709/sRGB) range. You only use the "physically hurt my eyes" values sparingly, and you will take the room conditions into consideration when designing the peak value. As an example: cinemas using DCI-P3 peak at 48 nits because the cinema is completely dark. 48 nits is more than enough for a pure white in that environment. But take that image and put it on a display sitting inside during the day, and it's not nearly enough for a white. Add HDR peaks into this, and it's easy to see that in a cinema, you probably shouldn't peak at 1000 nits (which is about 4.x stops of light above the DCI-P3 peak). In short: rendering to the display's capabilities requires that you probe the light conditions in the room.
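The "about 4.x stops" figure checks out: a stop is a doubling of light, so the distance between the 48-nit DCI-P3 peak and a 1000-nit HDR peak is the base-2 log of their ratio:

```python
import math

# Stops between the 48-nit DCI-P3 cinema peak and a 1000-nit HDR peak.
# One stop = one doubling of luminance.
stops = math.log2(1000 / 48)
print(round(stops, 2))  # ~4.38
```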

It's also why you shouldn't be able to manipulate brightness on an HDR display. We need that to be part of the image rendering chain such that the right decisions can be made.



How about websites just straight up aren't allowed to physically hurt me, by default?

Web sites aren’t made for just you. If images from your screen are causing you issues, that is a you / your device problem, not a web site problem.

I agree, it's not a web site problem. It's a web standards problem that it's possible for web sites to do that.

Note the spec does recommend providing a user option: https://drafts.csswg.org/css-color-hdr-1/#a11y

You asked “which web browsers have a setting to tone map HDR images such that they look like SDR images?”; I answered. Were you not actually looking for a solution?

I was looking for a setting, not a hack.

Mint/Cinnamon has been my favorite for at least 5 years, after an assortment of KDE/Gnome distros. Basically the only argument for me being on Mac now is full Adobe graphics software support. (I dislike their business practices as much as anybody, but Lightroom CC is genuinely good tech, and gets useful updates at least yearly.)


This is probably due to most software working with graphics somewhere (GUI, printing), where font sizes and resolutions are predominantly imperial (pt/dpi).


Luckily there are still the extra snouty breeds like Belgian shepherds and various sighthounds.

