iamchp's comments

> immersion effect

Totally agree. At the moment, Add to Home Screen creates a shortcut that opens in Safari (along with its other distracting tabs ;). It would be great if Sit could be set up as a web app too!


I’m seriously considering piracy because streaming services don’t offer subtitles in my mother tongue. I come from a non-English-speaking country and I live in a foreign country. Some movie offerings include my mother tongue, but others don’t, and this is super annoying :’( I really wish content providers would just add all subtitle languages :(


Computer architects use the term 'memory wall' to refer to this latency problem. Microarchitectural improvements keep raising IPC and process technology keeps pushing CPU frequency higher, but memory access latency is not keeping up with these CPU improvements.


Right - I think that, at this point, L1 cache latency, on a 'per CPU clock' basis, is worse than main-memory latency was in the late 1990s!
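
A quick way to see this on your own machine, if you're curious: a pointer-chasing loop whose working set either fits in L1 or spills out to DRAM. This is just a rough sketch in Java (arbitrary sizes and iteration counts, no proper benchmark harness), not a rigorous measurement:

    import java.util.Random;

    public class LatencySketch {
        public static void main(String[] args) {
            // Working-set sizes from ~16 KB (fits in L1) up to ~64 MB (DRAM).
            for (int size : new int[]{1 << 12, 1 << 16, 1 << 20, 1 << 24}) {
                int[] next = new int[size];
                for (int i = 0; i < size; i++) next[i] = i;
                // Sattolo's algorithm: a single random cycle, so the
                // hardware prefetcher can't predict the next line.
                Random rnd = new Random(42);
                for (int i = size - 1; i > 0; i--) {
                    int j = rnd.nextInt(i);
                    int tmp = next[i]; next[i] = next[j]; next[j] = tmp;
                }
                int idx = 0;
                long iters = 50_000_000L;
                long start = System.nanoTime();
                for (long n = 0; n < iters; n++) idx = next[idx];
                double ns = (System.nanoTime() - start) / (double) iters;
                // Printing idx keeps the JIT from eliminating the loop.
                System.out.printf("%10d ints: %5.2f ns/access (sink=%d)%n", size, ns, idx);
            }
        }
    }

Dividing the ns/access numbers by your clock period gives roughly the 'per CPU clock' figures being compared above.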


Makes sense for a larger cache to take longer to decode an address.


This would benefit sequential access, but it'd either be disabled for random-access or pollute the caches with unused lines.

But in cases where sequential memory bandwidth is required, this is pretty cool! (Though I assume it's Intel-only, which would also be a bummer.)


RAM is the new tape...


Somewhat ironic that we just reduced RAM channel width from 64 bits down to 32 bits in DDR5 (but each DIMM now carries two channels). (Newsflash if you missed it: desktop DDR5 is quad-channel, but 32-bit, yes.)

SK Hynix: what if we increase the RAM width to 64?


Thank you for this pointer. I was also wondering what the motivation for FFM was (I'm not a Java person), but reading the JEP answered my question!

The JEP is really elegantly written!
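
For anyone else who hasn't looked at it yet, the gist of the memory half of the API, as a minimal sketch (assuming a recent JDK where java.lang.foreign is final; the class name is just made up for the example):

    import java.lang.foreign.Arena;
    import java.lang.foreign.MemorySegment;
    import java.lang.foreign.ValueLayout;

    public class FfmSketch {
        public static void main(String[] args) {
            // Off-heap allocation with a deterministic lifetime, replacing
            // JNI / sun.misc.Unsafe / direct ByteBuffers for this kind of work.
            try (Arena arena = Arena.ofConfined()) {
                MemorySegment ints = arena.allocate(ValueLayout.JAVA_INT, 10);
                for (int i = 0; i < 10; i++) {
                    ints.setAtIndex(ValueLayout.JAVA_INT, i, i * i);
                }
                System.out.println(ints.getAtIndex(ValueLayout.JAVA_INT, 3)); // 9
            } // memory freed here; access is bounds- and lifetime-checked until then
        }
    }

(The other half of the JEP, calling native functions through a Linker, follows the same pattern of explicit, checked access.)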


Thank you for this (albeit a different project). My immediate question when I read the last part of the page was 'What about PyPy!?' PyPy might behave differently for the original post's workload, but I'd still assume some significant speedups?


> PyPy might behave differently for the original post's workload, but I'd still assume some significant speedups?

Yes, I believe so as well. They have done many CPU optimisations, so it is likely to be faster.


Awesome to read about the memory management in Windows. (To me Windows seems like too much of a black box, but that's also because I only develop on Linux.)


The JDK issue ticket linked towards the bottom of the post has some interesting discussion. Apparently developers at Red Hat ran into a similar problem and decided to build an 'agent' to help developers identify code patterns that trigger the issue.

A slightly different approach from the Netflix post, but still an interesting line of effort for finding problematic Java code!

For anyone interested: https://bugs.openjdk.org/browse/JDK-8180450?focusedCommentId...


Totally agree. There should be a cheaper one-screen 4K option without the games. I don't understand why I need to pay for games that I don't want to play.


Oh yeah, totally forgot about the games. Instead of renewing shows, just dump money into games for your VIDEO STREAMING service. Because of course!


Think of the people who would be interested in 4K and able to watch it.

Do they have money? Absolutely. To stream 4K you need a good internet connection and good equipment.

With this in mind, it completely makes sense to charge the most for 4K. Someone who cares is more than wealthy enough to pay for it. Having a one-person 4K plan would be lost profit.


You can stream 4K on common network connections (20 Mbps), and 4K TVs are cheap. I agree it’s a premium feature, if only because many people don’t notice the quality difference, but the real problem is the competition: Disney Plus, HBO Max, and Amazon Prime (and even Peacock!) have 4K HDR on their basic paid plans. Netflix is drastically behind here.


Think of the shareholders!


On reflection, this is quite off topic, but since USB PD now supports more voltages, I thought I'd bring it up.

A few years back, I really wished all the AC/DC converters for smaller home appliances would ditch the fragmented round barrel plugs and go for a universal standard, which at the time would have been USB-A: fans, routers, my smart desk, my key light, etc. Perhaps we would then be able to build better unified AC/DC converters.

Perhaps it would be nice to think about something like this now: instead of having bulky AC/DC converters everywhere, go for single multi-port USB-C PD converters?


I have a HomePod mini in my kitchen which is powered via USB-C. Right now it's just using Apple's 18 W power brick. One of these days I mean to install one of those mains outlets with built-in USB-C ports:

https://www.leviton.com/en/products/residential/usb-wall-out...

It will eliminate a bulky adapter and let me use the plug for things that actually need AC, like my toaster or kettle.

One wonders whether it will become common for newer/refurbished houses to have these on all outlets, and whether they will be good enough for most people.

This is a little different than what you're talking about, but both could become the norm. I imagine the form of what you're describing would be power strips with a bunch of USB-C ports on them (and maybe some AC outlets) replacing the power strips we use now.


Apparently the HomePod mini does something weird and won't work with that outlet:

https://old.reddit.com/r/UsbCHardware/comments/t4l1dr/levito...


> One of these days I mean to install one of those mains outlets with built-in USB-C ports:

I really wonder about those.

I remember reading an article where someone tested a bunch of different power bricks and found that they really vary in quality.

The cheap ones had really bad voltage sag and other problems, and some were an unsafe design (not enough separation between the AC and DC sides).

Personally, I'd trust an Anker or Apple brick more than whatever lowest-bidder device is in those outlets.


Leviton is pretty trustworthy when it comes to outlets, but I have doubts about a 6 A power brick being stuck inside a wall without any cooling.

Edit: LOL these are going to be $90 a pop. Just get one of these instead: https://www.amazon.com/gp/product/B0874GDG93/


And in a year or two when the standard changes again, you don't have a bunch of outdated ports hardwired into your house.


These ports are not really "hard wired". They're modules that fit into the mains plug well/niche, just like anything else that fits in there. They can be swapped for something else once they become outdated: you just shut off your mains, unscrew three wires, and put the new thing in. But they probably won't become outdated, because:

1. The USB-C plug is here to stay for a good while.

2. With 240 W we've already reached power-delivery levels that exceed what most small appliances you'd want to hook up like that actually need; most larger appliances will take mains anyway. 240 W is a practical limit because if you go much above that, the question is no longer how much power your device can pull from the wall but whether the thing will burn you when you hold it in your hand. You can't escape the physics: 240 W dissipated means 240 W coming out of your device as heat.

3. We've just moved to a newer, more reliable and more power-efficient semiconductor technology (GaN), which only happens once every decade or two, so anything made in at least the next 20 years will have roughly the same kind of performance, and anything in the next 50 years or so will probably not exceed it by unacceptable margins.


USB-C has been around for some time, same with USB-A. If you had both ports in your power outlet, you'd have been set for nearly 20 years' worth of peripheral charging.


I don't agree. The past few years have seen steady growth in power delivery. Devices have gone from 5 V at 0.2 A, which equates to 1 W, up to 240 W. That's a huge leap. Most older chargers are simply insufficient for more modern devices, but can still be used to power more conservative things like small desk fans, LED lights, night lights, and the like. You can repurpose an old wall wart by putting it somewhere else where you physically need it, but with an in-wall power outlet you'd need to replace the whole module every 1-3 years. The jumps in power delivery were drastic at times.

However, now is a good time to go with these, because it's unlikely power delivery for a lot of things will increase for a long while - see my other comment.


It was ~4.5 W before, wasn’t it? With some off-spec supplies up to 4 to 5 amps or 25 W. This is still a big deal, but only one order of magnitude, not two.


Often you can convert things yourself. Back in the day USB only provided 5 V, so anything that needed 12 V still used a barrel-plug wall wart. But nowadays you can get simple breakout boards that negotiate PD to deliver the correct voltage (or one correct "enough" that the thing can still operate, e.g. 14 V instead of 12 V).

I've been converting some of my more problematic devices; for example, here's a pretty clean conversion I did of my Wii U GamePad back in the day. That was before USB-C ports and cables were easily available, so I went with micro USB; USB-C wasn't even a consideration back then.

https://imgur.com/gallery/cx2gzxQ


I have a number of Dell Wyse 3040 thin clients (5v version, there's also a 12v version) scattered around my property. I've converted two of them to PoE by the simple expedient of buying a USB PoE splitter, cutting off the USB connector, and soldering on the appropriate barrel connector (for future searchers: 4mm x 1.7mm, pin == positive). I also just purchased some pre-made USB-A to 4mm x 1.7mm cables to try out on the rest.


For anyone who wants to do this: it’s not that easy with PD.

I have a dozen USB-C PD bricks of various brands. Only 3 of them properly provide 12 V, and one of those only does it properly when there are no other devices plugged into its 3 ports.


Which is why you use a buck-boost converter if you want to get a specific voltage above 5 V out of a USB wall wart.


I’m no electronics expert, but that seems iffy...

Personally, I would love to use USB-C PD for longer strings of 12 V APA RGB LEDs as interior lights. In that case, not only is voltage a concern, but also total wattage, heat, and long-term reliability. Is 60 or even 100 W through a boost converter actually safe and easy?


There's nothing inherently unsafe about buck-boost converters; they are used in millions of everyday products, including the highest end of high-tech gear like motherboards and GPUs.

If a circuit is in itself safe (it might not be, if it's executed poorly), then the safety of the whole device depends on how you're using that circuit.

100 W is just as fine as 1 W. It just depends on whether you're being scammed. If you don't have the knowledge to find out whether what you're doing is safe... don't do it.


You’re agreeing with me then: it’s not inherently safe for me to use a buck-boost converter.


A boost converter is just a technology; it can be applied as safely or as dangerously as you'd like. Waiting for a totally safe commercial product will take longer than DIY, naturally.


It’s slowly happening - a great many small electronics have been transitioning to various USB connectors for charging (everything from bicycle lights to small drones). Those are all 5 V though, which is great for a typical one-cell lithium battery at 3.7-4.2 V, where 500 mA to 2 A is the most you’d need or want anyway.

The issue so far is that for higher power levels or voltages the electronics are still complex and expensive, even with the PD standards, so doing it that way adds a lot of cost compared to a typical barrel connector.

Maybe in another 4-5 years, with some standardized and hopefully dirt-cheap PD chips?


Such a thing would have to be really smart about advertising power-delivery ability for each port without exceeding a total commitment of 1,800 watts (US 120 volts x 15 amps), but not actually needing an inefficient 1,800-watt AC/DC transformer, and then somehow still managing to be able to charge at least a couple high-power laptops simultaneously. All this to satisfy the reasonable desire to replace six bulky 15-watt transformers with one simple device.

Contrast your typical $10 AC power strip, which lets you plug in 10,000 watts of hair dryers at once, because it's understood that a circuit breaker will (hopefully) blow if you do. It doesn't need to be smart.

I think the basic problem is whether USB-PD allows a source to commit to delivering 100 watts to a port that asked for 20V/5A, but then notice that the port only needs 2 watts right now (because the device finished charging), so that it can reclaim 98 watts in its total power budget. So far I haven't seen a power supply that does this well.


I'm not sure exactly what problem you're talking about.

If a USB-PD source overcommits, then the devices it tries to power will obviously get less power than advertised, or the internal over-delivery protections trigger and the source shuts down. The user will (maybe) notice and can do something about it.

I'm pretty sure everybody understands that you can't just plug six high-power consumers into one USB device just because it has six ports and supports USB-PD, just like people don't plug six kettles into one extension cord.


People absolutely do plug tens of thousands of watts' worth of appliances into a single 15-amp home circuit that can supply only 1,800 watts at once. And as long as they don't exceed 1,800 watts of momentary usage by too much or for too long, it works. AC power doesn't have a "commitment" in any sense but a physical plug shape. It just has circuit breakers to prevent fires.

A good example of the problem is a hypothetical USB-PD power strip that has 5 ports. The Amazon listing says it can deliver 120 watts. A person buys it, thinking they can use it for their 100-watt laptop and four other small things that each need barely 5 watts, 1 x 100 + 4 x 5 = 120, so the math adds up. They're still working from home post-pandemic, so they leave their laptop plugged in all the time on their desk. The laptop charges for maybe 45 minutes each day, but it still tells the power supply that it needs 20V x 5A = 100W constantly, and each of the small devices asks for the minimum of 5V x 3A = 15W.

For actual USB-PD supplies I've bought in this approximate situation, the laptop won't charge because the supply can't promise 100W. You hook up your Kill-A-Watt and find that the supply is actually drawing only 20 watts. The power supply is keeping all its promises -- 15W to each of four small devices, meaning it can't agree to supply 100W to the fifth device. So it instead says it can do maybe 12V x 3A for the laptop, which says forget about it and refuses to charge. Yeah, the laptop sucks for lying that it needs 100W 24/7, but what am I going to do? Get a new laptop? Or just put the USB-PD supply on a shelf and return to a regular old AC power strip with five individual USB transformers plugged into it? That's what I'll do, because that's what works.

In today's world, the "If" in your comment happens at the AC circuit-breaker level. In the USB-PD world, it would happen at the level of this little multi-port power supply. Your average consumer in a hurry doesn't want to be bothered by this kind of detail.
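
To make the arithmetic of that power-strip example explicit, here's a toy model (plain Java, nothing to do with the real PD protocol messages; the 120 W / 100 W / 15 W figures are just the ones from the scenario above): the supply allocates against requested watts only, so the laptop gets refused even though the actual draw is around 20 W.

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class PdBudgetSketch {
        public static void main(String[] args) {
            int budget = 120; // watts advertised on the listing
            Map<String, Integer> requested = new LinkedHashMap<>();
            requested.put("small device 1", 15);
            requested.put("small device 2", 15);
            requested.put("small device 3", 15);
            requested.put("small device 4", 15);
            requested.put("laptop", 100); // asks for 100 W even while idle

            int committed = 0;
            for (Map.Entry<String, Integer> e : requested.entrySet()) {
                if (committed + e.getValue() <= budget) {
                    committed += e.getValue();
                    System.out.println(e.getKey() + ": granted " + e.getValue() + " W");
                } else {
                    // No renegotiation based on actual draw, so the laptop is
                    // refused even though, measured at the wall, most of the
                    // budget is sitting unused.
                    System.out.println(e.getKey() + ": refused, only "
                            + (budget - committed) + " W uncommitted");
                }
            }
        }
    }

A supply that could "overbook" would instead track measured draw per port and renegotiate, which is exactly the missing piece being discussed here.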


It seems there are more alternatives. Use the USB-PD supply for the laptop and another one for your other devices.

It's a bad laptop design. An iPad or Android phone/tablet can fast charge with a good supply or slow charge with a cheap one. The laptop should do the same and accept whatever the supply can deliver.

Also, what's the problem with the user using 5 individual small USB chargers versus a big one? The efficiency of all USB chargers is about the same, so it doesn't matter how you split them up. In fact, a small USB charger working at maximum output is more efficient than a large one working at 20% load.

You are correct that ideally the USB-PD supply should monitor the actual usage and route power accordingly, but I'm not sure it's such a problem in practice.


> what's the problem with the user using 5 individual small USB chargers versus a big one?

:) This thread started with iamchp asking about "single multi-port USB-C PD converters."

> The laptop should [...] accept whatever the supply can deliver.

And both sides should renegotiate when available/needed power changes. If a laptop could say "I could use 240W but I can deal with 18W," a supply said "I can give you only 18W, but I'll let you know when I can give you 240W," and the laptop later said "you know what? Now I really need only 36W," then I think USB-PD would be good. My experience is the negotiation happens once, possibly days earlier, when the device is first plugged in.

(By the way, it's not always the right product decision to include boosting circuitry from lower-voltage supplies. While a MacBook will charge from a 5-volt supply, you'll be sad when it dies during a critical work presentation even though it's charging, because the presentation needed more than 15 watts. It's a defensible product decision to require a minimum wattage, and thus effectively a minimum voltage, or refuse to charge.)


It would be pretty neat to have some kind of refresh interval, where both devices list their current capabilities and renegotiate to some optimal level.

As for refusing to charge, my ThinkPad on Windows will alert me when the dock or power source isn't putting out enough power for it to keep up with current power usage.


Wouldn't that kind of overcommit be disastrous, though?

If a device was granted a 100 W budget but only needs 2 W right now, maybe it suddenly needs to burst up to the full 100 W? If the hub has reclaimed the other 98 W, it can't deliver what it promised, so something fails to work or gets powered off.


Depends on your definition of disaster.

In the hair-dryer case, all six actually do turn on -- but only for a few seconds. The circuit breaker heats up and trips before the wires in the wall get hot enough to ignite the wood. Which sounds scary, but that's part of the design that enables normal spiky power usage of refrigerator compressors starting up, running a garbage disposal for a few seconds, etc., without constantly tripping a home's breakers.

If USB-PD has some way for a single supply to "overbook" its commitments, then it could model this system. Otherwise, its competition will be people plugging lots of individual USB power supplies into AC power strips, which is bulky and wasteful, but extremely functional.

