You didn't read the latest guidelines? They specifically ban [redacted] unless you're arguing that [redacted] didn't have anything to do with [redacted]. Unless of course, you've accepted money from [redacted], then you are welcome to criticize [redacted] as long as you mark posts as promotional.
It's really pretty easy. Just visit: [redacted] for more details.
This resonates with me. I use caution now, as much security (and cryptography) advice on Stack Overflow is wrong and perpetuates harmful patterns, yet is never taken down. What’s worse? It often shows up at the top of Google. Many devs don’t show the same level of caution until something blows up.
Heh, just this week I was struggling to get a device to work with my laptop, until after 90+ minutes I realized there were different types of USB-C, and the one I had (the Apple charger cable) only did charging.
I needed one that transmitted video, so I went to order a $10 USB-C cable, only to realize those also only did charging. I finally found what I needed, but I thought USB-C was finally "universal"; turns out much of that is just marketing...
Edit: Turns out the one I just bought ($40 USB-C Apple thunderbolt cable) doesn't even fit into the device because the edges are too thick. Ridiculous
I think it's one of those things where it sounds good as an idea but breaks down in implementation.
"Let's design a port/interconnect standard that does everything" is great in concept, but one manufacturer or another is going to leave out bits in the cable to keep costs down or not implement something in their device-side firmware. You end up with the situation we have now where there's insane levels of fragmentation, and you can never be quite sure whether two things are going to work together or not.
Clear branding would help with a lot of it. E.g. mark dumb cables that only do charging as USB-PWR while full feature cables are USB 50 or USB 100, for example, depending on data rate.
Which only covers a couple of the dozen different use cases, now covered by USB. Which is why thunderbolt and the like would require yet another marking.
All this nullifies the point of having a single port+cable.
If you're going to mark it, why not just modify the cable to have different keys: one for charging, one for higher speed, etc.? Then at least you know right away it's not going to work.
But for that matter, once you have broken plug & port compatibility you might as well have just used different ports. Because that is what you have; they just look similar enough to cause confusion.
The whole thing is just a false set of choices, brought on in large part by the same industry (mobile phones) that couldn't be bothered to make its parts compatible with actual standards. It's doubtful you will ever see any of those manufacturers build a fully compliant USB part either, since they have regularly proven unable to do it even with simpler standards. This despite charging top dollar for parts built with the cheapest designs.
Did micro USB have different keyings for different power levels? No, we just plugged in our device and it charged at the rate the charger could support, and if you needed fast you found a fast charger.
Broadly speaking faster charging and faster speeds are just... faster. The beauty of USB is that you can plug into a billion different chargers around the world, anywhere, and while the charge rates might vary, you can charge up anywhere. Keying breaks that.
There is only one power spec for usb cables before type C. Anything beyond that was a proprietary variant that, yep, wasn’t always obvious upon inspection.
Sorting by speed, plus an icon for higher power cables, covers almost all real world uses. It would not have to be complex, and would not nullify the point of a single port/cable.
Even just marking speed would mean a cable never unexpectedly fails to do its job.
Worth mentioning that USB Type C is the plug and it is indeed universal. The protocols are a whole different matter and finding a good cable is a challenge.
They should've just color coded the connectors to capabilities... Like with USB 1/2/3 as black-white/blue/red
Now they had a new form so they could've just used black for charging, blue for some data and red for display. I guess that ship sailed already though.
That's true: we swapped physically visible complexity for invisible software complexity. One set of wires and one connector, but 24 different sets of features depending on who knows what.
Fortunately, time will likely sort it out. We had similar issues with the original USB at the beginning (when you actually needed an extra PCI card to support it). Now USB connections "just work". Hopefully the same will happen with USB-C.
It’s been 6 years since the 2015 single-USB-C MacBook came out; patience is wearing thin. Also, I don’t recall the original USB having widespread problems with non-compliant cables that can fry your device or burn down your house. And of course there still aren’t any true USB-C hubs available.
All this would be even worse without the tireless efforts of Benson Leung and his merry band of USB-C avengers.
Yes, and unlikely to change unless the USB4 spec mandates anything. And so far that doesn't seem to be happening.
i.e., there is no real solution in sight, and USB-C will remain the same for the next 3-5 years. And yet proponents are still defending it. It baffles me. Why can't we just admit mistakes and walk back to a simpler solution?
There was only a very short time with hardware issues on USB. What really plagued it in the beginning was that Plug and Play (the big selling point for USB 1.0) was really mostly Plug and Pray.
Not quite - I use a wired Apple keyboard (otherwise a great keyboard; it feels way better than the Magic Keyboard, and the USB port on the side connects to my mouse) that doesn’t work on my MacBook unless I use the extension cord; on another MacBook it works without. It might have something to do with the NVRAM battery being weak, but I didn’t have the time to explore further. [1]
In theory we should be able to get rid of cables, but that is also a messed up standard, I’d rather connect my keyboard that extends to the mouse than switch devices with Bluetooth.
It doesn't just work. If I get a wireless keyboard and mouse that "just works" in theory but in practice does not "just work" always, then that's a load of hassle that far outweighs any advantages it may have.
In essence, the drawback is the lack of trust, because the promises of wireless that "just works" have been repeatedly made and broken. Fool me once, shame on you, fool me twice... I won't allow myself to be fooled again by such promises.
So even if you do have a system that just works, I'd refuse it on principle (because I simply won't believe your claims, no matter what you say) until at least multiple years have passed with it being widely used, resulting in a general consensus that it really does just work in all cases, without, e.g., a 1% minority describing all the edge cases they encountered where it turns out it does not actually "just work".
>I won't allow myself to be fooled again by such promises
I get the sentiment, but it's kind of silly in practice. Things improve almost constantly.
Just trying to say I have been using a wireless mouse and keyboard for years, and they have never given me any trouble. In fact, even less trouble than wired peripherals have given me over the years. Maybe you should bend your rule and try out some modern stuff, if your previous experience was not with a modern device.
They don't "just work". Whether it's Logitech or Apple mouse, my cat laying in front of the pad breaks the connection. (Not enough to disconnect, just adds jitter)
When playing games, wireless keyboard/mouse introduce enough latency that twitchy platformers or competitive shooters are impacted. (Unless you buy super expensive game-oriented device)
Bluetooth devices are terrible... by design because of Bluetooth and its problems. Special dongles don't function well when plugged in on the back of monitors and sometimes need manual "replug" to register after a reboot.
I hope this doesn't get treated as an advertisement since I'm not affiliated -- but do try Logitech G900 (also branded as Chaos Spectrum) or G903 mice.
My wife played competitive Overwatch for a few months some years ago and she said the mouse feels exactly as a wired device. I play Quake every now and then and can confirm the same. Best mouse we ever used, we have 4 at home.
It has the added benefit that it disables its wireless circuitry when you plug it in -- it both recharges itself and becomes a wired mouse.
(Yes, both mice can be viewed as expensive. But we took the plunge to invest in reliable periphery and have only been happy with our investment.)
Wireless is subject to interference and introduces new security concerns. Wireless devices run on batteries, which you then have to replace. They tend to cost more. That's a lot of trouble to avoid a two-foot USB cable.
Latency -- You'll never get a wireless mouse faster than fiber-optic, since the computer screen would interfere with optical wireless transmission ;)
I jest. The real theoretical issue is pairing (portability). I can't think of a wireless solution that provides all three of:
1. No dongle. I don't want an extra thing to keep track of.
2. Fast and convenient pairing ("plug" and play). Connecting to a different computer should not be a hassle.
3. Secure. Device must only pair with computers of my choosing.
Dongles, WPS buttons, and bluetooth number confirmation are existing solutions that provide any two. A physical plug combines "connect" and "authorize" steps, neatly sidestepping the issue.
The closest thing I could think of to solve this would be a standard, as ubiquitous as USB, for wireless connection: each device has a code that serves as protocol negotiation, identification, and authorization. So you open up a "connect a device" dialog on the computer, type in your device's code, and the computer automatically discovers the device and pairs with it. But even this is a compromise on both usability (another thing to remember...) and/or security (it could be printed on the device, like a serial number, but then...).
And of course there are the myriad practical issues mentioned elsewhere in the thread. I think if they were (truly) solved, I would buy wireless devices for my home setup, to do away with cable management, and keep cheap wired ones for use with other people's computers. I don't anticipate this happening for at least a decade or two.
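To make the idea concrete, here's a toy sketch of that code-based pairing scheme. Everything in it (the key derivation, the challenge/response shape, the device names) is invented for illustration; the point is just that one typed code can serve as both discovery identifier and authorization secret, so pairing needs no dongle and no extra button.

```python
# Toy sketch of code-based pairing: the code the user types in
# identifies the device and doubles as the shared secret.
# All names here are hypothetical, not any real protocol.

import hashlib
import hmac

def derive_key(device_code: str) -> bytes:
    # A real design would use a proper KDF with a salt; a bare
    # hash is enough to show the shape of the idea.
    return hashlib.sha256(device_code.encode()).digest()

class Device:
    """Stands in for a wireless peripheral with a printed code."""
    def __init__(self, device_code: str):
        self.key = derive_key(device_code)

    def respond(self, challenge: bytes) -> bytes:
        # Device proves it knows the key without sending it.
        return hmac.new(self.key, challenge, hashlib.sha256).digest()

def pair(typed_code: str, device: Device, challenge: bytes = b"nonce") -> bool:
    # The computer derives the same key from what the user typed
    # and checks the device's response against its own expectation.
    expected = hmac.new(derive_key(typed_code), challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, device.respond(challenge))

mouse = Device("MOUSE-1234")
print(pair("MOUSE-1234", mouse))  # True: correct code pairs
print(pair("MOUSE-9999", mouse))  # False: wrong code is refused
```

Note the tradeoff the parent comment mentions survives even in the sketch: if the code is printed on the device, anyone with physical access can pair; if it isn't, it's one more thing to remember.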
I may be misunderstanding your point, but couldn't the USB-based pairing that Apple uses for their keyboards and mice solve functionally all of these issues?
There's no dongle (they're bluetooth) and pairing is dead simple (plug it in once and it's available everywhere). I don't think I've ever seen a security analysis of their peripherals, but the potential for a mitm attack on the keyboard seems like it would be equal to or lower than a wired version.
This assumes, of course, that you're using Apple products across the board but it is a solution nonetheless.
D'oh! Yes, if I understand correctly (I don't use Apple products[0], so this is the first I'm hearing of it), that would solve the issue, since USB cables are common enough that I wouldn't need to bring one with me just for pairing. I'll still be sticking to wired for practical reasons, for the time being, but maybe those will be fixed sooner than I expect :)
[0] I swore off of walled gardens after an unsavory experience regarding EOL of the 1st gen iPod touch — Apple effectively bricked my working hardware by taking all apps for it off the app store.
It’s not practical when you have to do it several times a day and having a cable lying around and having to connect it sort of defies the purpose of a wireless keyboard/mouse.
Same applies to Bluetooth speakers, by the way; pairing is just horribly annoying. I wish they’d replace the volume buttons with a dedicated pairing button instead of having a different way to pair on each device (do I hold the volume or that other button? Was it 3 or 15 seconds?)
I must admit having several computers is a bit of an edge case, but still.
For a keyboard, wireless buys me nothing but introduces a number of new potential issues.
For a mouse, wireless buys you some convenience and freedom of movement. However (and this is slightly niche so it really doesn’t apply in general) I use a trackball, so the wireless advantages are 100% negated.
Being able to move around (several monitors) and easy use of a standing desk are some advantages. I have a workstation under my desk and a laptop that I frequently hook up to my monitors.
Philosophically, wireless is just going to be more complicated than wired. Say I'm sitting in front of 4 computers with 1 keyboard. Without interacting with the system at all, which computer is connected to the keyboard? With wireless its difficult/impossible to tell whereas it's fairly clear with wired.
It is! In engineering, as discrete from science, the point of theory is to know what is going to happen in practice. If theory can't tell you, then it's useless for practical purposes. If the theory is ignoring important aspects of reality that make it useless, then what's the point of theory?
E.g. in theory we can move faster than light; we just need a source of infinite energy. In practice, "infinite" is an impossibly large amount of energy, so any theories that allow for faster-than-light travel are only interesting for entertainment/theoretical purposes.
For wireless connections, the gulf between theory and practice is just too large, so theory remains theoretical.
It's more stuff that can go wrong. I understand why you would want a wired mouse for a weird setup where you don't want a wire from the mouse or /can't/ have a wire from the mouse to the machine. But I don't see the need just like I don't see a need for wireless charging or wireless hdmi or wireless southbridge.
Wireless HDMI: though it’s technically not the same, think of Chromecast. I think it’s nice to be able to stream stuff on the TV without the need to physically connect it.
Since power cables are still the easiest to find in a normal household (compared to any cables that carry data), I’d argue that anything except for these can have at least a little bit of value if it’s wireless and done in a convenient way.
The problem is unnecessarily occupying wireless spectrum for devices that can use a cable.
You can make a wireless monitor. I expect it to work just fine in a home environment. I expect it to fail in all kinds of crazy ways in an office with dozens of these monitors side by side among other devices also using wireless communications like phones.
It can stop working in the middle of a live presentation or in the middle of a game. My mouse is excellent, I love it, but I keep a magnetic USB cable next to the mouse pad for the times when I get a warning 10 seconds before it shuts down.
Maybe I am a stupid layman but isn't that mouse pad basically a low power induction cooker? Am I superstitious to not want to keep my hand on an induction cooker for hours per day?
Potential interference, crowded spectrum, battery required, extra weight due to the presence of battery, and if the encryption is weak then you have a signal broadcasting all that you type in the vicinity which could be a security nightmare.
I didn't realize people had such low expectations for "just works". Getting a reliable signal for such a short range and low bandwidth is obviously possible, as is using proper encryption. Batteries can be light, with the weight compensated for, and charge wirelessly.
The reason I said "just works" was to avoid an exhaustive list of issues that have already been solved in other products, or could be solved with some effort.
Just wondering: by Apple Keyboard, do you mean the old scissor-switch keyboard? And by Magic Keyboard, are you referring to the new keyboard on the MacBook Pro 16"?
The solution to this is a "self test" every time a device is plugged in.
Each device should run a full test suite of the cable and the device at the other end, and if any fail, it should refuse to work.
Part of the test suite should be checking that the device at the other end of the cable is also running the test suite.
Everything should be tested - for example if USB can transmit video over certain pins, the test suite should involve sending a frame of video, even if your device is a usb stick and doesn't need video. That way nobody can leave out bits of the spec.
USB is insanely fast, so thousands of tests should be doable in under a second, and in fact if the other device can't pass the tests quick enough that should be reason for failure.
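A minimal sketch of what that plug-in handshake might look like. All the class and capability names here are made up for illustration (no real USB stack exposes an API like this); the point is the all-or-nothing rule: probe every capability in the spec, and refuse the link if any single probe fails.

```python
# Hypothetical plug-in self-test: probe every spec capability and
# refuse the connection unless all of them pass. Names are invented.

REQUIRED_CAPABILITIES = ["power", "usb2_data", "usb3_data", "alt_mode_video"]

class FakeLink:
    """Stands in for a freshly plugged-in cable plus peer device."""
    def __init__(self, supported):
        self.supported = set(supported)

    def probe(self, capability: str) -> bool:
        # A real test would push actual traffic (e.g. one video frame)
        # over the relevant pins and check it arrives intact, within
        # a deadline. Here we just look the capability up.
        return capability in self.supported

def self_test(link: FakeLink):
    """Return (passed, failures): passed only if nothing failed."""
    failures = [c for c in REQUIRED_CAPABILITIES if not link.probe(c)]
    return (len(failures) == 0, failures)

full_cable = FakeLink(["power", "usb2_data", "usb3_data", "alt_mode_video"])
charge_only = FakeLink(["power"])

print(self_test(full_cable))   # (True, [])
print(self_test(charge_only))  # (False, ['usb2_data', 'usb3_data', 'alt_mode_video'])
```

The interesting design choice is that even the flash drive that never needs video still has to pass the video probe; that's what removes the incentive to ship cut-down cables in the first place.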
Who would buy, e.g., a laptop that refuses to work with most existing cables / devices? What do you gain from having a flash drive fail to work because you can't send video over it?
I can see the value of having this test suite for you to run personally, when you want to test. Or having someone certify capabilities and publish the test results for a particular piece of hardware.
But I can't imagine anyone (especially a non-tech-savvy person) using it by default and without an escape hatch - it should "just work".
I think the point is that if every device did this from the start, there wouldn't be a market for anything but cables that work for everything (because a cable that doesn't work for everything then works for nothing), so all cables would just work, for everything.
Looked at the other way, who would buy a faulty cable that doesn't meet this self-test specification? That would be a feature imo. Cable makers will need to all start meeting the specification very quickly or nobody will buy their cable.
It always seems like the drivers or something could report what the cable supports, or whether a signal is being detected.
I had a ton of octopus-cable swag and took a bunch on a trip without testing them, and found out none of them charged my iPhone! Frustrating to say the least!
I bought a bunch of Micro USB cables for the same reason. Most of the ones I had on hand didn't have data lines as they were only for charging, which can be fun to troubleshoot as devices connected to the cables just don't show up.
Some of them will let you trick the phone into charging at a full 2 or 3 amps, even though you've plugged it into a laptop or something that would normally show "charging slowly".
Obviously you're relying on the laptop circuitry to not burn out, but laptops are typically designed with very robust USB circuitry because people often short them out by putting in broken cables.
That's strange; I've never encountered a USB-C cable that did not support data at all. My understanding (and the article suggests the same) is that charging and data work in general, just not as fast as the device can support. Basically they will charge or transfer data at regular USB or micro-USB speeds.
> Compared with Apple USB-C Charge Cable
The Apple USB-C Charge Cable is longer (2m) and also supports charging, but data-transfer speed is limited to 480Mbps (USB 2.0) and it doesn't support video. The Apple Thunderbolt 3 (USB-C) cable has Thunderbolt logo on the sleeve of each connector. Either cable can be used with the Apple USB-C Power Adapter.
I definitely have encountered this. One was even a device intended for development via USB. For some strange reason, the USB-C cable shipped with the product only supported power, requiring the customer to buy another cable for data transfer.
Just to clarify, are you talking about a usb-c to usb-a cable, or a usb-c to usb-c cable? I think the usb-c to usb-a cables are just temporary while we get to usb-c everything, and so are not implemented that well. With usb-c to usb-c cables, I've never had a problem with power or charging.
I’ve amassed quite a collection of charging-only Type-C to Type-C cables over the years. People usually don’t realize they have such cables because they typically come alongside a charger, so people only ever attempt to charge devices with them.
Grab a USB-C to USB-C cable that came with a phone or similar device and was intended for use with a USB-PD charger and give it a test. You might get USB 2.0 speeds if you’re lucky, but you’re unlikely to see anything beyond that.
> Even more fun, many of them don't properly list what power they can handle and fry your device or catch fire if you put too much through them!
That doesn't make sense. Any cable will be capable of many more volts than USB will ever put through it. How is anything going to get fried?
I suppose a cable could get too warm, but lying about capacity only takes you from 3 to 5 amps. That little bit extra should never be enough to cause a fire. And if it can't even handle 3, then the problem was not that it was lying about capacity. It also costs extra money to lie about being a 5 amp cable since that requires a chip.
Are you talking about the thing where sending 9 volts on a data pin fries the switch? I don't blame the spec or Nintendo for that one.
That cable is more of a complexity problem, but it wasn't because it misrepresented capabilities or anything. They put in a chip which didn't reset the connection when you unplugged one end. I don't know if that's really a spec problem, though.
The first MacBook model with USB-C shipped with a USB 2.0 cable for its charger that only supported charging. I don't know if this is still true, though (I haven't checked the one that came with my newest one).
> Compared with Apple USB-C Charge Cable
The Apple USB-C Charge Cable is longer (2m) and also supports charging, but data-transfer speed is limited to 480Mbps (USB 2.0) and it doesn't support video. The Apple Thunderbolt 3 (USB-C) cable has Thunderbolt logo on the sleeve of each connector. Either cable can be used with the Apple USB-C Power Adapter.
I haven't seen one either but I can easily believe it, as there are proprietary magnetic USB-C adapters that do not support data (they only have 5 or 6 pins). I can imagine a "normal" cable behaving the same way.
I have a somewhat similar problem, where my nvme drive enclosure works with the USB-C port on one side of my laptop but not with the other - but both ports should be identical spec wise. Nothing makes sense any more.
I have a MIDI piano that takes USB 2 Type B. I bought a B-to-C cable and it works fine with my laptop and desktop. I decided I needed an extension cable (for connecting a microphone) and I opted for USB-C.
The piano's cable doesn't work with the extension, unless you connect it the right way 'round. USB-C shouldn't need a right way 'round...
And yes, even when you get a USB-C cable that does data, it could be a cable that only does USB 2.0 speed, or USB 3.1 speed but not USB 3.2 2x2 speed. And with USB4 there will be an additional layer with Thunderbolt 4.
I read through the Penrose presentation slides a few weeks ago and was very impressed with the perspective, idea, and implementation compared to the dozens of other tools I've looked at. I'm not particularly a math specialist, but I do think I'll use Penrose soon. However, it might not immediately fit my needs today.
The presentation made it appear that this was an initial implementation, and that more progress will be needed to fit more use-cases, which is great! Hopefully I can contribute.
I would have to agree. Unless they only interviewed the people who saw a targeted campaign of just that fake info, and then asked how many of them believed it. But broadly, there's no way that's an accurate percentage.
> I'm not sure using a touch screen in a high-g environment is a good idea.
Agreed. All of the events during this time are fully automatic. If the astronauts needed to personally abort for a reason that ground control did not see, I saw what appeared to be a physical abort switch that needed to be turned and pulled (but I could be wrong).
> they have unprecedented ability to forecast future events.
Per Matt Levine's "anything can be securities fraud if you don't tell your shareholders about it", then having the "unprecedented ability to forecast future events" and not informing your shareholders about it could be argued as securities fraud in court. Something to think about.