USB-C hubs and my slow descent into madness (2021)
82 comments
July 18, 2025 · kbos87
geor9e
Because the USB Consortium made a terrible mistake. Instead of speccing USB-PD power supplies to default to 5V <3A when there are no resistors in the port of the other device, the default is to do nothing. So in order to be in spec, you have to refuse to charge non-compliant ports. This means the compliant power supplies are worse, in a way. So you need to use a "dumb" USB-A power supply and a USB-A to C cable, which does default to 5V <3A no matter what. As for why some devices choose to use non-compliant ports - I assume it's extreme cheapness. They save a penny on a resistor.
cFyrute
It's not a terrible mistake. A terrible mistake would have been having such power available on ports that even a reasonable person might short out by innocently connecting a USB C cable between them.
A couple 5.1k resistors add about $0.00001 to the BOM cost. The terrible mistake is on the designers of devices who try to forego these.
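For a sense of what those 5.1k resistors actually do: a rough sketch of the CC-line voltage divider, assuming the nominal values from the Type-C spec (a source pull-up Rp to 5 V against the sink's 5.1 kΩ Rd; real sources may use current sources instead of resistor pull-ups, so treat the numbers as illustrative).

```python
RD = 5.1e3  # sink pull-down on CC, ohms (the resistor cheap devices omit)

# Rp values a source may use to advertise its current capability
RP_BY_ADVERTISEMENT = {
    "default_usb": 56e3,
    "1.5A": 22e3,
    "3.0A": 10e3,
}

def cc_voltage(rp_ohms: float, vpullup: float = 5.0) -> float:
    """Voltage the sink sees on CC with Rd fitted (simple divider)."""
    return vpullup * RD / (RD + rp_ohms)

for name, rp in RP_BY_ADVERTISEMENT.items():
    print(f"{name}: {cc_voltage(rp):.2f} V")
```

The resulting voltages (~0.42 V, ~0.94 V, ~1.69 V) fall into distinct windows, which is how a sink can tell how much current the source is offering - and why a missing Rd leaves the source unable to tell that anything is plugged in at all.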
marcosdumay
At this point I'm even surprised that compliant cables and chargers exist so the GP can have that problem.
But I believe the specs are that way to avoid problems with OTG devices. If both devices decide to just push a voltage into the cable at the same time, you risk spending a great deal of energy or even start a fire. That said, there are better ways to deal with this issue; those are just slightly more expensive.
ncann
I guess this is only partially true, as I have a A-to-C charger cable from Huawei that works with everything except my Pixel 4A phone. And my Pixel 4A phone works with everything except that specific cable.
geor9e
Maybe the cable is missing the CC pin resistors (all USB-A to C cables are supposed to have them to identify themselves as such), and maybe only the phone cares.
lxgr
Some badly designed USB-C devices don’t properly negotiate power supply, and as a result, only USB-A (since these always output 5V without any digital or electrical negotiation) or other non-compliant USB-C devices will actually charge them.
rdl
They are devices that don't do USB PD. Usually it is a USB-A to USB-C cord, and just provides 5V 500mA or higher.
Gigachad
It’s not really PD. It’s just they aren’t usb c spec compliant at all. USB-C has the power pins at 0v by default, and you have to signal there is a connected device to activate 5v. While usb-a has 5v hot all the time.
Since there aren’t any active chips in these cables, an A to C cable happens to have 5V hot on the usb c side, but this should not be relied on as it isn’t true for C to C
lxgr
PD is optional for USB-C devices, but these out of spec devices don’t even support the basic USB-C resistor-based identification scheme (which is mandatory).
jjice
I've always ignored instructions that say to only use that product's USB cord (things like my kitchen scale and water flosser) and have never had an issue. Sounds like I've just gotten lucky though, based on your experience.
I was under the impression that the USB protocol just fell back to 5V 1A when power negotiation was unsure.
What kinds of devices?
Gigachad
USB-C is 0v by default and you have to signal to get anything at all. A lot of junky devices are non compliant and aren’t set up to signal 5v so they get 0 when plugged in to a C-C cable.
orbital-decay
How does it negotiate with a host-powered device if it's unpowered to begin with?
lxgr
USB-C hosts and power adapters are only allowed to provide 5V if they can sense a downstream device (either via a network of resistors or via explicit PD negotiation).
Out-of-spec USB-C devices sometimes skip that, and out-of-spec USB-C chargers often (somewhat dangerously) always supply 5V, so the two mistakes sort of cancel out.
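A hypothetical sketch of that source-side decision (not any real controller's firmware): the source only enables VBUS when the CC voltage indicates a sink's Rd pull-down. The threshold window here approximates the spec's detection range for a default-USB Rp; exact values depend on the Rp the source uses.

```python
def should_enable_vbus(cc_volts: float) -> bool:
    """Enable 5 V only if CC is pulled into the Rd detection window."""
    RD_MIN, RD_MAX = 0.25, 0.61  # plausible window for a default-USB Rp
    return RD_MIN <= cc_volts <= RD_MAX

# Compliant sink with 5.1 kOhm Rd: divider yields ~0.42 V -> VBUS on
assert should_enable_vbus(0.42)
# Out-of-spec sink with no Rd: CC floats near the pull-up -> VBUS stays off
assert not should_enable_vbus(5.0)
```

This is exactly why an out-of-spec device sees 0 V from a compliant charger, while a charger that just blindly drives 5 V "fixes" it.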
markbao
I’ve experienced this too and it’s not just no-names. I have a wireless gaming keyboard from SteelSeries, certainly a very legit brand. I lost the original USB-C cord. Tried every USB-C cord I could find, and they power the keyboard and charge it to exactly 1%, but no more.
Found plenty of people online with the same issue but no resolution.
Finally just paid the $25 to get the OEM SteelSeries replacement cable and it charges fully again. wtf… I guess the replacement cable was USB-A to C and I’ve only tried USB-C to C cables?
mrheosuper
That's a big red flag. If their engineers won't even bother reading the USB-C documents, how can I trust them to do their job right?
buccal
Actually, in most situations with this problem it is possible to solder 2 additional resistors inside the offending USB-C device. I have done that on a flashlight and can confirm that it fixed the problem.
dwood_dev
I have purchased multiple devices like this over the years. In all cases, it is that the device doesn't have whatever circuitry is required to make a USB-C PD charger send 5V down the line. Using a USB A to C cable works every time. Ironically, chaining a C-to-A adapter and then an A-to-C cable makes it work with a USB-C charger.
colechristensen
They are likely not following the USB spec correctly. Things like pulling certain pins high or low or having a set resistance between certain pins or communications between the host and device will all affect what goes over the wire and whether the host or the device will accept this. Cables will also have some pins entirely unconnected.
Cheap, bad, shortcuts, etc. will result in an out of spec cable being necessary for an out of spec device to work correctly with an in or out of spec hub. It's terrifically frustrating but a fact of the world.
And this isn't just random no name knockoffs. The Raspberry Pi in certain versions didn't correctly follow the power spec. Both the Nintendo Switch and Switch 2 either incompletely, incorrectly, or intentionally abused the USB spec. The Lumen metabolism monitoring device doesn't follow the USB spec. This is one of those things where you want a bit of a walled garden to force users of a technology to adhere to certain rules. Especially when power and charging is involved which can cause fires.
rjh29
The Nintendo Switch PD charges from every adapter and cable I have tried. However the Switch's power brick itself won't PD charge any other device.
jamesy0ung
The charger included with the original Nintendo Switch charges my MacBook Pro at 40 watts.
jonathanlydall
More accurate to say it’s a dock than a hub, but I’m using a Dell 2427DEB monitor[0] with my Dell work notebook and a second monitor daisy chained from it on the DP out port.
My work laptop has just a single USB-C cable plugged into the monitor for everything making it super trivial to plug it back in when I use it away from my desk (which I do regularly).
My personal desktop has a single DP and also a USB A to B cable. The monitor has KVM capability so I can super conveniently switch between them even with the two screens.
Cables are completely minimized with this set up, I’m very happy.
The only thing that’s unfortunate is that I occasionally work on a MacBook Pro 16” M4 and it can’t drive the second monitor over the same USB-C cable, as Apple CBA to support DP MST even on their premium-priced hardware. So I have to also plug in an HDMI cable to the second monitor.
Also unfortunate with the MacBook Pro is that macOS UI scaling doesn’t allow ratios like 125%, meaning the UI elements aren’t quite at my preferred size. Windows 11 handles this perfectly.
[0] https://www.dell.com/en-us/shop/dell-pro-27-plus-video-confe...
donatj
I had my own descent into madness this spring.
I slowly replaced my home network piece by piece trying to find the bottleneck that was causing my gigabit internet to top out at ~300kbps in my office on the other side of the house from the modem.
After replacing the Ethernet cable run from the second floor to the basement with fiber optic... And the switches in between... And seeing no improvement... I tried a different computer with a built-in ethernet port on the same cable, and pulled 920kbps.
The problem... Was my CalDigit Thunderbolt dock. I replaced it with an OWC one from Facebook Marketplace for cheap and it solved the problem... Roughly $500 in. I'm still angry I didn't check that earlier.
My network is 10 gigabit now though.
billforsternz
I think you mean 300Mbps and 920Mbps (M not K right?)
geor9e
USB-PD hubs are very annoying. Devices with no battery on a hub (Raspberry Pi etc) will just get hard rebooted if anything else gets plugged into the same hub. I looked at a lot of hubs and they all behaved this way. They all cut power to everything, then re-negotiate the power allowance each device gets from the shared wattage, every time anything new connects. I could not find a hub that honored existing power contracts and gave new devices the remainder. My guess is the average customer expects the newest plugged-in device to get max power (at the expense of everything else), or they return it to the store thinking it's broken.
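The wished-for policy is simple to state. An illustrative sketch (not any real hub's firmware): honor existing PD contracts and offer a newly attached device only the remaining wattage, instead of renegotiating everyone from scratch.

```python
def offer_for_new_device(total_w: float, contracts: dict[str, float],
                         requested_w: float) -> float:
    """Wattage to offer a newly attached device without touching
    existing contracts. Returns 0 if nothing is left over."""
    remaining = total_w - sum(contracts.values())
    return max(0.0, min(requested_w, remaining))

# 100 W hub, a Pi holding 15 W and a laptop 65 W; a phone asks for 30 W
contracts = {"raspberry_pi": 15.0, "laptop": 65.0}
print(offer_for_new_device(100.0, contracts, 30.0))  # phone is offered 20 W
```

The trade-off, as noted below, is that the new device might get little or nothing, where full renegotiation would at least spread the budget around.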
mystifyingpoi
Unfortunately there is no real solution to this that would work in the general case. With renegotiation, the power gets cut off, but most likely every device will get an allowance and all of them will still charge. With no renegotiation, a newly plugged-in device might not charge at all. Not sure what's worse.
sciencerobot
I’m not sure if it’s a USB-PD hub but Jeff Geerling posted a video about a USB-C charger that doesn’t suffer from the renegotiation issue https://m.youtube.com/watch?v=dG2v4FHwJjE
k8sToGo
Not the case anymore with more modern chargers e.g. Anker GaN
x-complexity
Previous discussion (520 comments): https://news.ycombinator.com/item?id=30911598
tomhow
Thanks!
USB-C hubs and my slow descent into madness (2021) - https://news.ycombinator.com/item?id=30911598 - April 2022 (513 comments)
whatever1
The problem that I have is that I have a ton of usb cords in my drawer. I DONT KNOW WHAT THEY ARE CAPABLE OF.
Is it a charge only cable (no data)? Is it usb3 5gbps ? Is it 100 watt power delivery? Is it thunderbolt 3/4/5?
CharlesW
You need a tester like the FNIRSI FNB58 (not an affiliate link: https://www.amazon.com/gp/product/B0BJ253W31). This is just an example and not a personal recommendation, as I've just started looking into these myself.
Gigachad
Thunderbolt cables have always been marked either on the ends or with a sticker wrapped around the cable. Everything else can be assumed to be a 2.0 cable at 60w/140w
ethan_smith
Colored electrical tape or heat shrink labels at both ends of each cable with a simple coding system (P=Power delivery wattage, D=Data speed, T=Thunderbolt) solves this problem permanently.
ianburrell
The USB IF really should have specified some labeling for cables. The icons used on USB-A connectors are too complicated. What I think would work well is colored rings, with some for the different USB3 speeds, and some for the 100W and 240W charging.
crote
They did! [0] The problem is that the vast majority of manufacturers have chosen to just completely ignore it.
[0]: https://www.usb.org/sites/default/files/usb_type-c_cable_log...
whatever1
Really what we need as consumers are just 2 numbers: X GBps | Y Watts. I don’t need to know what freaking protocol it uses under the hood.
ncann
You can test them to a certain extent using a USB tester device like RYKEN RK-X3
miek
I printed details on labels using a Brother label printer and then attached them to one end of each cable.
alanbernstein
Details you found using a tester? I label some USB cables, but without a tester there is a limit to how much I know about them.
rr808
Not strictly related but I just bought a USB4 USB-C cable which is rated at 40 Gbps. I still can't really believe it. (I still remember saving to cassette tape)
cyral
I use one of these for two 4k monitors, sound, ethernet, mouse, webcam, AND charging. It's amazing having one cable to plug in when I work from my desk. Unfortunately requires one of those $400 docks though.
gleenn
Have you tried to benchmark it at all?
rr808
I just have the cable. I don't have a computer or a device that can transfer that quickly. Won't be long though! Actually it looks like those 6K monitors @60Hz will fill the pipe.
jeffbee
Do you doubt that they work? You can demonstrate it to yourself pretty easily with 2 computers and iperf.
acheron9383
Man that is a lot of computer to put into that product for as little money as possible. I'm not intending to excuse the products with likely bad firmware causing most of these issues, especially Ethernet PHYs. Though, in my professional experience doing embedded device firmware, Ethernet PHYs are always the biggest pain, hands down. The firmware included with them has many many ways to configure it wrong, and the defaults are typically just random settings, not a reasonable configuration. Just getting the drivers even running sometimes involves just knowing a few tricks. Anyways, it doesn't surprise me many have trouble working right, especially when they indicate they are all running OEM firmware essentially.
userbinator
> However, I can’t help but feel a little bit cheated by companies just buying off-the-shelf products, slightly modifying the case layout, and then quadrupling the price because it’s “from a reputable company”.
LOL. Welcome to the world of OEM/ODM. As a conservative estimate I'd guess >95% of all consumer electronics is done this way. Even the big names like Apple, Dell, Lenovo, etc. do it.
> However, if you are - according to Wikipedia - a two-billion-dollar company like Realtek, then I expect you to get your shit together. There are exactly zero excuses for Realtek to not have a driver release ready almost a year after Big Sur has been announced. Zero.
Mac users are in the minority. It's worth noting that the RTL8153 is a native RNDIS device, which has its history on the Windows side, and Realtek has only started contributing drivers to Linux relatively recently.
FWIW I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
0manrho
> I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
I envy you. Realtek NICs (especially over USB) are tantamount to trash in my mind after 2 decades of fighting their frequent failures. Be it failure to initialize at all, driver crashes, piss-poor feature sets (or claiming to have features that don't work at all), and a myriad of other problems. Granted, they usually work in Windows, but I don't work in Windows (I live and work in Linux/BSD). It has become my personal policy to avoid/disable any Realtek NICs and replace them with something actually functional whenever possible.
Hopefully their work on linux-side drivers will change this given their proliferation.
To be honest I've yet to find a reliable USB based network interface regardless of chipset/brand/manufacturer, outside of the ones that do PCIe passthrough via USB4/Thunderbolt and those tend to be quite expensive (though they are starting to come down in price).
userbinator
I suspect a lot of the flakiness is not the chip itself but the fact that, because it's cheap, the bottom-of-the-barrel manufacturers will always use it instead of the significantly more expensive alternatives, and then further cut corners with the board layout and design.
Ironically, the only problems I've had with NICs were on an Intel and a Broadcom.
bigstrat2003
Unfortunately I can't say I'm surprised that the common thread was Realtek network chips. I've found their NICs to be fairly flaky over the years - the majority work, but a solid minority don't. In contrast, Intel NICs have been bulletproof for me and I seek them out whenever I have any choice in the matter.
internet2000
Looks like he only bought cheaper things, so no wonder they all eventually died. My USB-C hub is an HP Thunderbolt dock. It's beefy as heck, lasted for years with no issues. It has a tiny fan inside, which I assume helps with the longevity. I hear good things about CalDigit docks too. Those also are very expensive.
noisy_boy
Expensive doesn't equal great either. My ThinkPad thunderbolt hub, which is not cheap by any standards, can't route HDMI without randomly restarting itself every few minutes. Connecting the same cable directly to my ThinkPad laptop works perfectly fine. Sort of defeats the whole point of buying a hub. I have sort of given up on hopes of getting a high quality hub - it's a money sink
bigstrat2003
I agree that if you buy cheap devices you shouldn't expect them to last. But the first device was almost $100, which I certainly wouldn't call cheap.
mrheosuper
$70 for a hub is not cheap at all.
Gigachad
I’ve got a few of the Apple ones because every new job just gives you one, and they have always worked in every way in every device I’ve used them on.
Yeah they cost more but they actually work properly.
daveidol
Apple ones? Like this? https://www.apple.com/shop/product/MW5M3AM/A/usb-c-digital-a...
Gigachad
Yep that one. I guess they don’t have everything, but the everything hubs tend to come with a lot of confusing pitfalls. I use that one mostly with my Steam Deck.
daft_pink
I’ve also given up on USB hubs and I’m using a Thunderbolt 4 dock to get more IO out of my Mac Studio. It feels crazy to spend that much $’s, but it solved my problems.
Can anyone tell me why I have several devices in my home that demand a certain USB-C cord in order to charge? They are mostly cheap Chinese devices that won’t acknowledge a more expensive (e.g., Apple) USB-C cord plugged into them. Even when plugged into the same transformer. They only charge with the cheap USB-C cord they came with. What gives?