Got an Android (or other) device that never seems to pull more than 500mA, and always thinks it's charging from a USB port, even when you've got a 3rd party charger that claims to provide 1 amp or more?
Ever wonder why?
The answer is stupid, and it's basically: "Apple."
The USB spec (through 2.0) specifies a maximum per-device draw of 500mA. That's why, if you plug a device into a USB port on your computer, it'll charge slowly (or, in some cases, not at all).
Of course, when USB first became popular as a charging standard for consumer electronics (around the time of the old school iPod), this created an issue: 500mA sucks. Lots of devices need (or, at least, could utilize) much more current than that, and so these devices often ship with their own AC adapter which can provide much higher current.
But how does the device know how much current it should be pulling? Simply attempting to draw more than 500mA from a PC USB port, which is spec'd at 500mA, would be a dick move. Until then, PC manufacturers had designed their USB hardware on the assumption that nobody would draw more power, and drawing too much could do bad things (probably "just" blowing a fuse... but that's still an annoying no-no).
So, manufacturers needed a way to determine whether their devices were plugged into a "data" port (at which point they'd limit current draw to the USB spec); if not, they'd open her up and pull a higher current. And they needed this to be cheap.
Problem: there was no standard for negotiating draw higher than 500 mA.
Let's back up: a USB cable carries four wires. Two of these are dedicated to power (+5 volts/red and ground/black). The other two are the data lines (D+/green and D-/white).
Apple had an ingenious idea: what if we base our charging mode on *the voltage on the data pins*? If you detect the nominal PC voltage, you operate in "USB" mode. If you detect other voltages, you operate in... "other" modes.
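To make the idea concrete, here's a sketch of that detection logic from the device's side. Caveat: Apple never published an official spec for this, so the voltage/current pairings below are the commonly reported approximate values (resistor dividers holding D+ and D- around 2.0V or 2.7V), and the function itself is purely illustrative, not any real driver's API:

```python
# Commonly reported Apple charger signatures: (D+ volts, D- volts) -> mA.
# These are approximate, reverse-engineered values, not an official spec.
APPLE_SIGNATURES = {
    (2.0, 2.0): 500,   # "500mA" charger
    (2.0, 2.7): 1000,  # 1A (iPhone-class) charger
    (2.7, 2.0): 2100,  # 2.1A (iPad-class) charger
}

def max_draw_ma(d_plus: float, d_minus: float, tol: float = 0.25) -> int:
    """Return how much current (mA) a device might allow itself to draw,
    given the voltages it measures on the data pins."""
    for (vp, vm), ma in APPLE_SIGNATURES.items():
        if abs(d_plus - vp) <= tol and abs(d_minus - vm) <= tol:
            return ma
    return 500  # no recognized signature: fall back to the USB 2.0 limit

print(max_draw_ma(2.68, 2.05))  # iPad-style signature -> 2100
```

The key point is the fallback: anything the device doesn't recognize gets treated as a plain USB port, which is exactly why unrecognized chargers top out at 500mA.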
Problem is, nobody else did this.
Now, as USB charging was becoming really popular, manufacturers figured they needed a standard way to address the issue (this eventually became the USB Battery Charging spec). And simple it was: if the charger shorts the data pins together, the device draws the full power supported. If not, the device does the normal USB negotiation thing and limits its draw to 500mA.
But then came the iPhone. And you know what? Apple ignored the new standard, and still did their own thing (which continued with the iPad, natch).
And, today, the most popular phone in the world and the most popular tablet in the world have their own proprietary way to determine whether to limit their current draw to the USB spec.
What's a charger manufacturer to do?
Well, in most cases, the answer is: do what Apple wants. In others, it's to ignore both approaches and produce something with *neither* Apple's voltage signature *nor* the shorted data pins. And, yes, in some cases there are chargers with the data pins shorted as well.
But, you know what? It's incredibly difficult to figure out which path a particular 3rd party charger manufacturer has taken, short of scouring product reviews.
Now, I'm here to tell you, dear readers with Android devices so afflicted, who have sat through this long, sad tale: there's a solution, should your charger be one of the kind with not-shorted data pins! The solution is:
Short out the green/white data wires *in your USB cable itself* - and you can use it with any charger. Well, any charger but a PC port, for the reasons mentioned above (and, naturally, a cable with shorted data lines can't sync data anymore). Splice open the cable, connect white to green, tape it closed, and voilà. Happy sailing!
Or, you could buy the "charge only" USB cables which are already cabled thusly. But good luck with that, since even Amazon offers inconsistent results (often, these are really "normal" cables simply *labeled* as charge-only cables).