Electrical Dumbass-Check

Sep 28, 2007 18:19

I have a safety question for the electronically-minded among you; I dearly want to go through with a particular test (as I'm hungry for knowledge of the world), but I have a strong suspicion that I might set a multimeter on fire in the process.



Comments 22

jeffspender September 29 2007, 02:31:27 UTC
I should be able to give you a fairly unambiguous answer to that question when scanwidget gets home in like 20 mins. ;)


jeffspender September 29 2007, 02:59:43 UTC
scanwidget says:

Your test should be fine, assuming you don't short your multimeter leads together while trying to get the measurement :)

The current rating (10A in your case) on the meter is only applicable when that current is actually flowing through the meter. When measuring voltage, the meter is in a parallel circuit with your load. Both the meter and the load see the same voltage, but since the meter in voltage mode has such a high resistance (megaohms or so) only a very tiny amount of current will flow through the meter itself.
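
To put rough numbers on that (assuming a typical ~10 megaohm input impedance in voltage mode; the exact figure would be in the meter's manual):

    # Current the meter itself draws while reading a 12 V rail,
    # assuming a ~10 megaohm input impedance (check the meter's spec sheet).
    V_rail = 12.0        # volts
    R_meter = 10e6       # ohms, assumed voltmeter input impedance
    I_meter = V_rail / R_meter
    print(f"Meter draws about {I_meter * 1e6:.1f} microamps")  # ~1.2 uA, nowhere near 10 A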

Now, if you were trying to measure the current into your PC that would be a different story. You'd need either a beefier multimeter, or you'd need a shunt resistor that you could measure the voltage across.
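
For the current-measuring case, the shunt arithmetic would look something like this (the 0.01 ohm value is just an example, not a recommendation):

    # Hypothetical shunt: infer current from the voltage drop across a
    # known low-value resistor placed in series with the load.
    R_shunt = 0.01                   # ohms (example value)
    V_across_shunt = 0.15            # volts, what the meter would read
    I_load = V_across_shunt / R_shunt
    P_shunt = I_load**2 * R_shunt    # the resistor has to dissipate this
    print(f"Load current ~{I_load:.1f} A, shunt dissipates ~{P_shunt:.2f} W")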


willskyfall September 29 2007, 21:50:38 UTC
Thanks!

I'm still trying to gauge whether or not it's a Good Idea (given others' objections), but if I can assure myself that I'm taking necessary precautions, I'll totally give it a try.

. . . the voltage-testing, I mean, not the sticking the multimeter in a wall socket. :)



partly_cloudy September 29 2007, 02:33:53 UTC
Dunno, but to me it sounds like for (potentially) $20 and in less than 1 second, you can find out whether or not you need to give up and track down the amperage requirements on every piece of computer hardware you own. :P


willskyfall September 29 2007, 21:36:15 UTC
Heh. :) Did I mention I'm using (mostly) junk hardware to test this particular theory out?

(Though admittedly, I'm primarily using junk hardware because I intend to hook it up to a power supply that's not beefy enough for it, and Interesting Things might happen; though I'm sure that "sparks from multimeter probes fry running computer" would be a dramatic alternate option.)



ziqueenmab September 29 2007, 02:48:14 UTC
A common-sense observation from someone with experience only in non-XTREEM multimeter use:

A multimeter set to measure voltage will have a very high internal resistance: it's connected in parallel across the thing you're measuring, so any current diverted through the meter would disturb the very voltage you're trying to read. A multimeter set to measure current will have a very low internal resistance: it's connected in series with things, so any resistance inside it drops some voltage, which in turn can affect the circuit you're measuring. So it stands to reason that the multimeter innards you're working with (and possibly pushing the limits of) depend on what you're set to measure.

So the 10A cap on measuring current probably doesn't apply so strongly to measuring voltage. I wouldn't take this as a "totally go ahead and do it", though, since 20A is still a lot of current in my mind.
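
To put hypothetical numbers on the series case (guessing that the meter's 10A range uses something like a 0.01 ohm internal shunt; that's an assumption, not a spec):

    # Why the current range has a hard limit: the internal shunt drops
    # voltage and dissipates heat in proportion to the current through it.
    R_internal = 0.01                  # ohms, assumed shunt on the 10 A range
    for amps in (1, 10, 20):
        burden = amps * R_internal     # volts lost across the meter
        heat = amps**2 * R_internal    # watts dissipated inside the meter
        print(f"{amps:>2} A: burden {burden:.2f} V, dissipation {heat:.2f} W")

Going from 10A to 20A quadruples the heat inside the meter, which is roughly why the rating (and the fuse) is there.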


ywalme September 29 2007, 03:10:06 UTC
True. To get quantitative, I just looked up the internal resistance of present-day multimeters, and it seems to be on the order of megaohms. So unless the resistive element you're measuring across has comparable resistance, you shouldn't be drawing much current in voltmeter mode. And if you're measuring across the internal resistance of the supply itself, power supplies tend to have very low internal resistance -- I pegged a lab supply at a measly 37 ohms last semester -- so unless something is very wrong, your power supply's resistance should be nowhere near your multimeter's. Still, you should check on this. (Especially since I've never done this on a computer power supply, and I don't know exactly how their innards are set up ( ... )
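
For a concrete sense of the loading error, the source's internal resistance and the meter's input resistance form a voltage divider; a quick check using the 37-ohm figure above and an assumed 10-megaohm meter:

    # Loading error: the meter reads the true voltage scaled by the divider
    # formed by the source resistance and the meter's input resistance.
    R_source = 37.0     # ohms, the lab-supply figure mentioned above
    R_meter = 10e6      # ohms, assumed voltmeter input impedance
    V_true = 12.0       # volts
    V_read = V_true * R_meter / (R_meter + R_source)
    error_pct = 100 * (V_true - V_read) / V_true
    print(f"Reading {V_read:.5f} V, error {error_pct:.5f} %")  # utterly negligible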


willskyfall September 29 2007, 22:01:52 UTC
If you really feel you must do it, at least keep one hand in your pocket.

I can do that. It might make placing the multimeter probes a little more awkward but I definitely understand the sentiment.

I know that "it's the current that kills" and all that, and I understand that 20 A is a pretty ridiculous value compared to other systems, but wouldn't the fairly low voltages I'm testing (5 V, 12 V) make things a bit safer?

This may be an abuse of Ohm's Law, but if the power supply can push a high current at such a low voltage, it's because its internal resistance is minuscule. Would the resistance of human skin be enough to significantly cut the current if something went wrong?

. . . maybe I should wear rubber gloves too. :)


No real help here, just a data point
ziqueenmab September 29 2007, 22:17:37 UTC
I've seen a person's resistance (measured with one multimeter probe held in each hand) vary between the low tens of thousands and ~1 million ohms, probably mostly dependent on the clamminess of the person's skin.
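
Plugging those figures into Ohm's law at the voltages in question gives only a rough illustration (dry-skin numbers, not a safety rating):

    # Rough current through a body at PSU voltages, using the resistance
    # range quoted above. Illustration only, not a safety calculation.
    V = 12.0                              # volts, the highest rail in question
    for R_body in (20e3, 100e3, 1e6):     # ohms: low tens of thousands up to ~1M
        I_mA = V / R_body * 1000
        print(f"{R_body / 1000:>5.0f} kilohm: {I_mA:.3f} mA")

All of those come out as fractions of a milliamp, which is reassuring but not a license to be careless, since resistance drops a lot with sweat, cuts, or a better contact path.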



camlost September 29 2007, 03:01:26 UTC
No expert here, but I agree with ziqueenmab. Generally speaking, a multimeter has two jacks for the positive lead: one for most measurements (including voltage), and another for amperage. The former path has a very high resistance, the latter very little. While the power supply is delivering 10-20A to the load, almost none of that should be flowing through the multimeter, because of Ohm's law. If you were to measure the output of a single lead, you'd put the multimeter in series with the circuit, and then you'd have to worry about a maximum amperage rating.


willskyfall September 29 2007, 21:42:40 UTC
If you were to measure the output of a single lead, you'd put the multimeter in series with the circuit, and then you'd have to worry about a maximum amperage rating.

Is it a problem that this thread is full of statements (like the above) that make my brain joke, "Leeeeeeeroyyy . . ."?



gdarklighter September 29 2007, 05:48:01 UTC
My advice, which I'm sure you'll find unpleasant, is not to go poking around in these things. Find the power ratings of all your equipment, and then buy a power supply that gives you a margin of safety on that number. Poking around a running piece of computer hardware without appropriate probe points is generally unwise.
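
If you do go the spec-sheet route, the arithmetic is just a sum plus headroom; a sketch with made-up component numbers:

    # Sizing a supply from component ratings: add up worst-case draws,
    # then leave headroom. The wattages below are placeholders, not real specs.
    components_w = {"CPU": 65, "GPU": 75, "motherboard": 30,
                    "drives": 20, "fans_misc": 10}
    total = sum(components_w.values())
    margin = 1.5   # ~50% headroom is a common rule of thumb
    print(f"Worst-case total ~{total} W; shop for roughly {total * margin:.0f} W")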


ziqueenmab September 29 2007, 06:39:22 UTC
Poking around...without appropriate probe points is generally unwise.

Obligatory baw-mah-now.


sithjawa September 29 2007, 18:15:41 UTC
The problem with that plan is that most power supply vendors don't advertise their power ratings as clearly as they should, and some outright lie. Apparently it's really costly to make the higher-voltage lines not suck, so a lot of companies like to make a power supply with a really beefy 5-volt line and then go "Look, a 500 watt power supply! You should buy it!" when it won't power jack in a modern computer.

So that's easier said than done.
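
To illustrate the rail-labeling trick with invented numbers (these ratings are hypothetical, not pulled from any real unit):

    # A big number on the box can hide a weak 12 V rail. Invented ratings.
    rails = {3.3: 30.0, 5.0: 40.0, 12.0: 18.0}   # volts -> max amps (hypothetical)
    for volts, amps in rails.items():
        print(f"{volts:>4} V rail: {volts * amps:.0f} W max")
    total = sum(volts * amps for volts, amps in rails.items())
    print(f"Sum of rails: {total:.0f} W, but a modern CPU and graphics card")
    print("pull mostly from the 12 V rail, which here tops out around 216 W.")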

Last night I uninstalled a 480-watt power supply and installed my old 350-watt power supply, because the 480-watt power supply was underpowering the graphics card. So much WTF. WTF, power supply vendor, WTF?


sithjawa September 29 2007, 18:17:17 UTC
(I knew it was underpowering the graphics card because it had done the same thing on another computer with a beefier graphics card, so when my computer crashed while playing a video and then completely refused to send a picture to the monitor no matter how many times I power-cycled it, I thought 'I wonder what would happen if I switched power supplies?' It worked. I boggled.)



