I have a safety question for the electronically-minded among you; I dearly want to go through with a particular test (as I'm hungry for knowledge of the world), but I have a strong suspicion that I might set a multimeter on fire in the process.
Having lost several pieces of expensive computer hardware to an overtaxed power supply, and knowing other people who have suffered similar losses, I am somewhat paranoid when it comes to power supply problems. I've spent a good chunk of time putting together computers and diagnosing hardware problems, but the subtleties of power supply failures hit me in a serious blind spot. It's not enough to know that a power supply is working, apparently--you need to know whether it can support the power draw of your system, or whether it will start undervolting under load and potentially damage any or all of your computer's components.
The best way around this problem, it seems, is to laboriously calculate the amperage requirements of each piece of hardware in your computer on each rail (5V, 12V, etc.), sum up the values, and double-check that your power supply is rated for the required current. I've had difficulties tracking down those numbers, and--I'll admit it--I'm looking for a lazy alternative. I want to be able to determine, as problems are happening, that the power supply is undervolting my computer's components.
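(For what it's worth, here's a rough Python sketch of the bookkeeping I'm trying to avoid. Every component name and amp figure in it is invented for illustration--the real numbers would have to come from each part's spec sheet and the power supply's label.)

```python
# Hypothetical per-rail current budget. All component names and amp values
# below are made-up examples, not real specs.

# (component, rail, amps)
draws = [
    ("CPU",            "12V", 9.0),
    ("GPU",            "12V", 8.5),
    ("Motherboard",    "5V",  3.0),
    ("Hard drives x2", "12V", 1.2),
    ("Hard drives x2", "5V",  1.4),
    ("Fans",           "12V", 0.6),
]

# What the PSU label claims it can deliver per rail (example numbers).
psu_rating = {"12V": 18.0, "5V": 20.0}

# Sum the draw on each rail and compare against the rating.
totals = {}
for component, rail, amps in draws:
    totals[rail] = totals.get(rail, 0.0) + amps

for rail, total in totals.items():
    verdict = "OK" if total <= psu_rating[rail] else "OVER BUDGET"
    print(f"{rail}: {total:.1f} A drawn vs {psu_rating[rail]:.1f} A rated -> {verdict}")
```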
The Internet says that software voltage readings are usually inaccurate, but that you can easily test the voltage readings of an idle power supply with a power supply tester or a multimeter. (The power supply tester is really nothing more than a very specialized multimeter strapped to a paper clip that shorts the PS_ON pin--the "I'm connected to a motherboard" signal--to ground on the power supply plug, so the supply powers up without a motherboard.) I've done this myself and confirmed that it works.
The problem, however, is that the power supply that undervolted my machine passes this idle test just fine. Well, of course it does--the power supply's just fine when idle! It only chokes when you overstress it with top-of-the-line hardware... what I really need is a way to test the voltage on the power supply's connectors while it's under heavy load, since I could then diagnose "strange performance issues" by hooking up my test rig and seeing if the voltage readings were below tolerance.
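(If I ever do get readings under load, the check itself is trivial--here's a sketch. The ±5% band is my understanding of the ATX spec's tolerance for the main positive rails, so treat the exact limits as an assumption on my part.)

```python
# Sanity check for +12V readings taken under load.
# The +/-5% tolerance is assumed from the ATX spec; verify before trusting it.

NOMINAL_12V = 12.0
TOLERANCE = 0.05  # assumed ATX tolerance for the +12V rail

low = NOMINAL_12V * (1 - TOLERANCE)   # 11.40 V
high = NOMINAL_12V * (1 + TOLERANCE)  # 12.60 V

def within_tolerance(reading_volts: float) -> bool:
    """True if a +12V reading falls inside the assumed 11.40-12.60 V band."""
    return low <= reading_volts <= high

# Example: a healthy idle reading vs. a sagging under-load reading (made-up numbers)
for reading in (12.1, 11.2):
    print(f"{reading:.2f} V -> {'OK' if within_tolerance(reading) else 'out of tolerance'}")
```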
Here's the thing, though: I've yet to find a single internet article stating unambiguously that one can safely measure voltage with a multimeter on a PC that's drawing 10-25 A. I've seen people talk about measuring voltage drops over car batteries with $20 multimeters, but I'm not confident in any of the sources. My $20 multimeter is rated--when measuring current--for only 10 A, and I've yet to find a multimeter for sale on the internet with a higher current rating. Are there specially-rated multimeters that can handle the job, or do multimeters operate differently enough when measuring voltage (compared to current) to be safe with 20+ A?
A physicist coworker of mine warns me that I may very well slag my multimeter by trying to measure the voltage, but that I might be able to preserve it by adding resistors in series with the meter. Of course, this would affect my voltage readings. The Anonymous Internet is failing to give me a satisfactory answer, so I turn at last to you, dear friends. :)
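(To put numbers on the "this would affect my readings" part: if I assume a typical 10 megohm input impedance for a cheap meter in voltage mode--an assumption, I haven't measured mine--a series resistor forms a voltage divider with it, roughly like this.)

```python
# Voltage-divider arithmetic for my coworker's series-resistor idea.
# The 10 Mohm input impedance is a typical figure assumed for a cheap DMM
# in voltage mode, and the 1 Mohm series resistor is just an example value.

R_METER = 10e6   # assumed meter input impedance in voltage mode (ohms)
R_SERIES = 1e6   # hypothetical series resistor (ohms)
V_SOURCE = 12.0  # nominal +12V rail

# The series resistor and the meter's input impedance divide the voltage,
# so the meter reads less than what's actually on the rail.
v_reading = V_SOURCE * R_METER / (R_METER + R_SERIES)
scale = (R_METER + R_SERIES) / R_METER  # factor to correct readings back up

print(f"Meter would read {v_reading:.2f} V instead of {V_SOURCE:.2f} V")
print(f"(multiply readings by {scale:.3f} to correct for the divider)")
```

So, under those assumed values, a healthy 12 V rail would read about 10.9 V and I'd have to correct every reading by hand--doable, but not exactly the lazy alternative I was hoping for.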
So, my question is: if I were to turn on my PC, put a heavy load on the power supply with a good 3D benchmarking test (so that it was most likely drawing ~20 A or so of current), and then attempt to read the voltage across the 12V line of one of the power supply's power connectors with a multimeter rated for 600 V (but only 10 A when reading current), would I break the multimeter?
If so, what could I do instead to test the voltage in a live, heavy-load situation? Or should I just give up and track down the amperage requirements on every piece of computer hardware I own? :P