Instructions
Disconnect the positive terminal of the power supply from the circuit. Connect the positive lead of the ammeter to the positive terminal of the power supply, and connect the negative lead of the ammeter to the point in the circuit where that terminal was attached, so the meter is in series with the circuit. If you are using a multimeter, be sure to plug the leads into the current sockets rather than the voltage sockets.
Set the meter to its maximum scale. Different meters have different selectable ranges, so the maximum could be something like 120 milliamps, 3 amps, or 10 amps, depending on how the particular meter is designed.
Turn the selector knob to progressively lower scales until the meter is in the most accurate range for the measured current. If you're using a digital meter, this will be the range that gives the greatest number of significant digits; for example, a reading of 21.46 is more precise than 0.0215. For an analog meter, the most accurate reading comes when the needle deflection is as large as possible while still staying on the scale.
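The rule being applied here is: pick the lowest range that is still above the current you are measuring. The short Python sketch below illustrates the idea with a hypothetical set of ranges (the range values and function name are made up for illustration, not taken from any particular meter):

    # Hypothetical selectable ranges, in milliamps.
    HYPOTHETICAL_RANGES_MA = [10000, 2000, 200, 20, 2]

    def best_range(current_ma, ranges=HYPOTHETICAL_RANGES_MA):
        """Return the lowest range that still exceeds the measured current."""
        usable = [r for r in ranges if r > current_ma]
        if not usable:
            raise ValueError("current exceeds the meter's highest range")
        return min(usable)

    print(best_range(21.5))  # -> 200: about 21.5 mA reads best on the 200 mA range
    print(best_range(1.2))   # -> 2: about 1.2 mA reads best on the 2 mA range

The lowest usable range spreads the reading across the most display digits (or the most needle travel), which is why you step downward until just before the meter overranges.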
Note the selected range. The range setting is the maximum value the meter can display in that position, so it tells you how to interpret the panel reading.
Read the display. For a digital meter, this is simply a matter of reading the numbers; the location of the decimal point is determined by the range selector. If the setting is 20 milliamps, the decimal point will separate the "1 milliamp" digit from the "tenth of a milliamp" digit. For an analog meter, there will be several scales printed on the dial; read the one that corresponds to the range selector. As with any analog display, the value is the number corresponding to the smallest division mark the needle has passed, plus a fractional part that you estimate.
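To make the analog arithmetic concrete, here is a short Python sketch; the scale layout and needle position are hypothetical numbers chosen only to illustrate the read-off:

    # Analog read-off: (marks fully passed + estimated fraction of the next
    # division) multiplied by the value of one division.
    full_scale_ma = 20.0       # range selector set to 20 mA
    divisions_on_scale = 50    # hypothetical number of division marks on that scale
    ma_per_division = full_scale_ma / divisions_on_scale   # 0.4 mA per mark

    divisions_passed = 31      # smallest mark the needle has clearly passed
    estimated_fraction = 0.5   # eyeball estimate into the next division

    reading_ma = (divisions_passed + estimated_fraction) * ma_per_division
    print(f"{reading_ma:.1f} mA")   # -> 12.6 mA

The same idea applies to any scale on the dial: work out what one division is worth for the selected range, then count the marks and estimate the remainder.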