# Electronics questions

I'm pretty bad with electronics... I'm more about putting things together I guess. But I need a little help:

I don't understand how a power source (battery) has amps. Isn't current (amps) a function of voltage and resistance through Ohm's law? Is it possible to blow something with too many amps like it is with volts?

While we're at it, what happens when you put a voltmeter behind a resistor? Where does that number come from?

Also, let's say I tried to power (among other things) a PICAXE with a 12V, 8 amp computer power supply. Could I get it down to the required 5 volts with resistors? I know I can use a 5V regulator, but I don't want to waste a lot of power as heat. Would the amps be too much for the PICAXE?

edit: alright, apparently power supplies usually have a 5V output. Still, will this give too many amps for a PICAXE?

thanks!

## Comments

So if I use a 5V regulator with a high-current input, will the current also be reduced to prevent destroying the PICAXE? Or what if I used the 5V output (very, very high current) from a power supply directly? Or will the PICAXE only pull the current it needs? Sorry if these are dumb questions, and thanks for the answers.
The PICAXE will only pull the current it needs.
okay, so it won't burn out. Thanks!

Maybe have a look at this, wayland:

http://www.eecs.tufts.edu/~dsculley/tutorial/

Short, simple and straightforward tutorials with a few pracs you can do. They don't cover everything, but I found them interesting.

Might help.

A resistor network will waste at least as much power as a voltage regulator, and as Grue points out it cannot provide an output that is as stable.

A battery or other power source that generates a particular voltage will deliver that entire voltage to the load circuit. Current, on the other hand, is drawn from the supply by the load: for purely Ohmic components it follows Ohm's law, and for everything else it is simply whatever the load pulls at the supplied voltage. The source will supply as much current as the load demands, up to its rated maximum, at which point it will start to struggle and you'll typically see a 'sag' in the delivered voltage.
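A quick sketch of that point in Python, using made-up illustrative values: the load, not the source, determines the current, and the supply's amp rating is only a ceiling.

```python
# The load sets the current; the supply rating is a maximum, not a push.
V_SUPPLY = 12.0   # volts delivered by the source
R_LOAD = 48.0     # ohms, a purely resistive (Ohmic) load, made up for illustration

current = V_SUPPLY / R_LOAD   # Ohm's law: I = V / R
power = V_SUPPLY * current    # P = V * I

print(f"current drawn: {current:.2f} A")  # 0.25 A
print(f"power drawn:   {power:.1f} W")    # 3.0 W
# An 8 A rated supply is fine here: only 0.25 A is actually drawn.
```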

Batteries are rated in ampere-hours (A·h), a unit of electrical charge, not "amps per hour". Let's say a battery is marked 12V / 1.5 Ah. This means a connected circuit that draws 1.5A at the rated voltage will run for 1 hour before the battery is empty. If the circuit draws only 0.75A, it will run for 2 hours, and so on.

By the way, every battery also has a resistance of its own, called internal resistance.

Power supply outputs are not rated in Ah. They are rated in amperes, for example 20V DC, 4.5A. This means the power supply can continuously power a circuit that draws 4.5A at the rated voltage.

You can power your PICAXE from a 12V, 8A computer power supply, but you need a voltage regulator to lower the voltage from 12V to 5V. In principle you can use a voltage divider made from resistors, and you will find such dividers in many circuit diagrams, but they are only recommended for very simple, low-current applications. A voltage regulator (for example a 7805) is a much better solution in your case.
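To see why a plain resistive divider is a poor 5V supply, here is a small sketch (resistor values are hypothetical, chosen only so the unloaded output is 5V): as soon as a load is attached in parallel with the bottom resistor, the output sags.

```python
# Why a resistive divider is a poor 5 V supply: output depends on the load.
V_IN = 12.0
R1, R2 = 1400.0, 1000.0  # hypothetical divider designed for 5 V unloaded

def divider_out(r_load=None):
    """Divider output voltage; r_load (ohms) sits in parallel with R2."""
    r_bottom = R2 if r_load is None else (R2 * r_load) / (R2 + r_load)
    return V_IN * r_bottom / (R1 + r_bottom)

print(f"unloaded:   {divider_out():.2f} V")      # 5.00 V
print(f"1 kOhm load: {divider_out(1000):.2f} V")  # sags well below 5 V
```

A regulator avoids exactly this problem by actively holding the output voltage as the load varies.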

Let's see if I can explain simply [and remember high-school electronics at the same time]. Voltage is the electrical potential of a battery; amps measure the current, the flow of charge, in a circuit. Batteries are rated in amp-hours or milliamp-hours, which tells you how much charge they can deliver.

Too much current can fry a component just like too much voltage can. Most parts are rated by how much current [amps or milliamps] they can handle without burning out.

To answer your voltmeter-behind-a-resistor question: each component in a circuit drops some voltage. In practice, if you have a white LED and a resistor in series, each one will drop part of the supply voltage. Start with a 9V battery. The white LED needs about 3.6V to run, so subtract 3.6V from 9V and you have 5.4V across the resistor. The current the LED needs is at most 30 milliamps, or 0.03 amps. So with Ohm's law you take the voltage across the resistor and divide it by the current the circuit needs to get the resistor value: 5.4V / 0.03A = 180 ohms. The LED will drop roughly that same 3.6V in any circuit you put it in.
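The same series-resistor calculation, written out as a snippet (using the values from the example above):

```python
# Series resistor for an LED: R = (V_supply - V_led) / I_led
V_SUPPLY = 9.0   # volts, battery
V_LED = 3.6      # forward voltage of the white LED
I_LED = 0.030    # 30 mA target current

v_resistor = V_SUPPLY - V_LED      # voltage the resistor must drop
r_needed = v_resistor / I_LED      # Ohm's law: R = V / I

print(f"resistor drops {v_resistor:.1f} V, need R = {r_needed:.0f} ohms")
```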

Now your resistor-or-regulator question: yes, it is possible to use resistors, but it will not work reliably. The reason is simply that an MCU does not draw a steady current. The draw depends on a lot of variables, like how many things [sensors etc.] are hooked up to the MCU and how many are drawing current from it. A static resistor network can drop voltage, yes, but as the MCU draws more current the voltage will drop too far and you get a brown-out and a reset. If you size the resistance too small, you could feed too much voltage to the MCU and fry it. A regulator adjusts how much current it passes in order to keep the output voltage constant, without you having to do lots of math and testing of voltage-dropping resistors. Think of it as a switching power supply on a chip, or like a dam that opens and closes to let more or less water through in order to maintain a constant water level no matter how many drains are opened or closed.

Hope I haven't confused you too much and I hope my memory serves me well on the details above...LOL At least I haven't fried my robot...Yet