# Batteries Series Parallel

A few questions to make sure I don't burn something up. I am playing around with different batteries and ways to make my bots last longer between charges. I want to ensure I have this right before I hook anything up.

First off, let's assume I have 5 volts hooked up to an LED (max rated current of 30 milliamps) using a 150 ohm resistor (5 volts divided by .03 amps = 166 ohms) to keep the LED from burning out.

Series: Let's say I hook up a second 5 volt battery pack in series, making the total voltage 10 volts.

Assumption: The LED will burn out due to the increased voltage. I need to use a 330 ohm resistor to keep it from burning out (10 volts divided by .03 amps = 333 ohms).

Parallel: Let's say I hook up a second 5 volt battery pack in parallel, making the total voltage still 5 but doubling the amperage available (and also how long the LED will stay lit before the batteries can no longer keep it on).

Assumption: No change required; the 150 ohm resistor will keep the LED safe, and it will stay lit approximately twice as long as in the first scenario with the single 5 volt battery.
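For reference, the sizing above is plain Ohm's law with the LED treated as a zero-volt component. A minimal Python sketch of that arithmetic (the function name is my own):

```python
def naive_resistor(supply_v, max_current_a):
    """Simple Ohm's-law sizing, R = V / I, ignoring the LED's own
    voltage drop (the replies below explain why that matters)."""
    return supply_v / max_current_a

print(naive_resistor(5, 0.03))   # ~167 ohms for the single 5 V pack
print(naive_resistor(10, 0.03))  # ~333 ohms for two packs in series
```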


First of all, you will not burn your leds with these resistors, if indeed your leds can take .03 A (30 mA) continuously.

Better yet: you are being too careful in your calculations. And since you are planning to change from 5 V to 10 V, you should understand this.

The current through the led is the same as the current through the resistor, you got that right. BUT the tension (voltage) is NOT!

As the electrons race around your circuit, they will lose energy with each obstacle they meet. You will see the voltage (compared to ground) drop after each obstacle.

Some obstacles are harder to negotiate than others. For example, a 10 kilo-ohm resistor is harder to overcome than a 100 ohm resistor. The funny thing with leds is that, even when you do not know how difficult an obstacle one is, measured in ohms, you can still tell how much energy is lost in that particular obstacle. Just measure the voltage right after the led and compare that with the voltage right before. Let your voltmeter do the subtraction: put one test probe before and the second just after the led.

The voltage that you are reading (across the led) is called its forward voltage and it is a fixed property of your component. The datasheet ought to document this forward voltage. For a coloured led it is typically around 2.0 V.

For your circuit this means that the 2 V your led is "consuming" cannot be "consumed" by the remaining components in series with it (in your case just the 150 ohm resistor). The resistor will only "see" 5 - 2 = 3 V. Again: check with your meter.

Now do Mr Georg Ohm's math again: 3 V / 0.03 A = 100 Ohm. That's a big difference. With a 150 Ohm resistor you would get a 3 V / 150 Ohm = 0.02 A current. Only 66% of the max you were shooting for.

Repeat the whole story with a 10 V battery: (10 V - 2 V) / 0.03 A = 267 Ohm. If you were to use the resistor of 333 Ohm, the current in the circuit would be 8 V / 333 Ohm =  .024 A. It's getting closer to .03 A. But you still did not burn anything 8-(
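The corrected arithmetic (supply voltage minus the roughly 2 V forward voltage, then Ohm's law) as a small Python sketch; the helper names are mine:

```python
LED_FORWARD_V = 2.0  # typical for a coloured led; check your datasheet

def led_resistor(supply_v, target_current_a):
    """Resistor needed after subtracting the led's forward voltage."""
    return (supply_v - LED_FORWARD_V) / target_current_a

def led_current(supply_v, resistor_ohm):
    """Actual current through a given series resistor."""
    return (supply_v - LED_FORWARD_V) / resistor_ohm

print(led_resistor(5, 0.03))   # 100 ohms for the 5 V pack
print(led_resistor(10, 0.03))  # ~267 ohms for 10 V
print(led_current(5, 150))     # 0.02 A with the original 150 ohm resistor
print(led_current(10, 333))    # ~0.024 A with a 333 ohm resistor
```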

You are correct in assuming that twice the number of batteries will last longer. But I shall not make any promises about how much longer.

Batteries are funny stupid things. Check the power pages on LMR. One of them tells me that a single battery (we should say "cell") lasts longer (measured in power delivered to the user) when you "draw" smaller currents from it. In other words: if you want to suck every last Joule of energy from your cell, you should suck very, very, very slowly. Very.

A battery pack is the same as a bunch of cells in series. This, again, means that the current through each cell is the same. The voltage over each cell is probably also the same, but that is just the result of you using similar cells in one pack. A wise choice btw.

As the discharge progresses, that voltage per cell might start to differ a little. Small differences between them (age, accidents, damage, charge status to begin with) will start to play a role. One of them will "bottom out" first. Another one last. But your 5 V will be gone as soon as the first one craps out. Game over if your application absolutely demands a minimum voltage of 5 V. Your led is not as picky as your cell phone.

When you draw a relatively small current from two batteries in parallel, you can no longer guarantee that the current through each battery is the same. And the discharge of each battery will not be perfectly balanced as it was with the cells in series. But when one battery ("pack of cells") is empty, the second one might still be able to provide nourishment to your led. But consider this: it is also providing nourishment to the empty battery. A battery is still a conducting material, you know!

When it is longevity you are after, consider buying bigger (fatter) cells/batteries. Read the power pages about "Ah": ampere-hours, the unit of battery capacity.
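As a rough sketch of what Ah buys you (the numbers are made up, and real cells deliver less at higher currents, as the power pages explain):

```python
def runtime_hours(capacity_ah, load_current_a):
    """Optimistic runtime estimate: capacity divided by load current."""
    return capacity_ah / load_current_a

# A hypothetical 2.0 Ah (2000 mAh) pack feeding the 0.02 A led circuit:
print(runtime_hours(2.0, 0.02))      # 100 hours, at best
# Two such packs in parallel double the capacity:
print(runtime_hours(2 * 2.0, 0.02))  # 200 hours, at best
```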

I thought (or was told long ago) that amps (current) could only be taken, not given. I.e. if you double the Ah by doubling a pack in parallel, you allow twice the amount of current to be sucked out, and it is just that - you CAN get twice the amps, or the same number of amps for twice the time. However, if the load draws, say, one amp, and you double the "available amps", the batteries won't "shove" extra current through the device. Again, current can be taken, not given.

Is this even close to correct (if you translate my crappy lingo)?

That is the exact LMR speak that we need in here.

jklug, read Chris' post. So much shorter. And a little bit sweeter.

That's true for a voltage source - use a current source and you can push as much current around the circuit as you want!

niall:

This is done by upping the voltage, so it's still the same situation, only sorta backwards.
I always thought the notion of a current source was an unnecessary confusion.
They're probably a practical tool for synthesis in some situations, but don't forget that it's still the voltage pushing a current through a resistance - alternatively a resistance pulling a current from a voltage.

yes, but with a voltage source, the current is limited by the load - with a current source it is not limited by the source.

Oddbot:

Current sources are useful when powering things such as LEDs - you want them all to run at a constant brightness, which is governed by the current. Put two in parallel with a voltage source and, due to discrepancies in the manufacturing of the LEDs, they may not have exactly the same voltage drop across them. This would mean that the brightness would vary. Drive them from a current source and you are forcing the same current through them regardless of the number of LEDs in series. Obviously you are then limited by the maximum voltage it can supply, much like you are limited by the maximum current a voltage source can supply.
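To put numbers on that: with a voltage source and one resistor per led, even a small spread in forward voltage shifts the current (and so the brightness). The forward voltages below are made-up examples:

```python
def led_current(supply_v, forward_v, resistor_ohm):
    """Current set by a voltage source through a series resistor."""
    return (supply_v - forward_v) / resistor_ohm

i1 = led_current(5, 1.9, 150)  # led with Vf = 1.9 V
i2 = led_current(5, 2.1, 150)  # led with Vf = 2.1 V
print(i1, i2)  # ~0.0207 A vs ~0.0193 A: roughly a 7% mismatch
# A current source would force the same current through both,
# regardless of their individual forward voltages.
```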

I totally agree, but I still find it simpler to view them as a variable voltage source that keeps the current constant, put in a black box called a current source.

Or, after reading that sentence over again, maybe not simpler, but certainly closer to a physical understanding of what's going on.

A valuable tool for synthesis and analysis, which in the worst case can lead to some misunderstanding in the electra-newbs mind on the workings of an electrical circuit.

Hmm, what am I talking about? I guess the bottom line is I never liked them much.

I dunno man,

My dust collector for my shop pulls almost a full 20 amps at 110-120 VAC. I rewired it for 220 V and it draws 10 amps. Double the voltage, half the current. Is this an AC/DC thing?
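That's just the power staying the same: P = V x I, so for a fixed-power load, doubling the voltage halves the current. A quick sketch with round numbers (240 V used as the nominal figure; actual mains varies):

```python
def current_for_power(power_w, voltage_v):
    """Current a fixed-power load draws at a given supply voltage."""
    return power_w / voltage_v

power = 120 * 20  # ~2400 W dust collector: 20 A at 120 VAC
print(current_for_power(power, 120))  # 20 A
print(current_for_power(power, 240))  # 10 A - double the voltage, half the current
```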

I forgot about the re-wire. Yes, when I changed the tool from 110 to 220, I did have to change some wires going to the motor. Windings, yadda yadda.

At any rate, the all things being equal thing is right on and I sorta missed the point on this one.

I'm sticking with: you can suck amps, not shove them.