Theoretical mods that could fix premature range loss from weak cells WITHOUT replacing batteries [Active-balancer during driving]


Infinion

Well-known member
Joined
Aug 20, 2020
Messages
141
Location
Burnaby, BC
[embedded video: two time-stamped snips]

Prof. John Kelly goes over the battery construction and cell measurement/balancing in the Chevy Volt, which has a similar architecture to the Spark EV.

What does the BMS do?


The BECM (Battery Energy Control Module) monitors the state of the 96S2P (96 series, 2 parallel) cells in the 2015/2016 Spark EV. The 2014 has a slightly different cell configuration, 112S3P (112 series, 3 parallel).

How does it balance cells and how much power can it handle?


The only issue with this BECM design is its cell-balancing strategy. When the Spark EV fully charges to 100%, the BECM will top-balance all the cells by discharging them across up to 96 groups of shunt resistors until the lowest cells reach the average cell voltage. While charging off L1/L2 EVSEs or DCFC, the charging power tapers to something in the ballpark of 2-3 kW or below. Dividing that power across the groups, each group must dissipate up to 3000 W / 96 groups ≈ 31 W of charging power.

Using the electrical power formula [P = IV], where P is power, I is current in amps, and V is cell voltage, and solving for current: I = P/V -> I = 31 W / 4.1 V -> I ≈ 7.6 A of current for a 96-group balancer.

So we can infer that the BECM's balancing circuit would need to be rated for around 8 A of DC current across each of the 96 / 112 cell groups.
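For anyone who wants to sanity-check the arithmetic, here it is as a few lines of Python. This is a sketch of the worst case: the ~3 kW taper figure and the assumption that the balancer shunts the full per-group share of charging power are mine, per the ballpark above.

```python
# Worst-case per-group balancing load, assuming the balancer must shunt
# the full per-group share of a ~3 kW taper-region charging power.
CHARGE_POWER_W = 3000.0  # assumed taper-region charging power
GROUPS = 96              # 2015/2016 pack; use 112 for the 2014 pack
CELL_V = 4.1             # near-full cell group voltage

power_per_group = CHARGE_POWER_W / GROUPS     # ~31 W
current_per_group = power_per_group / CELL_V  # I = P/V, ~7.6 A
print(f"{power_per_group:.1f} W, {current_per_group:.1f} A per group")
```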

What's the problem with weak cells?

Say you have a Spark EV that originally had 18.2 kWh of capacity with ~90% usable depth of discharge. After 10 years, your capacity has dropped to around 14 kWh usable, roughly 23% degradation, or 0.146 kWh / 42 amp-hours per cell group. However, when you drive to 90% depth of discharge, the car quickly shuts down with "propulsion power reduced". It turns out only two of your cells have degraded to 146 Wh; the rest are actually much healthier at, say, 167 Wh. However, the BECM detected the weak cells dropping below 3 V and reduced power for the entire battery. But that's not enough, and the car shuts down after only a few miles as the weak cells cross the 2.5 V cutoff.

In other words, your EV goes into turtle mode and shuts down early because your weakest cells need to be protected, and they restrict the whole pack's depth of discharge.

Whenever your Spark EV top-balances, the weak cells will throw away up to 31 W / ~8 A of charging power so they don't become overcharged. Since the capacity difference in our theoretical Spark EV is 167 Wh - 146 Wh = 21 Wh, the car spends about 40 minutes balancing on L2 and around 1h40m on L1.
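To show where those times come from, here's a quick sketch. The ~3 kW L2 and ~1.2 kW L1 charging powers are assumptions, as is burning the full per-group share in the shunt.

```python
# Time for the 21 Wh weak-cell delta to be burned off at the per-group
# share of charging power. L2/L1 input powers are assumptions.
delta_wh = 167 - 146        # Wh gap between healthy and weak cell groups
w_per_group_l2 = 3000 / 96  # ~31 W per group on L2 (assumed 3 kW)
w_per_group_l1 = 1200 / 96  # ~12.5 W per group on L1 (assumed 1.2 kW)

print(f"L2: {delta_wh / w_per_group_l2 * 60:.0f} min")  # ~40 min
print(f"L1: {delta_wh / w_per_group_l1 * 60:.0f} min")  # ~100 min
```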

And So?

And so the BECM protects the weak cells from being overcharged, but your battery pack behaves as if it had 96 weak cells of the smaller capacity, even though the healthier cells hold more stored energy that goes unusable.


What's the solution?

The solution is to add an active bottom balancer that charges the weak cells by siphoning energy from the strong ones.

How could it be done?


It would use the same balancing lines the BECM's top balancer uses during charging. In the video above, there are service connectors for battery testing that have pins that connect to each of the 16 cell terminals on top of each battery module.



In the service manual, the connectors I am referring to are X5 - X7, X9, X10, and X13. These are male connectors; the matching female connector that mates with them is https://www.aptiv.com/en/solutions/connection-systems/catalog/item?language=en&id=15345477_en
Once a minimum cell imbalance is detected in a weak cell, a 16-cell active balancer would operate, transferring energy from the stronger cells in the group at <30 W per cell group while the Spark EV is in operation, to compensate for the difference in capacity.

There would need to be one such balancer per battery module, six in total for the 96S pack. This would effectively maximize the usable capacity and depth of discharge of the pack, and extend the service life of the vehicle: a top balancer to protect cell groups from overcharging, and a bottom balancer to protect against overdischarge, which the HPCM2 previously achieved by reducing power and shutting the vehicle down early.
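As a thought experiment, the trigger logic for such an add-on balancer might look like the sketch below. The 20 mV start / 5 mV stop thresholds are my assumptions for illustration, not vendor or OEM values.

```python
# Toy hysteresis logic for the proposed add-on bottom balancer: start
# once the weakest cell sags a set margin below the group average, stop
# once it has nearly caught up. Thresholds are assumptions.
def balancer_should_run(cells_v, running, start_mv=20, stop_mv=5):
    avg = sum(cells_v) / len(cells_v)
    sag_mv = (avg - min(cells_v)) * 1000
    return sag_mv > (stop_mv if running else start_mv)

# One weak cell sagging ~28 mV below its 16-cell group average:
print(balancer_should_run([4.05] * 15 + [4.02], running=False))  # True
```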

Here's one ready-made solution for exactly this purpose:
https://100balancestore.com/collect...tive-balancer-module-for-lithium-battery-pack
 
So, just to clarify a few points. The BMS in the Spark only balances the cells when it top charges? Or only while charging in general? It doesn't balance the cells while driving?
The service plugs you're referring to are connected to the same lines as the BMS, but they're separate plugs not normally used?
I'm loving this idea, and it would be something not too difficult to add while rebuilding a battery pack.
 
A few further questions I thought of:
Would these add-on BMS modules be active all the time, and would this cause a drain if the car sits?
Would the add-on BMS and factory BMS sharing the same circuits end up causing them to fight? They should both be doing the same thing a lot of the time, but I don't know much about the factory BMS.
 
So, just to clarify a few points. The BMS in the Spark only balances the cells when it top charges? Or only while charging in general? It doesn't balance the cells while driving?
The service plugs you're referring to are connected to the same lines as the BMS, but they're separate plugs not normally used?
I'm loving this idea, and it would be something not too difficult to add while rebuilding a battery pack.
Correct. The BMS presumably uses passive balancing with a switched shunt resistor for each of the 96 / 112 groups (simple and cheap), not transfer between cells, so the only option is to balance while charging to 100%. I would hazard a guess that the reason EVs don't balance with their passive BMS while driving is that it would throw away precious energy.

The service plugs you're referring to are connected to the same lines as the BMS, but they're separate plugs not normally used?
Wait a minute... Ahhh, you're right, there should be a connector indicated in the circuit diagram if it were separate. After reviewing the battery footage, it looks like the Spark is definitely using this connection as part of the balancing harness.
It's not a dealbreaker, but it adds more complexity ... a breakout PCB or an inline Y-splitter harness would need to be designed with the same female and male OEM connectors, and judging by the shape, it would need to be a proper low-profile fit.


Would these add-on BMS modules be active all the time, and would this cause a drain if the car sits?
From what I've gleaned, the modules communicate via Bluetooth and can be programmed to turn on and off at custom voltage ranges. The 100balance / Daly units are on all the time, drawing 1 mA while active and 100 uA in sleep. So at a ~0.000002C discharge rate, it is slower than the intrinsic self-discharge rate of li-ion.
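For scale, here's that drain worked out against the 42 Ah cell group from the example earlier in the thread (a sketch using the vendor's quoted sleep current):

```python
# Parasitic drain of an always-on balancer module in sleep, relative
# to a ~42 Ah cell group.
sleep_a = 100e-6  # 100 uA sleep draw (vendor figure)
group_ah = 42.0

c_rate = sleep_a / group_ah                # ~2.4e-6 C
hours_per_pct = group_ah * 0.01 / sleep_a  # hours to drain 1% of a group
print(f"{c_rate:.1e} C, ~{hours_per_pct / 24:.0f} days per 1% of a group")
```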

Would the add-on BMS and factory BMS sharing the same circuits end up causing them to fight?
Definitely possible, but since the factory BMS's top-balancer only runs at 100% SOC while charging, it wouldn't overlap with the active balancer as long as the user doesn't intentionally set the active-balancing start voltage in the 4-4.15 V region, or set the cell-difference threshold too low. There are some YT videos online showing the balancer in action on LFP cells.
 
Still liking the idea, but a potential alternate option might be a more advanced 96/112s BMS wired as a piggyback to the factory BMS. If an adapter harness is needed anyway, one big one might be easier than 6 small ones. This would also work for both maximizing the factory battery, and building one with newer higher capacity cells.
 
Still liking the idea, but a potential alternate option might be a more advanced 96/112s BMS wired as a piggyback to the factory BMS. If an adapter harness is needed anyway, one big one might be easier than 6 small ones. This would also work for both maximizing the factory battery, and building one with newer higher capacity cells.
I'll keep my eyes open for something like that, but I have a feeling that rather than a highly integrated, centralized unit, the more typical solution will be single independent units, or a stack of 16-cell-or-greater units that communicate via serial data. https://www.thunderstruck-ev.com/bms-controller.html

There are more advantages to piggybacking at the module rather than at the BMS: namely, you reduce the voltage drop across the cable length and achieve better resolution. Your system also won't potentially carry the current of two balancers across the same wires. So I'd choose to get the balancer as close as possible to the module itself to save on wire and complexity.

Quick voltage-drop calc: if the wire harness from a middle module to the BECM were 3 feet (x2 for the return wire) of 18 AWG, a balancing pair of module cells would see a 153 mV drop at 2 A and a 535 mV drop at 7 A. With 22 AWG, 386 mV / 1350 mV. Since the cells need to be balanced to within 20-40 mV, that voltage drop needs to be accounted for through software presets, or with a non-current-carrying Vsense wire for each current-carrying wire to measure the true Vdiff across cells.
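If anyone wants to re-run those numbers, here's a sketch. It assumes standard copper wire resistances and a 12 ft total current path (3 ft lead + 3 ft return at each of the two cell taps in the balancing pair), which lands within rounding of the figures above.

```python
# Reproducing the quick Vdrop estimate. Assumed current path: 3 ft lead
# + 3 ft return per tap, two taps per balancing pair -> 12 ft of copper.
R_PER_FT = {"18 AWG": 6.385e-3, "22 AWG": 16.14e-3}  # ohms/ft, solid Cu
PATH_FT = 12

for gauge, r_ft in R_PER_FT.items():
    for amps in (2, 7):
        drop_mv = r_ft * PATH_FT * amps * 1000
        print(f"{gauge} @ {amps} A: {drop_mv:.0f} mV")
```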

edit: It just occurred to me that you could use the same wires to check voltage during PWM operation, while no current is flowing.

Good articles here:
https://www.digikey.com/en/articles...n-evs---part-i-passive-balancing-technologies
https://www.digikey.com/en/articles...n-evs---part-ii-active-balancing-technologies

https://www.monolithicpower.com/en/...d-power-systems/ev-battery-management-systems
 
I wasn't factoring in that the original BMS might be adjusted to compensate for the resistance of the leads, and it's a good point that the amperage may be too high with two BMSs working together. Also, the only 96S BMS units I was really finding are larger boxes. A simple Y-harness with that small 16S BMS would likely fit in the gaps between the modules.
 

Unfortunately I think there are some fundamental misunderstandings here, betrayed by this line in particular:
Whenever your Spark EV top-balances, the weak cells will throw away up to 31 W / ~8 A of charging power so they don't become overcharged.
"That's not how this works. That's not how any of this works!"

Balancing is *not* a critical function of every charge cycle. Balancing is, in fact, very rarely needed on a battery pack - and at a very slow rate.

I get it, lithium batteries are a new tech, and complicated. But they adhere to the laws of physics, particularly "energy is neither created nor destroyed". Cells are discharged and recharged in perfect unison when in series - that is, exactly the same current (amps) is drawn from (or charged into) each one at all times. Balancing is the act of taking one cell and discharging it (or, in active balancing, charging it) in isolation.

Maybe this notion of "balancing needs 31 watts / 8 amps" comes from believing that the cells must somehow "fall out of balance" when they are discharged, which is what makes the voltages look all disoriented and wonky on a dead battery - e.g. a low cell is wildly far out of "balance" of the others. But that is not "balancing" or "being out of balance". That is just a result of the cells having developed different capacities over time. When the same current (amps) is applied to all of them at the same time, the cells with a slightly lower capacity will reveal themselves to be dead (low voltage) first.

If you take that discharged pack with all its wildly varied voltages, and merely charge it - without any BMS doing any balancing whatsoever - the pack will (if it's been top-balanced recently) return to a near-perfect uniform balance at the top when it's charged. In fact, it'll appear mostly balanced shortly after you start charging it, as all the low cells come up from the "death cliff" (see the far left of the chart, below):
[attached chart: 1740386191755.png - li-ion discharge curve]

Balance is just needed to make up for minor variance in internal resistance of the cells. Following the "energy is neither created nor destroyed" idea, consider that with higher internal resistance on some cells versus others, some cells will heat more than others when they are discharged or charged (e.g. when fast charging or with perky acceleration). That additional heat (energy) produced (wasted) by the weaker cells will result in more energy (watts) being needed to recharge the cell, thus it falls slightly out of balance with the others. That minor deviation is all that needs to be compensated by the balancing of the BMS.
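To put a rough magnitude on "minor", here's a back-of-envelope sketch of that mechanism. The resistance mismatch, current, and drive time are all assumed values, not measurements:

```python
# Extra I^2*R heat in a higher-resistance cell vs. its neighbors, the
# drift mechanism described above. All inputs are assumptions.
delta_r_ohm = 0.002  # 2 mOhm of extra internal resistance on one cell
i_rms_a = 20.0       # assumed RMS pack current over a drive
drive_h = 1.0        # assumed drive time

extra_wh = i_rms_a**2 * delta_r_ohm * drive_h  # P = I^2 * R
print(f"~{extra_wh:.1f} Wh extra loss per drive")  # ~0.8 Wh of a ~160 Wh cell
```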

Now, as for using active balancing to compensate for a weak cell (and give it more effective capacity by discharging other cells into it while driving)? Tricky, tricky. In theory it could be possible, but in practice... has anyone ever implemented such a thing? It would be causing "imbalance" all throughout the drive cycle, and would have to almost constantly be re-balancing - especially when recharging (and could certainly cause problems when DC charging). Personally, I'd much rather hear about a replacement BMS that properly reacts to the low cell of a pack, and adjusts the usable range accordingly, instead of... completely bricking Sparks when they get too low.
 
FalconFour, first, welcome to the forums. I'm going to have a spat with you. Thank you for recently posting your insights and your thoughts regarding the Spark EV's BMS. I very much agree that the handling of weak cells is very poorly implemented, although it might truly be the fault of the Hybrid Powertrain Control Module 2 (HPCM2).

Unfortunately I think there are some fundamental misunderstandings here, betrayed by this line in particular:

Whenever your Spark EV top-balances, the weak cells will throw away up to 31 W / ~8 A of charging power so they don't become overcharged.
"That's not how this works. That's not how any of this works!"

Balancing is *not* a critical function of every charge cycle. Balancing is, in fact, very rarely needed on a battery pack - and at a very slow rate.
Lol, well feel free to personally verify this by observing the OBD II PIDs.

I admire your certainty on the matter, but I'm not sure what you're arguing here. Do you have a problem with the individual cell balancing power I calculated? And forget for a moment whether it balances every charge cycle (it does). When it top-balances, what do you think the balancing current is and what will you base those numbers on?


I get it, lithium batteries are a new tech, and complicated. But they adhere to the laws of physics, particularly "energy is neither created nor destroyed". Cells are discharged and recharged in perfect unison when in series - that is, exactly the same current (amps) is drawn from (or charged into) each one at all times. Balancing is the act of taking one cell and discharging it (or, in active balancing, charging it) in isolation.

Assuming I don't understand battery tech and am breaking energy conservation laws is a profoundly presumptuous thing to say. What violation is this based on?

Curiously, in the same paragraph, you seem to describe battery charging in ideal terms where cell capacities and internal resistances are perfectly matched. This is most certainly a gross oversimplification. You'd be describing idealized batteries. So how now do certain physical realities cease to exist? Exactly which laws can be broken and by whom?

Maybe this notion of "balancing needs 31 watts / 8 amps" comes from believing that the cells must somehow "fall out of balance" when they are discharged, which is what makes the voltages look all disoriented and wonky on a dead battery - e.g. a low cell is wildly far out of "balance" of the others


It was actually quite clearly outlined in my first post, if you read it. It was never about how much power balancing needed - rather, how much power the system must be designed to handle per cell, given a specific input power and observing the conservation-of-energy laws that you presumed I did not understand, along with battery tech. The subheading is below; you should read it.

It would be funny if the entire premise of your post calling me out for violating the 1st law of thermodynamics came down to some personal incredulity that prevented you from considering that such an efficient battery could need balancing when it fully charges - at all, any time, ever.


But that is not "balancing" or "being out of balance". That is just a result of the cells having developed different capacities over time. When the same current (amps) is applied to all of them at the same time, the cells with a slightly lower capacity will reveal themselves to be dead (low voltage) first.
Ok.
Actually, it's quite the contrary. Cells do fall out of balance during discharge and charge cycles. Your position that they don't confounds me. The reactions in li-ion cells are not perfect, and we're even discussing degraded cells here.

Balancing is the act of matching cell voltages through deliberate discharge of individual cells, so that charging and discharging can occur symmetrically across the pack; the act of balancing does extend to compensating for weak cells. Balancing can be applied to the 96 series groups simultaneously, and it ensures the usable capacity, power, and efficiency will all be maximized. If not for efficiency and power, a top-balancer exists to protect the weak cells from going overvoltage. That's one of the reasons GM top-balances, and why this thread also considers an added active bottom-balancer.

Being out of balance describes cells deviating in voltage. Besides capacity, real non-ideal batteries have dynamic resistance over their discharge curve. Here's an example from DOI:10.3390/en6105538



If your weak cell enters the high-resistance zones first, then by definition it will operate less efficiently for the same current, and continued discharging and charging will further exacerbate its departure from the rest of the pack's state of charge. Cells that have different capacities will reach undervoltage and overvoltage cutoffs at different times for the same current - I believe we agree on this point. During this time we get a larger variation between minimum and maximum cell voltage, which is where the term "out of balance" comes from. If this weren't a concern and li-ion stayed in perfect unison, there wouldn't be a market for li-ion BMSs. Everything would be perfectly matched, always and forever.

If you take that discharged pack with all its wildly varied voltages, and merely charge it - without any BMS doing any balancing whatsoever - the pack will (if it's been top-balanced recently) return to a near-perfect uniform balance at the top when it's charged. In fact, it'll appear mostly balanced shortly after you start charging it, as all the low cells come up from the "death cliff" (see the far left of the chart, below):

[attached chart: 1740386191755.png - li-ion discharge curve]
Yes, I agree, except voltage only roughly correlates with SOC, and you wouldn't look at this curve to estimate those widely varied voltages. You would want a table showing open-circuit voltage after the cell chemistries have settled into equilibrium, like the one below.

https://www.batterydesign.net/electrical/open-circuit-voltage/



Balance is just needed to make up for minor variance in internal resistance of the cells. Following the "energy is neither created nor destroyed" idea, consider that with higher internal resistance on some cells versus others, that means some cells will heat more than others when they are discharged or charged (e.g. when fast charging or with perky acceleration). That additional heat (energy) produced (wasted) by the weaker cells will result in more energy (watts) being needed to recharge the cell, thus it falls slightly out of balance with the others. That minor deviation is all that needs to be compensated by the balancing of the BMS.
Yes, I agree with most of that, and not only will degraded cells have a higher internal resistance, the dynamic resistance will amplify this difference if the degraded cell is not at the same SOC as the other series cells.

I'd like to hear how minor of a deviation you believe this causes, how quickly it balances, and for how long. Care to guess?



Now, as for using active balancing to compensate for a weak cell (and give it more effective capacity by discharging other cells into it while driving)? Tricky, tricky. In theory it could be possible, but in practice... has anyone ever implemented such a thing? It would be causing "imbalance" all throughout the drive cycle, and would have to almost constantly be re-balancing - especially when recharging (and could certainly cause problems when DC charging). Personally, I'd much rather hear about a replacement BMS that properly reacts to the low cell of a pack, and adjusts the usable range accordingly, instead of... completely bricking Sparks when they get too low.

I'm absolutely certain there is no drop-in replacement for the OEM BMS. Anything coming close would be entirely custom and would have to communicate over GMLAN. Too many unknowns and too much reverse engineering.
 
Well, respectfully, 🙇‍♂️ from one reasonably well-educated battery guru to another, let us spar then.

I admire your certainty on the matter, but I'm not sure what you're arguing here. Do you have a problem with the individual cell balancing power I calculated?
Absolutely. There's no way I would imagine they'd build 8 amps of discharge current and 30-odd watts of heat dissipation *per cell*, times *96 cells*, into the BMS board. That would be so magnificently inefficient, it would require liquid cooling for the BMS and would show as significant energy loss in the charging. I think those calculations suffer "abstracted math gone too long in a vacuum" - numbers obtained from some source that were put through too many seemingly-logical transformations to come to the conclusion. Instead, I suspect a balance current of 1 amp or so, periodically performed to maintain a top balance as needed.

How "as needed"?
What violation is this based on?
The violation I see is that balancing is needed only where the cell "leaks" energy outside the (logically) sealed system of the battery during its usage. That would come in the form of imbalanced cell resistance, but that's about it. Any other time, the cells will exhibit an "imbalance" at low state of charge, but that isn't a true imbalance - it's just the result of the cells having differing capacity. When recharged, as they walk their SOC% range to reach full charge together, they naturally come back into balance - no external balancing needed, except to recover from the minor imbalance that differing resistance carries.

That is to say, it doesn't make any sense to have 8 amps of balance capability, when it only needs to apply a balance while it's resting, after charging. It can take its merry time to work off whatever small imbalance may turn up.

However, the BECM detected the weak cells dropping below 3 V and reduced power for the entire battery. But that's not enough, and the car shuts down after only a few miles as the weak cells cross the 2.5 V cutoff.
This part in particular seems untested and strange to me. The fault level is in the service manual as 1.75v - that's the only "cutoff" the car seems to know/react to. And in my car today, testing it with the heater, I found the "propulsion power reduced" appears at 2.5v, which is pretty damn crazy (because 2.5v is far into the death curve - that cell is totally dead and the car *should* shut off, but it keeps going). From here, nerd to nerd, I admit I took a bit less credibility from the rest of the writing - it's so easy to see the behavior, it reads like an ideal but untested claim.

Besides capacity, real non-ideal batteries have dynamic resistance over their discharge curve. Here's an example from DOI:10.3390/en6105538
This is actually new to me. Cells have a higher resistance at both ends of SOC%? I knew at the low end - as to me, it seems like C-rate has a SOC% component to it (that is to say: it's more stressful to put a high load on a low-charged cell), but at the high end, I always thought the cell was at its absolute peak performance (taking high load would be "good" for it). I'll have to chew on this a bit...

Your position that they don't confounds me.
Never was a position that they don't fall out of balance - merely that they don't do it _significantly_. In fact, I've had several battery banks at home charging/discharging almost fully, every day, slowly (about 0.1 C-rate), with only protective BMS watching over them. The cells never drift - after 300+ cycles, the drift is so minor it only takes a few minutes for my iCharger X12 (with about 2-amp balance capability) to regain a perfect top balance - in 64Ah banks. That's my experience - cells don't drift in balance (in any significant way at least) unless they're abused. To that point, I also have a Storm2 Liquid battery bank (90Wh - made of 8x 18650 NMC cells, 2p/4s) that's been through about 700 cycles at ~>1C rate, with pretty high heat and stress... its cells end up drifting out of balance regularly and somewhat chronically, yet they didn't equip the BMS board with a balance circuit, so I have to manually balance it. I have experience at both ends.

Lol, well feel free to personally verify this by observing the OBD II PIDs.
Point me to how to obtain the data, and I'll gladly scoop it up. Currently I've been using the Car Scanner app to get data, and though it provides critically useful data, I imagine there's a lot it doesn't show.

I'm absolutely certain there is no drop-in replacement for the OEM BMS. Anything coming close would be entirely custom and would have to communicate over GMLAN. Too many unknowns and too much reverse engineering.
totally with ya there... but then again, just glance over at what's happening with Nissan Leaf replacement packs. It doesn't seem completely out of the question, especially if the Chevy Volt and Chevy Spark can trade notes.
 
Your nerd fight (meant respectfully, I enjoy a discussion between two people who know more about a subject than me) has me thinking about the idea of a piggyback BMS. There are really two things we want improved on the factory BMS: balancing, and a safety cutoff that protects the battery without bricking the car. The balancing part is easy if we're already adding a second BMS. Cloning all the CAN messages and functions from the factory BMS into an aftermarket one probably isn't feasible unless GM really wants to help, but getting a programmable BMS to spit out the "turtle mode" signal or something along those lines should be doable. Even if that doesn't work, triggering a relay that disconnects something like a temp sensor might cause a non-critical error that lets you know to back off before the battery actually bricks.
 
Using quotes takes up the character limit so I have to double post.
Do you have a problem with the individual cell balancing power I calculated?
Absolutely. There's no way I would imagine they'd build 8 amps of discharge current and 30-odd watts of heat dissipation *per cell*, times *96 cells*, into the BMS board. That would be so magnificently inefficient, it would require liquid cooling for the BMS and would show as significant energy loss in the charging. I think those calculations suffer "abstracted math gone too long in a vacuum" - numbers obtained from some source that were put through too many seemingly-logical transformations to come to the conclusion. Instead, I suspect a balance current of 1 amp or so, periodically performed to maintain a top balance as needed.

Well, if all 96 cells were out of balance, it begs the question: in relation to what? If all cells are being discharged, you're not balancing or charging, you're powering a heater. All I'm describing is a network of 96 switched shunt resistors, controlled by the BECM, serving the role of an OVP (overvoltage protection) balancer. These independent closed-loop circuits do less balancing and more shunting, or bypassing, of the series conducted current.

The effect of an active shunt resistor on anode current is a net reduction, a net zeroing, or a net negative contribution towards cell charging. Which of those you get depends on the charging level providing the output current, and also on time.
In the worst case with 240 V or DC charging, the heat dissipation is not as dramatic as you would imagine compared to a 3 kW convective space heater. However, the heat being dissipated is definitely going to accumulate and show up as a temperature rise in the battery module that the BECM's balance board is mounted closest to. You'll probably find a temperature rise from 99.899% until charging finishes.

Instead, I suspect a balance current of 1 amp or so, periodically performed to maintain a top balance as needed.
Certainly possible for it to be that small, which would require several partial balance sessions. However, this conflicts with my own observations of the car's top-charging behavior: usually at 99.899%, power does not taper and remains fixed, SOC becomes fixed, and cell voltages do not exceed 4.12 V.

Point me to how to obtain the data, and I'll gladly scoop it up. Currently I've been using the Car Scanner app to get data, and though it provides critically useful data, I imagine there's a lot it doesn't show.
Here are some to start with, based on the Bolt PIDs, that you can use with TorquePro. The limitations being: there is no cell-level current measurement, and voltage is only accurate to 2 decimal places. Some work that needs to be done is to log cell temperatures and power vs. time for 240 V and 120 V charging.

SoC Raw HD
Min Cell V
Min Cell #
Max Cell V
Max Cell #
!Battery - Pack - Resistance
Batt Term V
CAC Volts
CAC Amps
DCC PowerIn
DCC PowerOut(W)
*Charger - System Efficiency (Alt Calc)
!Battery - Coolant Temp
Max Batt (*Battery - Pack Temp - Max Temp)
Min Batt (*Battery - Pack Temp - Min Temp)
*Battery - Pack Temp - 1
*Battery - Pack Temp - 2
*Battery - Pack Temp - 3
*Battery - Pack Temp - 4
*Battery - Pack Temp - 5
*Battery - Pack Temp - 6
+Cell Voltage #01
+Cell Voltage #02
...
...
+Cell Voltage #95
+Cell Voltage #96
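If you'd rather scoop this up from a laptop than TorquePro, a python-OBD sketch like the one below is one way to poll an extended PID. I don't have the actual DIDs in front of me, so the b"22XXXX" request bytes and the 1 mV/bit scaling are placeholders you'd need to fill in from the PID definitions above.

```python
# Minimal python-OBD sketch for polling one extended (mode 0x22) PID.
# The DID bytes and scaling are placeholders, not the real Spark values.
import obd
from obd import OBDCommand
from obd.protocols import ECU

def decode_cell_v(messages):
    payload = messages[0].data[3:]  # strip the 62-xx-xx response echo
    return int.from_bytes(payload[:2], "big") * 0.001  # assumed 1 mV/bit

cell_v_01 = OBDCommand("CELL_V_01", "Cell group 1 voltage",
                       b"22XXXX", 5, decode_cell_v, ECU.ALL, True)

conn = obd.OBD()  # auto-detects an ELM327 adapter
conn.supported_commands.add(cell_v_01)
print(conn.query(cell_v_01, force=True).value)
```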


Whatever the termination condition is to finish fully charging, it might be either a BECM overtemp, a timer for each charging level, termination current, or something not captured in the PIDs.

...Continuing to debate high balance resistor current

The violation I see is that balancing is needed only where the cell "leaks" energy outside the (logically) sealed system of the battery during its usage. That would come in the form of imbalanced cell resistance, but that's about it. Any other time, the cells will exhibit an "imbalance" at low state of charge, but that isn't a true imbalance - it's just the result of the cells having differing capacity. When recharged, as they walk their SOC% range to reach full charge together, they naturally come back into balance - no external balancing needed, except to recover from the minor imbalance that differing resistance carries.
Electrochemical cells are not perfect machines; that "only" betrays more noteworthy violations. In addition to the quiescent power draw of the BECM circuitry (the not-so-literal "leaks" you mentioned), there is also self-discharge that drains the individual cells over time, which varies with temperature and each cell's individual weathering.

I will also reiterate that the dynamic energy losses are far more noteworthy. The energy in an electrochemical cell is not guaranteed, especially in an electric vehicle where the discharge rates can be aggressive, exceeding several times the gentle C rate required for the chemistry to maintain its highest coulombic efficiency. This is partially staved off by the fact that these are, according to GM, supposed to be battery cells fit for hybrids, able to handle higher C-rate discharges to compensate for such a relatively small pack. However, it's clear that the pack voltage sags dramatically in the Spark EV during high-power drives, so if that was true then, it isn't now.
Any other time, the cells will exhibit an "imbalance" at low state of charge, but that isn't a true imbalance - it's just the result of the cells having differing capacity
Voltage has never been a reliable measure of SOC in lithium-ion cells, no matter the form factor. That's why we have fuel gauges that, in addition to voltage sensing, use coulomb counters - and even then, they still need periodic calibration to be meaningfully accurate.

That is to say, it doesn't make any sense to have 8 amps of balance capability, when it only needs to apply a balance while it's resting, after charging. It can take its merry time to work off whatever small imbalance may turn up.
That might not be an acceptable strategy for an electric vehicle battery when charging is designed for just-in-time use and drivers expect the maximum usable capacity to be available to them. Keep in mind that 8 A of balance current represents a 1/5C discharge, which can be PWM'd by the BECM, as well as reduced further by external charging currents.
 
However, the BECM detected the weak cells dropping below 3 V and reduced power for the entire battery. But that's not enough, and the car shuts down after only a few miles as the weak cells cross the 2.5 V cutoff.
This part in particular seems untested and strange to me. The fault level is in the service manual as 1.75v - that's the only "cutoff" the car seems to know/react to. And in my car today, testing it with the heater, I found the "propulsion power reduced" appears at 2.5v, which is pretty damn crazy (because 2.5v is far into the death curve - that cell is totally dead and the car *should* shut off, but it keeps going). From here, nerd to nerd, I admit I took a bit less credibility from the rest of the writing - it's so easy to see the behavior, it reads like an ideal but untested claim.
Right, well, in my quote I wasn't actually talking about a critical fault cutoff, just typical out-of-energy thresholds. But maybe you're referring to untested by GM?

You might be interested in observing:
*Battery - Pack - Minimum Voltage
This is a soft limit that moves around with respect to SoC, but it might be worth observing in relation to your pack voltage.

In your case with the heater test, the cells are under some load, so the voltage will sag, especially at the knee of the discharge curve - meaning there's still enough active lithium in the anode/electrolyte, and enough surface area, to knock out some electrons, I suppose. If you're using just the heater, that's probably a great C rate to squeeze out another 50-100 Wh. The car will react with a series of escalating C-rate limits so as not to let you brick your car, but clearly the system isn't perfect. I've not had one of these faults yet. However, it sounds as though it takes an abrupt pulsed discharge with a weak cell to defeat the system.


Never was a position that they don't fall out of balance - merely that they don't do it _significantly_. In fact, I've had several battery banks at home charging/discharging almost fully, every day, slowly (about 0.1 C-rate), with only protective BMS watching over them. The cells never drift - after 300+ cycles, the drift is so minor it only takes a few minutes for my iCharger X12 (with about 2-amp balance capability) to regain a perfect top balance - in 64Ah banks. That's my experience - cells don't drift in balance (in any significant way at least) unless they're abused. To that point, I also have a Storm2 Liquid battery bank (90Wh - made of 8x 18650 NMC cells, 2p/4s) that's been through about 700 cycles at ~>1C rate, with pretty high heat and stress... its cells end up drifting out of balance regularly and somewhat chronically, yet they didn't equip the BMS board with a balance circuit, so I have to manually balance it. I have experience at both ends.
Right, well, 1/10C is about as good as it gets for round-trip coulombic (and voltaic) efficiency. I would agree that low C-rates are the way to avoid balancing. The 1C 2p4s battery bank sounds annoying; in a way, I'm not entirely surprised to hear about that for a consumer item. I'm sure you've seen the even more egregious single-use power banks for sale.

On the point about cells drifting when they're abused, I think the Spark EV's battery is undersized despite being a high-C pouch type, and capacity loss isn't doing it any favours. What I'm suggesting is that the amount of imbalance due to (1) capacity and (2) coulombic-efficiency losses amounts to something on the order of a 20-30 Wh round-trip delta that needs to be accounted for at the top. This thread was intended to offer a suggestion to transfer energy into the weak cell with a bottom balancer, but we got to discussing what the top balancer really does. I understand you believe there are better places worth exploring to make the driving experience better. However, to test whether the passive balancer is active while charging, there are a few opportunities to explore:

1) Observe and report similarities and differences between cars during my suggested balancing window
2) Observe behavior after fully charging
3) Compare new observations and discuss the most likely mechanism.


Some Bolt EV owners have already done this, but it could be worth doing anyway.

I think the 99.899% SOC hang - where large amounts of power still enter the car and vanish without cell voltages changing much or power tapering off as you would expect from a CC-CV charge curve - supports balancing while charging. That could benefit from better observations. The theoretical net-zero-anode-current balancing current is in the ballpark of 1/5C. It doesn't need to be 8 A; that calculation was based on the assumption that the entire anode current is being shunted so charging can continue. It could be a different current, it could be a different duration, or it could have a duty cycle, especially if the balance lines are shared with the voltage-sensing lines - if we're really talking about conservation of energy.

Is the energy used after 99.899% usable?

After fully charging the car, the cell voltages can settle lower, and the SOC can drop with them. SOC is confusing and full of unknown variables, but this could possibly support passive balancing without charging - although I've seen the opposite happen simply by leaving the car in the sun in the summer. The fuel gauge is mostly a black box.
 
Your nerd fight (meant respectfully, I enjoy a discussion between two people who know more about a subject than me) has me thinking about the idea of a piggyback BMS. There are really two things we want improved on the factory BMS: balancing, and a safety cutoff that protects the battery without bricking the car. The balancing part is easy if we're already adding a second BMS. Cloning all the CAN messages and functions from the factory BMS into an aftermarket one probably isn't feasible unless GM really wants to help, but getting a programmable BMS to spit out the "turtle mode" signal or something along those lines should be doable. Even if that doesn't work, triggering a relay that disconnects something like a temp sensor might cause a non-critical error that lets you know to back off before the battery actually bricks.
Lol, I wonder if the Spark has the same magnetic float sensor for battery coolant shown in this video.

I wonder if this would give a no-programming-required turtle mode with the flip of a switch, or a magnet.
Vehicle fitment shows the Spark, so it should be there: https://www.gmpartsdirect.com/oem-parts/gm-battery-sensor-22922224
 
Been a long time since I posted, but I feel at home with your nerdly discussion. I have a question and wonder what you think of this.
This iPhone app I occasionally work on displays data I have taken using an Arduino Due CAN shield and some software written by myself and a deceased forum member https://www.mychevysparkev.com/members/solder.544/. The idea was to show cell variations at a glance.

Here is my 2014 Spark at 15% SOC. The cells are arranged as they are physically in the car. The height of each cell is .2 V. The vertical center of each cell is at the average voltage for the whole pack. A positive deviation shows as green, a negative one as red. The data is 3 decimal places, such as 3.123, but I'm not sure it's monotonic, as some cells seem to never show certain values.
The temperatures are listed, but I haven't figured out how to show them graphically. The guessometer value means 9 miles left and 79 traveled.

[attached image: low.jpeg - pack at 15% SOC]

Here's after charging.
[attached image: high.jpeg - pack after charging]

Haven't looked at taking data while charging or flooring. I'm not sure I can do that. Thought it was in some maintenance mode.
Haven't looked at the CAN stuff for years.
 
That's an interesting-looking visual display interface. Leaf Spy is probably the closest thing I can compare it to for at-a-glance readings. Its designers went for an autofit column chart on a single page with cells 1-96 shown. I would personally prefer the cells visualized in a single column chart, as opposed to the sort of stacked column chart you've got going on, just because you can draw a line straight across a battery module as a min/max/avg and compare a range of cells at once; but if it was meant to show at-a-glance variations, I can understand wanting to arrange it this way. Do you have an example of the cells at 1% SOC, just to see some craziness?

I don't own an iPhone or an Arduino CAN shield; maybe I should. However, it would be nice if you had even more blatant visual cues, like a callout pointing to the min/max cell, in addition to how you arrange the vertical center.

If you can get it working, it would be interesting to see what it looks like while flooring, as well as what is happening to cells at the top of charging. I think the best way to intentionally create deviation just to have something to look at would be to sustain high C discharges below 10% and above 80% if it's something you can facilitate.

There's a lot you can do with something like this. It would also be nice if data were logged and compared against previous cycles to see trends.
You could also plot discharge and charging curves for trips or charge sessions with a line chart, select a point and view what the cells looked like on another page.

The Spark EV's energy display screen has always been a big disappointment. The potential to visualize data was completely missed by the software team at GM. The only thing they got right was the power visualization in the dashboard...
 
All wildly interesting discussions - I'm loving this. The more I read and think about it, the more bits make sense (although the nuance I'm gathering is different than it seemed on first read). Indeed, to balance out for 1 or 2 cells that are higher resistance than the others, the balance action would need to burn off energy from the 94 other cells to match - due to the fundamental "flaw" that shunt balancing (the most common balancing I know of) can only discharge, not charge. So instead of adding more needed charge to the 1 or 2 low cells, it has to instead discharge (by heat waste) the not-lower cells. That can lead to a lot of wasted energy.

And, indeed, my car does hang out at "100%" for some time - although what's interesting is that the charge-time estimate (while at, say, 30%) is always bang-on accurate to the top. So, it says it'll be done by 4:45pm? It's done at 4:45pm - even though it hit 100% at, say, 4:15pm. I chalk that up to it filling more at the top just to keep the gauge pretty... a bit like how the iPhone "stuffs" the battery gauge at 3%-0% (when it's really more like 10%-0%).

Odd point about the battery SOC, though. My Spark's SOC gauge is basically "broken" at this point. Reported capacity is stuck at 13.5kWh (though it bumped up to 13.6kWh recently), despite only taking in about 11kWh AC from dead to full. The battery dies at 10 miles - I have to treat 10 miles as "you're dead". The BMS pays no mind to the fact that the lower-capacity cell is lingering down at 3 volts. The only limit the car seems to recognize is the low cell hitting 1.75v, which is the "you're a pumpkin, time to die" trigger point that throws a pack lockout (can no longer charge or discharge, car goes to scrapyard). It also throws a nearly-impotent, arbitrary (pedal remapping), worthless acceleration limiter if/when the cell hits 2.5 volts.

And to that point, you seem to be the only real serious battery guy I've seen involved in Sparks so far - does this not also strike you as horribly defective BMS logic? Maybe we need a new thread for that discussion, though. It seems like it should be criminal for GM to still be shipping "replacement" batteries - especially with a $15k out-of-warranty price tag - with the same defective software.

I'd love to get some better data logging/reporting towards any end that might help push to keep more Sparks on the road.
 
... - does this not also strike you as horribly defective BMS logic? Maybe we need a new thread for that discussion, though. It seems like it should be criminal for GM to still be shipping "replacement" batteries - especially with a $15k out-of-warranty price tag - with the same defective software.
I think a 'small' detail is being overlooked here. The BMS logic is designed to keep healthy cells healthy. The unfortunate reality is that LiX cells eventually fail. A cell that reads only 1.75 volts under load is likely trash. It doesn't matter how long you charge; it retains little energy. As far as I know, nobody's BMS is made to eliminate bad cells. It could probably be done, but not by trying to charge them. I think I saw a video of some cell that had a shorting bar built into it. That way you could pulse-modulate the cell down to zero and then keep it shorted, so the rest of the pack would only be affected by the drop of one cell.
 
That's an interesting-looking visual display interface. Leaf Spy is probably the closest thing I can compare it to for at-a-glance readings. Its designers went for an autofit column chart on a single page with cells 1-96 shown. I would personally prefer the cells visualized in a single column chart, as opposed to the sort of stacked column chart you've got going on, just because you can draw a line straight across a battery module as a min/max/avg and compare a range of cells at once; but if it was meant to show at-a-glance variations, I can understand wanting to arrange it this way. Do you have an example of the cells at 1% SOC, just to see some craziness?
Sorry, the lowest I have is the 14% above, and it's flat.
I don't own an iPhone or an Arduino CAN shield; maybe I should. However, it would be nice if you had even more blatant visual cues, like a callout pointing to the min/max cell, in addition to how you arrange the vertical center.
I'm still trying to figure out the best way to visualize. So what I currently do is:
1. Grab all the cell voltages
2. Find the average, the minimum, the maximum, and the total.
3. Find the greater of max-avg or avg-min and use that as the max + and max - for all the cells.
This dump shows this well. The average-minus-min was .103 V, so the scale is +/-.103 V. This particular picture actually shows a common glitch: an unusually low value followed by an unusually high value on the next cell. This seems to happen randomly in time and location, about every four dumps. I ignore the whole thing when I get this. Wonder how the BMS deals with it.
[attached image: bad.jpeg - dump showing the glitch]

If you can get it working, it would be interesting to see what it looks like while flooring, as well as what is happening to cells at the top of charging. I think the best way to intentionally create deviation just to have something to look at would be to sustain high C discharges below 10% and above 80% if it's something you can facilitate.
This is nearly flooring up a 15% grade for approximately 3 seconds. (LA traffic didn't notice :)
[attached image: flooring.jpeg - pack near full throttle]


A few seconds later. Note the recovery. Also note that I changed the scaling algorithm from what I described above; otherwise a picture like the one below would be +/-.001 V and look like a full-scale checkerboard. The change is that if the max voltage deviation is less than .1 V, then use +/-.05 V for the scale. This makes good plots look like the one below and emphasizes bad cells.
[attached image: afterfloor.jpeg - a few seconds after flooring]
Also note that in just a few seconds the whole pack rose fairly evenly by 1 degree F.
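For what it's worth, here's how I read your scaling rule, including the +/-.05 V floor, as a quick Python sketch. The variable names and the example voltages are mine, not from your app:

```python
# Symmetric color scale from the worst cell deviation, switched to a
# fixed +/-0.05 V when the pack is tight so good packs don't render as
# a full-scale checkerboard. Mirrors the algorithm described above.
def display_scale(cells_v):
    avg = sum(cells_v) / len(cells_v)
    span = max(max(cells_v) - avg, avg - min(cells_v))
    if span < 0.1:  # the tweak: small deviations get a fixed scale
        span = 0.05
    return avg, span, [(v - avg) / span for v in cells_v]

avg, span, norm = display_scale([3.750, 3.752, 3.748, 3.747])
print(f"avg {avg:.4f} V, scale +/-{span:.2f} V, worst {min(norm):+.3f}")
```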
 
I think a 'small' detail is being overlooked here. The BMS logic is designed to keep healthy cells healthy. The unfortunate reality is that LiX cells eventually fail. A cell that reads only 1.75 volts under load is likely trash. It doesn't matter how long you charge; it retains little energy. As far as I know, nobody's BMS is made to eliminate bad cells.
The job of a BMS is to... exactly that, keep cells healthy. The problem is that the Spark BMS does not do what others (e.g. Nissan, i-MiEV at least) do, which is to maintain the health of ALL cells, not the pack as a whole voltage system. The Spark BMS seems to only care about total voltage (which is not something it should ever even care about), which is the only thing that sets health state / capacity values.

I'm not savvy to re-writing / re-explaining things, so my explanation is here:

The Spark's BMS is uniquely and horrifyingly flawed in its behavior. Instead of limiting system performance around the lowest cell, it allows the system to beat the sh^t out of a cell, making a weak cell worse, by allowing wildly low voltages under inappropriately high discharge (at low SOC%) when it should be constraining performance. Eventually, after repeated abuse over months/years of Perky Acceleration under low SOC%, that one cell gets bad enough to significantly deviate from the rest at a low state of charge, such that it will be fully and completely discharged sooner than the others, resulting in that 1.75v trip state. It's not that the cell is "Really Bad" - and this is a serious mistake/misunderstanding that many people seem to believe. The "1.75v state" is just a state that any EV pack would get into, if its BMS were stupid enough to allow it to happen (instead of cutting off discharge at a much safer voltage).

The condition occurs when a cell hits 1.75v, but it doesn't stay there for long. After the pack is disconnected by the fault state, the low cell jumps back up to a more normal voltage (mine was 2.8v). But even 2.8v ought to tell you something: that cell is jumping off the "dead cliff" of voltage - it's completely discharged, 0% SOC, while the other cells in the pack may still be at 8%. The BMS should have been applying power limitations and then stopped discharge *gracefully* (not by asserting a lockout fault) based on that one cell.
 