I’ve learned a lot about the behavior of DC motors from the book Introduction to Mechatronic Design. Much of what I will discuss in this post comes from chapter 20 of this book. Electrical engineers reading this post will laugh at how elementary some of this information is, but it was news to me as a mechanical engineer.
I find that I solidify my understanding of ideas and concepts by writing about them. So what follows is a brief book report on a phenomenon often called inductive kickback, which I believe was the cause of the wiring failure on my robot.
A motor is an inductor. Inductors store energy in magnetic fields. When you apply voltage to a motor, a magnetic field is created around the motor. A unique characteristic of inductors is that they resist changes in the flow of electricity through them.
If you were to suddenly disconnect the voltage source, the magnetic field would try to keep electricity flowing through the motor. But because there is nowhere for that electricity to go, the voltage across the motor terminals increases instead. And it can increase to very high levels, sometimes thousands of volts.
The energy stored in the motor’s magnetic field has to go somewhere. What usually happens is the voltage becomes high enough across the motor terminals that a nearby component becomes a target for electricity to arc through and flow out of the motor.
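To put rough numbers on this, the inductor law v = L·di/dt is all you need. Here's a quick sketch; the inductance, current, and switching times below are made-up illustrative values, not measurements from my motors:

```python
# Why opening a relay on an inductive load produces a voltage spike.
def kickback_voltage(inductance_h, delta_current_a, delta_time_s):
    """v = L * di/dt: the voltage an inductor develops to oppose a change in current."""
    return inductance_h * delta_current_a / delta_time_s

L_motor = 0.5e-3   # 0.5 mH winding inductance (assumed, for illustration)
I_motor = 50.0     # amps flowing when the circuit is broken (assumed)

# Ramp the current down gently over 100 ms and the voltage is tiny...
print(round(kickback_voltage(L_motor, I_motor, 100e-3), 2))  # 0.25 V
# ...but a relay contact snaps open in microseconds, and the spike is huge.
print(round(kickback_voltage(L_motor, I_motor, 10e-6)))      # 2500 V
```

The takeaway: the faster you interrupt the current, the bigger the spike, which is exactly what a relay contact does.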
I didn’t state it in my previous post, but to power on the deck motors I’m using a large relay. To turn them off you have to open the relay, and you end up creating exactly the inductive kickback situation described above. And with nowhere for electricity to flow, I think the voltage between some of my wires became so large that it arced through the insulation.
The damaged wires in the picture above are the + voltage wire on the motors and a battery ground wire, curiously enough. And notice how close they are to each other: a perfect place for electricity to arc. Better than the contacts inside the relay, which was still functional, surprisingly.
I now know why wires are voltage rated: it has nothing to do with the conductor material or size, and everything to do with the insulation. The insulation will only prevent arcing below a certain voltage threshold. And in the field several weeks ago, it looks like I greatly exceeded that threshold, whatever it was for those wires.
I checked the Posi-Lock website, and their connectors are rated for 600V. I don’t remember where I got my wire from. I’m sure it was the cheapest knock-off stuff I could find. The insulation has no identifying marks, so I’m not sure it’s voltage rated at all.
Next post I’ll describe the changes I’m going to make to robustify the wiring and electrical components in the robot mower.
If you look closely at the charts in the previous post, you’ll notice they end quite abruptly. The picture above shows why.
I have the robot mower configured so the blades only turn on when you hold the rudder stick to the right. The stick is spring loaded, so letting go of it returns it to center, deactivating the blades.
The field I was testing in had several large shrubs in it. One of my goals in the field was to push the envelope on what the mower was capable of cutting through, so I let it mow over several of them in autonomous mode. However, there was a particularly large shrub in the robot’s path, and so I decided to turn the mower blades off by releasing the rudder stick.
As expected, the blades stopped. But the robot stopped moving, too. This was curious: the robot should have continued driving. So I pressed the emergency stop button on the robot to immobilize it, and walked over to the truck where I had Mission Planner running on my laptop.
The Mission Planner screen was frozen as if it wasn’t receiving telemetry from the robot. It took me a few minutes to realize that the issue wasn’t Mission Planner or my laptop radio: it was that the robot wasn’t sending telemetry. And even more curiously, the voltage reading on my SLA batteries said 12.50V. I was worried that I’d really fried my batteries, so I walked back over to the robot to investigate.
Opening the battery bay I found a slight amount of smoke and that awful burnt plastic smell from melted wire insulation. But other than that everything seemed fine. There wasn’t a smoldering fire inside the robot. The batteries weren’t hot to the touch.
At first I chalked it up to too much current running through the wires. The robot uses 120A at peak current, and at those levels even a small amount of resistance could cause the wires to heat up a lot. Perhaps the wires just got hot enough to melt the insulation and came into contact right at the moment I decided to shut the mower blades off?
This just-so story didn’t sit well with me. What an awful coincidence that the wires failed at exactly the same moment I was shutting off the motors, especially after the robot ran fine for five minutes prior. Why didn’t they fail earlier?
Additionally, why didn’t the exposed wire conductors fuse together after they came into contact? I’ve heard of people using SLA batteries smaller than mine to spot weld 18650 battery tabs together. The wires were really close together, but weren’t touching when I opened the battery bay. And the wire strands don’t seem to be melted judging from the picture above. Which was extraordinarily lucky, given how bad that could have been.
I did have enough sense to put a 100A fuse on my batteries. It seemed logical that the fuse was what saved my bacon. But testing the fuse with the multimeter revealed that it was not, in fact, blown. At this point I was completely bewildered. I thanked the good Lord that the robot hadn’t turned into a massive lead acid fireball, made sure all my batteries were disconnected, packed up, and went home.
I’m sure readers are ready to scold me for all of the janky things going on in my battery bay, and I certainly deserve that criticism. My mantra is make it robust, and I definitely did not live up to that standard with my wiring. Below is a full picture of the battery bay.
I have my own list of janky things in here that I intend to fix. What do you see that is janky? Feel free to comment below!
In the next post, I’ll explain what I think went wrong, and the improvements I’m going to make to mitigate the problem and eliminate all the redneck wiring I’ve got going on.
I’ve received a few of the weldments back from the shop. While I wait for them to finish fabricating the mower deck weldment I’ve started to put some components together.
Back when I installed the wheel encoders on the wheel chair motors, I stupidly drilled a hole through the dust cover on the back of the motor so I could run data wires to the encoder. In hindsight, I should have run them through the little sleeve that the power and brake wires were routed through.
I had to take the brake off to put the encoder on the motor anyway, so there was a perfect amount of space for the encoder wires once the brake wires were removed from the sleeve.
Because you can’t undrill a hole I purchased a pair of cheap gear motors off eBay for $80. I mostly wanted them for the motor dust cover, but it will be nice to have spare parts on hand in case I need them down the road.
I took the aluminum back piece off the motors and removed the two white wires you see in the picture. The hole you see them sticking through was where I routed the data cables for the encoder.
Pro tip for dealing with these motors: There are two Phillips head M5x150 screws holding the aluminum back piece to the mounting plate. These screws have lock washers under them. The screws are ridiculously soft, and it’s extremely easy to strip their heads with a hand screwdriver. If you want to remove them so they’re still reusable, use an impact driver.
I managed to strip the screws on both motors before I drilled them out and discovered this, so heads up to anyone modifying the motors like I am here. I ordered replacements that were socket head cap screws instead, hoping to avoid this issue in the future.
Once you have the aluminum back piece off, you’ll see wires inside like this:
The inside is going to be quite dusty with a lot of little brush particles inside. I blew it out with compressed air after taking this picture.
You can pull the white wires through pretty easily, but I had to bend the black wire terminal to get access to the hole for feeding the encoder data cables through. I also ended up removing the brushes so I’d have more room to work.
Once you’ve got the white brake wires removed, you can pretty easily push the encoder wires through. The end result looked like this:
One thing I realized doing this is that it would have been pretty easy to drill holes into the aluminum back piece for screwing the encoder base down. I selected an adhesive backed encoder because I didn’t want to mess with it. But going to the trouble to take it apart like this changes that calculus. If I find myself doing this again, I’ll order an encoder that has clearance holes for mounting screws.
After I had everything wired up, I tested the encoder to make sure it was working well. Nothing like having to tear down a motor after it’s already on the robot to fix a loose wire.
I also wanted to make sure that running the data cables next to the power supply cables wouldn’t cause any issues. I didn’t find any during the bench test. Fingers crossed none pop up in the field either.
I used 5/8-11 screws for all the connections in the front caster assembly. I wanted to standardize on one size so I could buy several of one type of lock nut. Unfortunately the width of a 5/8-11 lock nut is 0.9375in and I don’t have a wrench that size. I also don’t have a hex wrench for the socket head cap screws either. The picture above shows everything hand tightened. I’ll have to go pick up the right tools to get this all put together.
I’m estimating the electric deck motors on the robot lawn mower will use about 168A of current collectively, and at 24V that means they consume just over 4kW (4032W), or about 5.4hp. Even as a worst-case estimate, saying that aloud sounds ridiculous. Really? Where is all of that power going?
Remember that power is equal to torque times angular velocity. The angular velocity of the mower blade is governed by the blade tip speed limitation of 19,000ft/min. The linear velocity of the blade can’t exceed this value. If we know the blade length, we can do the math and determine the necessary angular velocity to achieve that blade tip speed.
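As a sanity check on that math, here's the tip-speed-to-RPM conversion for both blade sizes. The 19,000ft/min limit is from the post; the rest is just geometry:

```python
import math

def max_rpm(tip_speed_fpm, blade_len_in):
    """RPM at which a blade of the given length hits the tip-speed limit.
    The blade tip sweeps a circle whose diameter equals the blade length."""
    circumference_ft = math.pi * blade_len_in / 12.0  # distance the tip travels per rev
    return tip_speed_fpm / circumference_ft

TIP_LIMIT = 19_000  # ft/min, the blade tip speed limitation

print(round(max_rpm(TIP_LIMIT, 12)))  # 6048 RPM ceiling for the robot's 12in blades
print(round(max_rpm(TIP_LIMIT, 21)))  # 3456 RPM ceiling for a 21in push mower blade
```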
So we have a very good handle on angular velocity. The mystery variable in the power equation is then torque. The amount of torque we need when the blade is spinning is going to determine how much power the motors are going to consume.
In my previous power calculations, I made a huge assumption:
I have no clue how much torque a mower blade needs. Let’s just use whatever my 21in Toro push mower outputs. It mows grass pretty good. Close enough.
-Me, making poor decisions
The engineer in me loves that assumption. Find the appropriate RPM, pull up the power curve for the engine and boom, there’s your torque value on a silver platter.
The problem is that the robot lawn mower and my Briggs and Stratton push mower are two different animals. I should have made this chart a long time ago, but here are the performance curves for the E30-400 electric motor compared to a Briggs and Stratton 450e gasoline engine, typical of a push mower with a 21in wide deck:
The chart above makes more sense when you remember that the 450e gasoline engine is paired with a 21in long blade, but the robot lawn mower has 12in blades. This is why the electric motor curves go all the way to 5,700RPM whereas the gasoline engine curves end at 3,600RPM: at those top speeds, each blade length puts the tip speed close to the allowable limit.
Remember that these curves represent the maximum torque and power created at a given speed. When you’re mowing your lawn, how often does your mower bog down? Not much, hopefully. If it’s designed well and your grass isn’t a foot tall, your mower probably isn’t operating at its maximum torque or power.
Because you shouldn’t often need the maximum torque or power out of your mower engine, the guys that engineered them installed a cleverly designed throttle governor, which varies the amount of fuel and air fed into the engine in such a way that its speed stays in a narrow RPM band.
Instead of letting the engine spool up to the fastest speed it can achieve under a given load, the governor limits the speed of the engine, and consequently its power output, making it more efficient. If you need more torque or power, it adjusts the air and fuel mixture accordingly. The governor also ensures the blade tip speed stays safe.
This is where my power calculations go off the rails. I’m using the maximum torque values from these curves (measured in laboratory conditions no less) and sizing an electric motor such that it can achieve this torque value. This is inflating the estimated current these motors will consume.
Remember the motor from the electric push mower I got off Craigslist? It doesn’t appear quite so undersized now, given that our gasoline engine is likely operating somewhere below those maximum torque and power curves.
Our E30-400 electric motor, on the other hand, has no throttle control. This makes the analysis simple: it will always be operating at the curves on the chart above. A brief look at the chart shows that it should still perform very well.
Under no load, it will spin at 5,700RPM. As the required torque increases, the motor speed will drop, but the total power output from the motor increases until the motor is spinning at 2,900RPM. At this power output, one single motor is almost generating the power created by the 450e gasoline engine. Nice!
So realistically, the 168A of current for all three motors is probably on the high side. By how much, I am unsure. But I suspect it’s a significant amount. The robot lawn mower uses three of these motors. Collectively, I would imagine they won’t need too much torque to spin through whatever resistance they encounter.
This is looking more and more like a problem best solved by experimentation, not analysis…
Because the ratio I measured a few days ago was a strange non-integer ratio, I decided to open up the gearbox and count teeth. It was frustratingly messy, to say the least.
I was hoping to be able to count the number of teeth on each gear without taking the whole thing apart, but this proved quite difficult. There’s not really a good way to mark a tooth when the whole thing is swimming in grease.
After a while, I decided to sit down and do some math. It was a flashback to my old engineering school days. I was able to count the teeth on some of the gears (with a fair degree of confidence) and using this information, the problem statement became:
A gearbox has a gear ratio of 31.8. There are four gears in the box. The output gear has 32 teeth. The gear connected to the output gear has 11 teeth. A second gear with an unknown amount of teeth is rigidly attached to the same shaft as the gear with 11 teeth. This gear is then driven by a worm gear connected to the motor. How many teeth are on the second gear?
With most word problems, I find myself drawing pictures to make things more clear. I made a simple CAD model of this setup to try and get a better grip on how the gears worked together.
Assign a letter to each gear in the system. The gear on the output shaft is A, the gear meshing with it is B, the gear on the same shaft is C, and the worm gear in the lower right is D.
The total ratio is the ratio between A and B, multiplied by the number of teeth on C. Remember, one revolution of a worm gear advances the mating spur gear by one tooth, so the worm stage contributes a ratio equal to C’s tooth count, and B and C turn together because they share a shaft. Solving with the measured ratio of roughly 32 gives C ≈ 11 teeth, the same as B. Putting it all together: total ratio = (A/B) × C = (32/11) × 11 = 32.
Wait a minute… that’s just a fancy way of saying the gear ratio is A, or the number of teeth on the output shaft gear. That can’t be right. Or can it?
Remember we measured the gear ratio at 31.8. I did some more estimating and came up with a number closer to 31.9 by counting more tire revolutions (320, or so I thought), but because I can’t exactly line the tire up in the same spot it started from, there was always going to be a little error in the calculation.
And come to think of it, the more tire rotations I counted, the closer my estimation came to 32 exactly. To come up with the 31.8 number I counted 159 revolutions. I was off by one tire rotation. And to come up with the 31.93 number I counted 639 rotations, which meant I was off by about 1.25 tire rotations. Turns out I’m not very good at counting.
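Assuming the real ratio is exactly 32, it's easy to see how miscounting by a revolution or so produces the numbers I got:

```python
# How a miscounted tire revolution skews the measured gear ratio,
# assuming the true ratio really is 32. Counting extra revolutions
# spreads the same encoder total over more turns, dragging the
# estimate below the true value.
TRUE_RATIO = 32

def measured_ratio(actual_revs, counted_revs):
    return TRUE_RATIO * actual_revs / counted_revs

# The encoder total corresponded to 158 true revolutions, but I counted 159:
print(round(measured_ratio(158, 159), 2))   # 31.8, my first estimate
# With many more revolutions, a one-rev miscount barely matters:
print(round(measured_ratio(638, 639), 2))   # 31.95, close to my second estimate
```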
At this point you might be wondering why I didn’t use the other encoder to measure the gear ratio with much higher precision (and without all the mess). The output shaft on the gearbox is 17mm in diameter, but I mistakenly thought it was 8.5mm: I misread the radius measurement in my CAD program as a diameter. You can jury rig a 10mm ID encoder to fit on an 8.5mm shaft, but not a 17mm shaft.
Long story short, I like to do things the hard way, and the magic number is actually 32. Exactly 32.
The rotary encoders arrived today. To familiarize myself with them, I decided to use them to try and determine the gear ratio of my wheelchair gearmotors. The idea is that I can rotate the tire a fixed number of times, read the encoder output and then do some math to figure the gear ratio. I’ll need this number for using the wheel encoders with the Ardupilot software.
Installing this guy was very easy. Peel the cover strip off the adhesive on the back of the encoder and use the conical centering tool to align the base to the shaft. Press it down so it sticks, and then screw everything else into place. Very nice.
The more difficult part is reading the encoder output. To do this I used my old Arduino Uno and did some googling for a sketch that looked close to what I needed (remember I’m a mechanical engineer, not a computer guru). No need to re-invent the wheel, right?
And lo and behold, I found this webpage. And the sketch was pretty much plug and play, short of playing with the baud rates. Not bad for 15 minutes of work!
The encoder I purchased is rated for 900 pulses per revolution. I could have tweaked the sketch so that it measured actual rotations, but I figured I’d just do the math outside of the sketch.
So I taped a pencil to the wheel, hooked it up to a battery and started measuring tire revolutions. After 152 revolutions (measured by aligning the pencil with the motor), the Arduino sketch read 17401196. Dividing that large number by 152 and also by 3600 (it’s a quadrature encoder, so 900 PPR times 4) should give the gear ratio.
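That arithmetic in a few lines, using the numbers from my test:

```python
COUNTS = 17_401_196   # total quadrature counts reported by the Arduino sketch
REVS = 152            # hand-counted tire revolutions
PPR = 900             # encoder pulses per revolution
QUAD = 4              # quadrature decoding gives 4 counts per pulse

counts_per_motor_rev = PPR * QUAD          # 3600 counts per motor shaft revolution
ratio = COUNTS / REVS / counts_per_motor_rev
print(round(ratio, 2))                     # ~31.8
```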
The magic number appears to be 31.8. I’m told that gearbox manufacturers like to use strange ratios so that a tooth on one gear doesn’t always contact the same tooth on another gear every revolution. This helps the teeth wear evenly. That’s the only explanation I can think of for this funky number.
I might hook the other encoder up to the output shaft to get some additional resolution…
It’s been a while since I’ve posted, but that doesn’t mean I haven’t been busy. The back yard is starting to really green up and I think the worst of the cold is finally behind us here in Kansas.
I purchased a replacement relay board for the one that went bad several weeks ago. Unfortunately, I didn’t read the fine print, and the new relay board requires you to ground the appropriate pin instead of applying 5V to it in order to trigger the relay.
Since I’m now using a high current relay to switch the Sabertooth, I decided to resolder the damaged terminals on the relay board and put it back in the enclosure to switch the high current relay. I’ve taken the wheel chair robot out and tested it, and it is working quite well.
The RTK GPS system is working great, but unfortunately my backyard is still a very challenging environment to work in. Maintaining an RTK fix for 50% of the time is about all I can do. Because of this, I am exploring two different improvements to the wheel chair robot.
Optical Flow

For the uninitiated, optical flow is the concept that modern computer mice use to measure the motion of the mouse. An integrated circuit with a built in “camera” takes rapid (500Hz-ish) pictures. The pictures are very low resolution, 16 bit grayscale or thereabouts, and really small, like 32 pixels square. The displacement vector between successive frames is computed, and if you know how far above the surface you are, you can then estimate your translation.
The same concept can be applied to our robot wheelchair. Take a camera, point it at the ground, measure how far the camera is above the ground and voilà! With some creative coding, you can estimate your robot’s translation. Optical flow is great because it measures your vehicle’s absolute translation, not some intermediate measure like wheel turns or integrating accelerations to back out your position.
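Here's the back-of-the-envelope geometry for that height-to-translation conversion. The 32 pixel resolution and 42° field of view below are illustrative guesses, not specs for any real sensor:

```python
import math

def ground_shift_m(pixel_shift, height_m, fov_deg=42.0, res_px=32):
    """Ground translation implied by a pixel shift between two frames."""
    # Width of the patch of ground the sensor sees from this height:
    ground_width_m = 2 * height_m * math.tan(math.radians(fov_deg / 2))
    # Each pixel covers an equal slice of that width:
    return pixel_shift * ground_width_m / res_px

# A 3-pixel shift seen from 0.2 m above the grass:
print(round(ground_shift_m(3, 0.2), 4))  # roughly 0.0144 m of travel between frames
```

Multiply that by a 500Hz frame rate and you can see how a mouse sensor tracks fast motion with such a tiny image.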
What’s really cool is the Ardupilot code already features optical flow! There’s only one catch… it’s currently only implemented for copter. So for the time being, I’m going to have to either bribe a genius on the internet to port the code to Ardurover, or I’ll have to do it myself. I’ve started playing with the Ardupilot code, but let’s be real. I have no clue what I’m doing.
Wheel Encoders

Wheel encoders are a feature already implemented in the Ardurover code. I’m thinking about buying some rotary optical encoders to leverage this feature. For ~$140 I can get two encoders and pretty much plug and play them into my motors.
To properly implement them, I need to figure out the gear ratio on the gear box. I measured it by counting turns of the back shaft on the motor per one wheel rotation and came up with a ratio of 32:1. I thought this was an odd ratio (I was expecting 30:1 or 60:1) so I did some digging and it turns out there’s a little bit more going on in that box than I initially suspected.
To measure the gear ratio, I can just hook the encoders up to an Arduino, measure several wheel rotations, and see how many pulses I get out of the encoder, so I’m not going to bust open the gearbox and start counting teeth. Counting teeth would, in theory, be the correct way to determine the exact gear ratio, though.
Some things I’m not sure about:
How much wheel slip is acceptable before the encoders actually start degrading position estimation instead of improving it? Wet grass isn’t nearly as friendly as concrete.
How many pulses per revolution (PPR) do I realistically need? Not much is my guess, but if the Pixhawk can read them, the more the merrier. Where’s the sweet spot?
The wheel encoders only help so long as you periodically get good absolute positioning information from the RTK GPS. Dead reckoning is only as good as your last known position, and the estimate will quickly start to drift once you lose your RTK fix. How long can the wheel encoders tide you over between fixes?
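To get a feel for the PPR question, here's the ground distance covered by a single encoder count. The 900PPR, quadrature decoding, and ~32:1 gearbox numbers are from my setup; the 14in wheel diameter is a guess for illustration, since I haven't measured mine for this calc:

```python
import math

WHEEL_DIA_IN = 14.0            # assumed drive wheel diameter (illustrative)
COUNTS_PER_MOTOR_REV = 900 * 4 # 900 PPR read in quadrature
GEAR_RATIO = 32                # motor revs per wheel rev

counts_per_wheel_rev = COUNTS_PER_MOTOR_REV * GEAR_RATIO  # 115,200 counts/rev
in_per_count = math.pi * WHEEL_DIA_IN / counts_per_wheel_rev
print(round(in_per_count, 5))  # a few ten-thousandths of an inch per count
```

With the gearbox multiplying the counts, even a modest encoder gives absurdly fine resolution at the wheel, so raw PPR is probably not the limiting factor.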
For ~$140, I think I’ll take the risk and see how well they work.
Running the wheelchair robot in autonomous mode has been a lot of fun. Seeing it come back to waypoints pretty much dead nuts to the same spot is very satisfying. So I’ve become a little complacent with keeping my hand on the “manual mode” button in mission planner in case the robot veers off toward something it shouldn’t.
And boy did I pay for it today. Remember that sprinkler well in my backyard? Well the robot seemed to remember it too, and I may be buying a new power switch for it this spring. I took my eyes off the robot for just a few seconds and boom, collision.
Normally this wouldn’t be too frustrating except for the fact that now the left motor doesn’t rotate at all. The right motor though still runs perfectly.
So I started troubleshooting the problem using the following process:
I made sure the motor still works. I connected the motor wires to the battery terminals and it spooled up just fine. No problem.
I checked continuity across the motor wires to the relays and Sabertooth motor controller. Everything is connected. No problem.
I checked the relays to make sure they weren’t broken, they function as desired. No problem.
I checked the fuses on the motors. Not blown. No problem.
I swapped the S1 and S2 wires on the Sabertooth, thinking this would eliminate the Sabertooth as the issue if the left motor suddenly worked and the right motor didn’t. No change: the right motor still responded to RC input (although in a funky way, because S1 and S2 were switched), and the left motor was still unresponsive. Might be the Sabertooth.
I removed the Pixhawk from the equation by hooking the Sabertooth up to RC input from the receiver directly. Same results, left motor still unresponsive, right motor works.
I also checked continuity across the DB15 cable between enclosures. Everything seems to be connected. No problem.
So from my cursory testing, the Sabertooth seemed to have been bricked, at least on the left motor side. I pulled it from the enclosure and took it inside for bench testing.
But after some simple testing, both left and right motor outputs work when hooked up! So what could the issue be?
I’m thinking something got knocked loose during the collision. Nothing else makes sense. I’ll take everything apart and rebuild it just to make sure, but the lack of a smoking gun is somewhat worrisome.
I think I may have incorrectly estimated my power needs for the mower. A key assumption I’ve been making is that the motor will generally need to be capable of generating ~5ft-lbf of torque during maximum operation. I’m not sure this is really true though.
Do We Really Need 5ft-lbf of Torque?
The 5ft-lbf of torque figure comes from taking a typical gasoline push mower engine and looking at the gross torque output of the engine. But one variable I forgot to consider is that the torque curves I looked at are associated with an engine typically used with an 18in to 21in blade. Our mower uses a 12in blade.
Intuitively, the torque we need to cut through grass is going to be positively correlated to the amount of grass we’re trying to cut at once. So a smaller cutting blade should require less torque than a larger blade. There’s less grass for the blade to run into, sapping momentum from the rotating blade.
I have no idea what the relationship between blade length to required torque looks like. I am going to assume it is linear for simplicity, but I have no clue if this is a good assumption. The torque you need is also going to be related to the quantity of grass clippings circulating around under the deck impacting the blade. Good luck modeling that.
Given the smaller blade size, let’s say you only need 60% of that 5ft-lbf torque value, so 3ft-lbf or 2.2N-m of torque. That’s the ratio between a 20in blade and a 12in blade.
How Much Current Does the Motor Draw at 3ft-lbf of Torque?
The performance curves for the E30-400 motor say that the motor consumes 56A of current at 3ft-lbf of torque. I think this is a more accurate number for current draw from the motor.
How Much Power Does the Motor Consume at 3ft-lbf of Torque?
Another mistake I made was pulling power numbers off this chart thinking they were power supplied to the motor, not shaft power output by the motor.
This is an important distinction, because no motor is 100% efficient. The input power should be the power supply voltage of 24V times the current consumed at a given point on the curves. At 3ft-lbf of torque, it’s (56A)(24V) = 1344W.
This jibes with the chart above, because shaft output power at 3ft-lbf or 2.2N-m of torque is about 1040W. That would imply an efficiency of (1040W)/(1344W) = 77%. The chart says the motor is about 75% efficient at this torque, pretty close to this estimate.
So under maximum operating conditions, each motor should consume 56A of current and 1344W of power. The three motors collectively consume 168A of current and 4032W of power.
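All of the arithmetic above fits in a few lines. The 56A and 1040W figures are read off the E30-400 performance curves, and the linear blade-length scaling is my own shaky assumption:

```python
# Torque estimate: scale the 20in gas mower's torque down linearly to a 12in blade.
baseline_torque = 5.0                        # ft-lbf, from a typical push mower engine
scaled_torque = baseline_torque * 12 / 20    # 3.0 ft-lbf for the smaller blade

# Current and power at that torque, per motor.
current_a = 56.0                             # amps at 3 ft-lbf, from the motor curves
input_power_w = current_a * 24.0             # 1344 W electrical input
shaft_power_w = 1040.0                       # W mechanical output, from the curves
efficiency = shaft_power_w / input_power_w   # ~0.77, vs ~0.75 on the chart

print(scaled_torque, input_power_w, round(efficiency, 2))
print(3 * current_a, 3 * input_power_w)      # all three deck motors: 168 A, 4032 W
```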
Is That a Good or a Bad Number?
The 168A number is acceptable because it is right at the limit of what the Mauch current sensor can handle. It’s rated for 200A of current and that leaves us 32A of current for drive motors and miscellaneous control electronics, which should be enough.
So assuming each of our three mower deck motors consumes 56A, each drive motor consumes 12A, and our control electronics consume 5A of current, you could have a maximum of 197A drawn from the batteries. Very little margin, but I think it should be okay because…
Maximum Versus Typical
One additional thing I’d like to mention is that I think these are maximum power consumption numbers. Previously I referred to them as typical power consumption numbers.
Do you need 3ft-lbf of torque while mowing the entire time? I doubt it. The calculations above show that if our electric motors need to operate at 3ft-lbf of torque, they should be able to do it. Operating at 3ft-lbf of torque drops the rotation speed down to 4500RPM, which results in a blade tip speed of 14,100ft/min. That’s a little lower than I’d like, but it should work.
Run Time Recalculated
Turns out I also miscalculated how battery charge adds when batteries are connected in parallel versus series. In series, battery voltage adds. In parallel, charge (your amp hours) add. Previously I assumed your total charge is the sum of each individual battery charge.
Since we have two sets of batteries connected in series, and then in parallel, our equivalent battery is 24V, 70Ah. This makes sense because I think the Ryobi lawn mower is advertised at 24V, 70Ah too. It’s the same battery set up, apparently.
If we were to run all three deck mowers with a load of 3ft-lbf torque on them, it would take (70Ah)/(197A) = 21 minutes to completely drain our batteries (again, assuming that’s even possible to do, in reality it isn’t).
At half this torque value, total current consumption would be 28A for each deck motor, resulting in 113A total. That results in (70Ah)/(113A) = 37 minutes of run time. The E30-400 motor consumes 29A of current at peak efficiency, so I’m hoping that I’ve sized these motors for the sweet spot of their performance.
If you were to bump up the battery size used on the mower to four 50Ah batteries, run time would be (100Ah)/(113A) = 53 minutes. Doing this would add 34lb to the mower, which would show up in the current consumed by the drive motors.
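Here are those run-time estimates in one place. Note these are idealized, best-case numbers; a real SLA battery delivers less than its rated capacity at high discharge rates (the Peukert effect), which is part of why fully draining them like this isn't realistic:

```python
def runtime_min(capacity_ah, draw_a):
    """Idealized run time: rated charge divided by total current draw."""
    return 60 * capacity_ah / draw_a

print(round(runtime_min(70, 197)))   # 21 min at full 3 ft-lbf load (197 A total)
print(round(runtime_min(70, 113)))   # 37 min at half torque (113 A total)
print(round(runtime_min(100, 113)))  # 53 min with four 50Ah batteries instead
```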
Even though I have more confidence in these numbers being correct, they’re still disappointing. I would like to shoot for a minimum 2 hours of run time. The only two ways I can think of to get there:
More efficient motors and electronics.
Using BLDC motors would increase our efficiency, but they cost 4 times the brushed DC motors I intend to use. Reduced run time is an acceptable trade off to save $700 in my opinion.
More battery capacity.
Larger SLA batteries start getting pretty ridiculous beyond the four 35Ah’s I’m using currently. The battery bay has to grow to accommodate the larger batteries, and that pushes the wheels out, increasing the wheel base and negatively affecting vehicle performance.
Additionally, the added weight makes me wonder if the 0.125in sheet metal battery bays are sufficient to support the weight of the batteries. Two 50Ah SLA batteries weigh 64lb. I’d probably want to reinforce it just to make sure.
We could switch to some Lithium Ion batteries, but here the cost is at least as bad as switching to BLDC motors.
If you were to make a composite battery out of 18650 cells equivalent to the four 35Ah SLAs I’m using currently, it would cost just shy of $1,000 in 18650 cells alone. And that doesn’t even include labor to build the battery and a fancy charging system to go with it.
I found some guys that make custom 18650 batteries, and maybe they can do it for cheaper. I’m starting to understand why Teslas use lithium ion technology. If you need a boat load of power and have any kind of space or weight constraint, you kind of have to. Unfortunately, I drive a 2003 Honda Accord, not a Tesla Model X, and so the mower project can’t afford some legit lithium ions.
I may have to get used to about 30 minutes of run time.
I decided to make the autonomous lawn mower fully electric for one big reason: If a person has to walk out to the mower with a gas can and refill the tank, is it really autonomous?
Ideally, you want the mower to do its job without any human intervention. If you have a gas engine, no matter how you cut it, fuel has to be delivered to the mower in some fashion. With an electric design, you can have the mower automatically dock with a charging station when the battery gets low. No human required.
So from the get-go I have been trying hard to make the mower electric. I am encouraged by some electric riding mowers out there that use SLA batteries as their power supply. I like SLA batteries because they contain a lot of energy and are fairly cheap. Minimizing battery weight and volume isn’t a huge constraint for this project, thankfully.
Because these electric riding mowers cut grass and carry a ~200lb person, I have been operating under the assumption that as long as our batteries have a larger capacity than theirs, we should be okay. The Ryobi mower I looked at features a battery bank of four 12V, 25Ah SLA batteries.
I am beginning to question that assumption…
Sizing the batteries ultimately depends on how much power the mower needs. The deck motors take the lion’s share of power consumption. Previously I estimated the mower would require motors that can output at least 5ft-lbf of torque to cut through thick grass based on typical gas engine torque output.
Examining the torque curves for the E30-400 motor I selected for our design shows that at 5ft-lbf (6.8N-m) of torque, the motor consumes 1400W of power. If you assume all three motors pull this much power, the deck motors collectively consume 4200W.
The drive motors I'm using on the mower design are taken from the wheelchair. I suspect they are rated for 500W each, but I am not sure. The gearbox on them ensures they will generally be operating in an efficient region of their torque curves, so I am going to assume each motor consumes 250W, or 500W collectively between the two motors.
The control electronics are almost negligible compared to the power consumed by the motors, but I will budget 100W for all the other little things on the mower, just to be safe.
That brings the total estimated power the mower needs during operation to 4200W + 500W + 100W = 4800W.
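The power budget above is simple enough to tally in a few lines:

```python
# Tallying the estimated power budget from the figures above.
deck_w = 3 * 1400      # three E30-400 deck motors at full cutting torque
drive_w = 2 * 250      # two wheelchair drive motors, assumed 250W each
electronics_w = 100    # budget for control electronics
total_w = deck_w + drive_w + electronics_w
print(total_w)  # → 4800
```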
The batteries I've selected are four 12V, 35Ah SLA batteries. If you assume we intend to discharge these batteries 100% (and that doing so were physically possible), you could obtain (4)(12V)(35Ah) = 1680Wh of energy. If we were to draw 4800W of power from these batteries, we would drain them in (1680Wh)/(4800W) = 21 minutes. Yikes.
But it gets worse. Because we're pulling so much power out of these batteries, you have to discount the total amount of energy you can get out of them. I'm not entirely sure what that calculation looks like, but the SP12-35 datasheet suggests a 1hr discharge rate only lets you get 21.8Ah of charge out of each battery. That's only about 60% of the 35Ah you get at the 20hr rate. I could be wrong about this interpretation of the datasheet, please correct me if I am mistaken.
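From what I can tell, this kind of derating is usually modeled with Peukert's law, which says discharge time scales as t = H·(C/(I·H))^k for a battery rated C amp-hours at the H-hour rate. Fitting the exponent k to the two datasheet points above (my interpretation of the datasheet, so take this with a grain of salt):

```python
import math

# Peukert's law: t = H * (C / (I * H))**k, where C is the capacity at
# the H-hour rate, I is the discharge current, and k is the Peukert
# exponent. Fitting k to the two SP12-35 datasheet points above:
H, C = 20, 35       # 20-hour rate: 35Ah (a 1.75A draw for 20 hours)
I_1hr = 21.8        # current that empties the battery in 1 hour
k = math.log(H) / math.log(I_1hr * H / C)
print(f"k = {k:.2f}")  # → k = 1.19
```

A k around 1.2 is in the typical range quoted for SLA chemistry, which makes me think the datasheet numbers are self-consistent even if my reading of them is off.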
Do the motors really draw that much power? Holy moly, I hope not. At their most efficient, the deck motors draw about 500W each. Running the calculations above with that number (1500W for the deck, 2100W total) gives a run time of 48 minutes. Still not great.
The reality is somewhere between those two extremes. Taking the average of the two gives 35 minutes of run time. I was hoping for something more in the neighborhood of 2 or 3 hours. Going up to some 12V, 50Ah batteries could give us some extra oomph, but I don’t think it will be 3 hours of oomph.
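The two extremes and their average can be bracketed like so. The ~500W-per-deck-motor best case is my reading of the efficiency point on the torque curve, and I'm again assuming 100% usable discharge:

```python
# Bracketing run time between the worst case (full cutting torque) and
# the best case (deck motors near peak efficiency, ~500W each), using
# the 1680Wh figure from above and assuming 100% usable discharge.
energy_wh = 4 * 12 * 35               # 1680Wh
worst_w = 3 * 1400 + 2 * 250 + 100    # 4800W
best_w = 3 * 500 + 2 * 250 + 100      # 2100W
worst_min = energy_wh / worst_w * 60  # ≈ 21 minutes
best_min = energy_wh / best_w * 60    # ≈ 48 minutes
print(f"{worst_min:.0f}-{best_min:.0f} min, "
      f"avg {(worst_min + best_min) / 2:.1f} min")
```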
Please let me know if these numbers seem way off base; they're the best swag I can come up with. The last thing I want is a mower that can only cut grass for 10 minutes.