If you look closely at the charts in the previous post, you’ll notice they end quite abruptly. The picture above shows why.
I have the robot mower configured so the blades only turn on when you hold the rudder stick to the right. The stick is spring loaded, so letting go of it returns it to center, deactivating the blades.
The field I was testing in had several large shrubs in it. One of my goals in the field was to push the envelope on what the mower was capable of cutting through, so I let it mow over several of them in autonomous mode. However, there was a particularly large shrub in the robot’s path, and so I decided to turn the mower blades off by releasing the rudder stick.
As expected, the blades stopped. But the robot stopped moving, too. This was curious: the robot should have continued driving. So I pressed the emergency stop button on the robot to immobilize it, and walked over to the truck where I had Mission Planner running on my laptop.
The Mission Planner screen was frozen as if it wasn’t receiving telemetry from the robot. It took me a few minutes to realize that the issue wasn’t Mission Planner or my laptop radio: it was that the robot wasn’t sending telemetry. And even more curiously, the voltage reading on my SLA batteries said 12.50V. I was worried that I’d really fried my batteries, so I walked back over to the robot to investigate.
Opening the battery bay I found a slight amount of smoke and that awful burnt plastic smell from melted wire insulation. But other than that everything seemed fine. There wasn’t a smoldering fire inside the robot. The batteries weren’t hot to the touch.
At first I chalked it up to too much current running through the wires. The robot uses 120A at peak current, and at those levels even a small amount of resistance could cause the wires to heat up a lot. Perhaps the wires just got hot enough to melt the insulation and came into contact right at the moment I decided to shut the mower blades off?
This just-so story didn’t sit well with me. What an awful coincidence that the wires failed at exactly the moment I was shutting off the motors, especially after the robot had run fine for five minutes prior. Why didn’t they fail earlier?
Additionally, why didn’t the exposed wire conductors fuse together after they came into contact? I’ve heard of people using SLA batteries smaller than mine to spot weld 18650 battery tabs together. The wires were really close together, but weren’t touching when I opened the battery bay. And the wire strands don’t seem to be melted judging from the picture above. Which was extraordinarily lucky, given how bad that could have been.
I did have enough sense to put a 100A fuse on my batteries. It seemed logical that the fuse was what saved my bacon. But testing the fuse with a multimeter revealed that it was not, in fact, blown. At this point I was completely bewildered. I thanked the good Lord that the robot hadn’t turned into a massive lead acid fireball, made sure all my batteries were disconnected, packed up, and went home.
I’m sure readers are ready to scold me for all of the janky things going on in my battery bay, and I certainly deserve that criticism. My mantra is make it robust, and I definitely did not live up to that standard with my wiring. Below is a full picture of the battery bay.
I have my own list of janky things in here that I intend to fix. What do you see that is janky? Feel free to comment below!
In the next post, I’ll explain what I think went wrong, and the improvements I’m going to make to mitigate the problem and eliminate all the redneck wiring I’ve got going on.
I didn’t spend much time in the field testing the robot mower, but the little time I did spend gave me a treasure trove of information about how the robot mower performs. Below are some high-level measures.
I set the robot cruise speed to 1m/s or 2.2mph. I’ve run the robot mower faster in manual mode, but for obvious safety reasons, a moderate walking pace seemed best. If the robot gets out of control, I want to be able to catch it.
I am using a Mauch PL-200-HV current sensor on the robot. The sensor is connected to the positive terminal on my batteries, so it reports the total current used by the robot: sensors, drive wheels, mower deck motors, etc. Once the mower is spooled up, current consumption levels out at about 90A.
The first portion of the graph above is the robot mower operating autonomously without the deck motors turned on. The huge spike is when the deck motors get turned on. It’s curious it says the peak is 196A: the PL-200-HV is only rated for 200A. I wonder if the spike was actually larger than shown, but due to its current rating, the sensor only reported ~200A.
There is a surprising variability to the amount of current used by the deck motors. In the garage, the deck motors would use 46A +/-1A. It was kind of impressive how steady current consumption was. But throw some grass under the robot and you get a ton more variability. Current consumption for the whole robot varies from 65A to 120A with the deck motors on.
Grass is typically pretty homogeneous, even out in the field, so this was unexpected. I went and reviewed the logs from my backyard, and they look about the same. Even controlling for pivot turns at the end of a pass, the current consumption is all over the place.
It does appear to dip a little with each turn, but only ~15A. Maybe I’m reading into it too much. There were some small shrubs and some pretty gnarly crab grass plants out in the field. The current variation might just be the mower running over those things.
I also wanted to see how much the battery voltage dropped when running the deck motors in a real-world situation.
I made sure all four of my batteries were individually fully charged before going to the field. But any way you cut it, pulling 90A from four 35Ah SLA batteries is pushing it. With the deck motors on, the battery voltage sags to 20V under load.
To get the total power consumed by the robot you need two things: current consumed and the voltage level when it was consumed. Mission Planner will do the math for you by multiplying the current and voltage at each time step and then adding them up for a total energy consumed graph.
Unfortunately, Mission Planner won’t differentiate that graph for you so you can see what the power consumption was at any given time. But if we assume the power consumption is linear over the five minute time interval above, we can estimate it.
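As a sketch of that math, here is how the integration and the reverse estimate could look in Python. The telemetry samples here are made up for illustration, not pulled from the real log:

```python
# Hypothetical telemetry: (voltage V, current A) pairs logged once per second.
# A real log would come from Mission Planner; these values are illustrative.
samples = [(24.0, 90.0)] * 300  # five minutes of a steady 90A draw at 24V

dt = 1.0  # seconds between samples
energy_j = sum(v * i * dt for v, i in samples)  # total energy consumed, joules
energy_wh = energy_j / 3600                     # same thing in watt-hours

# Assuming consumption is roughly linear over the interval, average power
# is just total energy divided by elapsed time.
avg_power_w = energy_j / (len(samples) * dt)

print(energy_wh, avg_power_w)  # 180.0 Wh, 2160.0 W
```

With a real log you would sum the actual per-sample products instead of a flat line, which is exactly what Mission Planner's cumulative graph does.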
This number is encouraging. I’ve seen the drive motors on typical electric riding mowers consume more than 1kW by themselves, excluding the deck motors. I need to do some more testing to see how efficient the robot mower is on less flat terrain. But as you’d expect, there appear to be a lot of power savings to be had by getting rid of the guy riding the mower.
Regarding the blade motor rotation speed, there are two ways I can think of to estimate this. One way is based on the current consumed by each motor. Take that number to the motor performance charts and you can estimate rotation speed.
The other is to see at what frequencies the mower is vibrating. Thanks to some new developments by the Ardupilot folks, this is a pretty straightforward task. If you can find a frequency with strong vibrations, you can assume that's probably the speed at which the motors are rotating.
The mower uses about 5A when running without the blade motors on. That means all three motors consume 85A, or each motor consumes 28A.
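That arithmetic is simple enough to spell out:

```python
# Per-motor current estimate from the whole-robot telemetry numbers above.
TOTAL_CURRENT_A = 90.0   # steady-state draw with the deck spooled up
BASELINE_A = 5.0         # drive motors and electronics, blades off
NUM_DECK_MOTORS = 3

deck_current_a = TOTAL_CURRENT_A - BASELINE_A    # 85A across three motors
per_motor_a = deck_current_a / NUM_DECK_MOTORS   # ~28.3A each
print(round(per_motor_a, 1))  # 28.3
```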
At 28A of current consumption, an E30-400 motor spins at 5100RPM, uses 540W of power, and achieves 1N·m (0.74ft·lb) of torque. One of the big question marks in the robot mower design has been how well the motors would perform. It appears they're just about perfect, operating right at the peak of their efficiency curve.
Speaking of efficiency, one thing I did notice about the deck motors was that even after a few minutes of operation, they were very hot to the touch. So hot I couldn’t keep my hand in contact without significant discomfort. Yes, I did try. I’m guessing they were north of 140°F.
Even if the motors are 79% efficient, that means that 21% of that 540W, or 113W, gets turned into waste heat. That’s a fair amount of heat to dissipate through a 5in long, 3in diameter cylinder by free convection. It might be worthwhile to come up with some kind of cooling fin to attach to the motor housing.
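Here is a back-of-the-envelope version of that heat estimate, plus a free-convection sanity check. The convection coefficient is a generic textbook value I'm assuming, not a measured one:

```python
import math

# Waste heat for one deck motor, using the figures above.
motor_power_w = 540.0
efficiency = 0.79                            # assumed from the motor chart
heat_w = motor_power_w * (1 - efficiency)    # ~113W of waste heat

# Motor housing approximated as a 5in long, 3in diameter cylinder.
length_m = 5 * 0.0254
dia_m = 3 * 0.0254
area_m2 = math.pi * dia_m * length_m         # lateral surface only

h = 10.0  # W/(m^2*K), a typical free-convection value for still air (assumed)
delta_t = heat_w / (h * area_m2)             # temperature rise needed to shed it
print(round(heat_w, 1), round(delta_t))      # 113.4 W, ~373 K rise
```

The absurdly large temperature rise says the bare housing can't dump that heat by free convection alone at sane temperatures, which supports the idea of adding cooling fins (or relying on the airflow the spinning blades provide).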
Ardupilot now has a feature that lets you analyze logs to see where vibration issues might exist. It looks at data recorded by your IMU and does a fast Fourier transform on the data to find frequencies at which you have high levels of vibration.
Using this method to estimate the motor rotation speed gives you several graphs that look like the one below.
My understanding is that this graph gives you the strength of vibration at certain frequencies. I am unsure of the units on the Y axis. I think they’re m/s, but that seems awfully high. But then again, the mower deck is basically a giant vibratory table. I have no reason to believe they’re not accurate.
One thing that is interesting to note is that the amplitude of vibration is high in the X and Y directions across the spectrum, but less so in the Z direction. I think X is the fore-to-aft direction and Y is the left-to-right direction on the robot. So high levels of vibration along those axes make sense when you think about a giant unbalanced motor rotating about an axis perpendicular to those two directions.
I'm assuming that all three motors are spinning at close to the same speed, but in reality, I'm sure they're each off by some amount. I'm not sure how that affects the accuracy of this method of estimating the mower blade speed. Regardless, 4680RPM jibes with the estimates obtained from the motor chart. The truth is probably somewhere in the middle, perhaps 4800RPM.
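To make the method concrete, here is a toy version of that FFT analysis with a synthetic vibration signal standing in for real IMU data. The sample rate and the motor speed baked into the signal are assumptions for the example:

```python
import numpy as np

# Minimal sketch of what Ardupilot's FFT log analysis does: find the
# dominant vibration frequency in accelerometer data and read it as the
# blade-motor rotation speed.
SAMPLE_RATE = 1000          # Hz, hypothetical IMU logging rate
MOTOR_RPM = 4680            # the speed we're pretending the motors run at
freq_hz = MOTOR_RPM / 60.0  # 78Hz fundamental from an unbalanced rotor

t = np.arange(0, 2.0, 1.0 / SAMPLE_RATE)
accel = np.sin(2 * np.pi * freq_hz * t) + 0.1 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, 1.0 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

print(round(peak_hz * 60))  # estimated RPM, ~4680
```

On real logs you'd feed in the recorded IMU samples instead of a synthetic sine, and the strongest peak (plus its harmonics) is your candidate blade speed.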
15000ft/min is well below the 18000ft/min limit that manufacturers abide by. But judging from the good cut quality in the field, it might be possible to run the motors slower. This would reduce the power we need to run the robot. AmpFlow makes a motor that is otherwise identical to the E30-400 but a little shorter and less power-hungry. I might see if it would make sense to use it instead.
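For reference, tip speed is just rotation speed times the circle the blade tip sweeps. A quick check, assuming a 12in tip-to-tip blade (the real blade size may differ):

```python
import math

# Blade tip speed = RPM x pi x swept diameter. The 12in (1ft) diameter
# is an assumption chosen for illustration; plug in the real blade size.
rpm = 4800
blade_dia_ft = 1.0

tip_speed_fpm = rpm * math.pi * blade_dia_ft
print(round(tip_speed_fpm))  # ~15080 ft/min, under the 18000 ft/min limit
```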
I made some setting changes to the RTK GNSS modules based on some great comments I received a few posts back. I outlined those changes in this post. After that post, I made one additional change to the CFG-NAVSPG-DYNMODEL configuration item. I set it to 3 = Pedestrian.
The graph above shows the RTK fix status. The changes I made in that post resulted in a rock solid RTK fix. The little dips in RTK status were less than a second in duration. Thank you to Sean Smith and Frank Beesly for their valuable feedback!
The magnetometer performance was more or less garbage. I think a good portion of my cross track error is due to the compass variance corrupting an otherwise good position solution given by my wheel encoders and RTK GNSS.
Reviewing the telemetry logs, before I was even in autonomous mode, the EKF was reporting a marginal compass health reading. And the chart below tells you all you need to know about the rest of the compass performance.
The reported magnetic field strength jumps by 0.15 Gauss after you turn the deck motors on. It might be possible to zero this out in the software by finding the offsets when the deck motors are on, but I have so little confidence in the magnetometer at this point, it’s not worth the effort in my opinion. RTK GNSS yaw seems like a much more robust solution for heading.
Keen observers will notice something peculiar at the 17:16 mark on these graphs. More on that in the next post.
I started drawing up what the robot lawn mower would look like if we didn’t care about separating the compass from the electric motors. Removing this constraint allows several design efficiencies, some of which I was not expecting.
I decided to use two battery bays on version 5 because I had to mount a mast smack dab in the middle of the chassis. I didn’t want to mount it on a removable lid because it would be cumbersome to remove to get access to the batteries. Instead, I put hinged doors on both bays.
It looks neat in the picture above, but what you don't see is all the wires running through my chassis tubes between bays to connect the batteries, and all the signal wires running through the mast weldment up to the control enclosure. It started getting ridiculous drawing all of that up.
A separate design constraint I’ve been trying to achieve is to keep the wheel base of the robot to a minimum for handling reasons. Unfortunately, the mower deck design I settled on has a motor in the middle of the deck that is a pain to locate such that it doesn’t interfere with the battery bays.
Because I was splitting the battery bays anyway, I positioned the mower deck the way you see in the version 5 picture above. One downside to doing this is that the mower deck is pointing backward from what you see on virtually every riding mower.
Combining the battery bays let me rotate the mower deck 180°. In the back of my mind I have been worrying the backward orientation of the mower deck might cause performance issues. Now we won't have to find out.
Additionally, both power and control enclosures can be mounted directly to the battery bay, which will drastically shorten the wire runs I’ll have to make. I’m actually excited to start drawing wires again. Things aren’t so claustrophobic anymore.
And on top of all the benefits above, the chassis weldment went from having 28 total parts to 15. Not too bad!
Batteries are often advertised with a nominal voltage and nominal charge. It’s tempting to take those numbers at face value and assume that regardless of operating circumstances, a battery will sit at its nominal voltage and run until its nominal charge has been depleted. Reality is more complicated than that.
Christopher Milner reminded me of this in a brief conversation this week. He is using thirteen 3.2V, 100Ah LiFePO4 batteries in series to power the deck motor on his mowing rig. Stringing them together in series gets you close to 48V in their charged state. He reports getting about 3 hours of runtime while powering a motor that consumes 1300W.
This surprises me because my own calculations suggested four 12V, 35Ah lead acid batteries should be sufficient. Christopher obviously knows what he is doing and I trust his experimental results much more than I do my own back of the envelope calculations. The discrepancy means I need to reevaluate my numbers.
When making my calculations I did not take battery discharge rates into account. This is a big mistake, as I’ll show below. But first, for my own educational benefit, I’d like to introduce you to the C-rate, a way to quantify how fast you pull current out of a battery.
What is a battery’s C-rate? Our good friends at Wikipedia define it as:
The C-rate is defined as the charge or discharge current divided by the battery’s capacity to store an electrical charge.
In layman's terms, the C-rate is just the ratio of an arbitrary discharge rate to the battery's charge capacity. It's a simple way to describe how much current you're demanding from a battery relative to the battery's total stored charge. If you've got a battery rated for 10Ah and you discharge it at 5A, the C-rate would be 5A ÷ 10Ah = 0.5C.
It's kind of a janky way of defining things in my opinion. A C is really an inverse hour; stated differently, the unit of measure for a C-rate is h⁻¹. Also, don't confuse the C-rate with coulombs, the unit of measure for charge. Clear as mud?
The reason I write all of this is so we’re all on the same page about what a C-rate is. If I’m wrong, please comment below, because I am going to proceed with this understanding of the C-rate going forward.
The C-rate is a useful measure because the total amount of charge you can extract from a battery depends on how fast you take it out. If you try to pull 100A out of a 10Ah battery, you’re not going to get nearly as much charge out of the battery as you would if you discharged it at 1A.
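In code, the definition is a one-liner — note how the units shake out to inverse hours:

```python
def c_rate(current_a, capacity_ah):
    """C-rate = discharge current / capacity. Amps / amp-hours = 1/hours."""
    return current_a / capacity_ah

print(c_rate(5, 10))    # the 10Ah battery example above: 0.5C
print(c_rate(100, 10))  # pulling 100A from that same battery: 10.0C
```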
Additionally, when you demand a large amount of current from a battery, your voltage starts dropping fast. This is important because electrical power, which is ultimately what we’re after, is voltage times current. If your voltage drops you get less power, even if you’re withdrawing the same amount of current from the battery.
Your batteries are going to be much happier and live much longer if you stick with a reasonable C-rate. Having said all of that, how do the four 12V, 35Ah batteries I selected stack up against the expected current draw they’ll experience?
Our SLA Batteries Reevaluated
Previously I estimated the total current consumption of our robot at 197A. I think this is a conservative number, but I’m going to roll with it anyway. The number reflects current consumed by the three deck motors, two drive motors, and various electronics at their worst case scenarios.
I’m using two sets of two 12V, 35Ah batteries wired in parallel, then in series to get an equivalent battery that’s 24V, 70Ah. The discharge rate for one battery in this configuration is half the total current consumption because we have two sets of two batteries wired in parallel. This means that one battery is discharged at a rate of 98.5A.
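The series/parallel bookkeeping, spelled out:

```python
# Two parallel pairs of 12V, 35Ah SLA batteries, wired in series (2s2p).
BATT_V, BATT_AH = 12.0, 35.0
SERIES, PARALLEL = 2, 2

pack_v = BATT_V * SERIES            # 24V equivalent battery
pack_ah = BATT_AH * PARALLEL        # 70Ah equivalent capacity

total_draw_a = 197.0                # worst-case estimate from earlier
per_battery_a = total_draw_a / PARALLEL   # parallel strings split the current
per_battery_c = per_battery_a / BATT_AH   # C-rate seen by each battery

print(pack_v, pack_ah, per_battery_a, round(per_battery_c, 2))
# 24.0 70.0 98.5 2.81
```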
The datasheet for one of these batteries shows the following chart, with various C-rates:
At a discharge rate of 98.5A, our C-rate is 2.81C. Looking at the chart above, that would mean our batteries will last for ~8 minutes. Yikes. And realistically, the 2.81C curve may extend to about 8 minutes, but the voltage drops off so fast after about 4 minutes that you’ll probably start noticing performance problems very quickly.
Also interesting to note from the chart above is that the one-hour discharge rate is in fact not 35A, as you'd expect for a 35Ah battery. It's actually 0.628C, which works out to 21.98A. To truly extract the full 35Ah of charge from the battery, you can only discharge it at a C-rate of 0.05C, or 1.75A.
Having written all of this, I wonder why manufacturers don't just label the curves with specific discharge rates in amps. The chart above would be completely unambiguous if it showed curves for 105A, 70A, 35A, 21.98A, etc. The conversion is tedious, and honestly, if you don't truly understand C-rates, you leave with the impression that these SLA batteries are much more capable than they truly are.
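One standard way to model this capacity derating is Peukert's law: runtime t = H·(C/(I·H))^k, where H is the rating basis in hours, C the rated capacity, I the discharge current, and k the Peukert exponent. The exponent and the 20-hour rating basis below are typical SLA assumptions on my part, not numbers from this battery's datasheet:

```python
# Peukert's law: a first-order runtime estimate at high discharge rates.
# k and the 20-hour rating basis are assumed typical SLA values.
def peukert_runtime_h(capacity_ah, rated_hours, current_a, k):
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

t = peukert_runtime_h(capacity_ah=35, rated_hours=20, current_a=98.5, k=1.2)
print(round(t * 60, 1))  # runtime in minutes at 98.5A per battery
```

With these assumptions the estimate lands around nine to ten minutes, in the same ballpark as the ~8 minutes read off the datasheet curve, which is reassuring for a model this crude.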
Looks like we will need to use some lithium batteries after all. Thanks for saving me $400 on some batteries that wouldn’t have worked Christopher!
What would a composite lithium ion battery look like if we were to use one for the mower?
We need a nominal voltage of 24V, so if you string seven 18650 cells with a nominal voltage of 3.6V together in series, you'd end up with 25.2V, which should work fine.
Connect 15 of those 7-cell series strings in parallel and you get a battery with 45Ah of charge. Not too shabby. This would be a 7s15p battery in lithium ion parlance, I think.
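Pack math for one of those batteries, assuming 3.6V/3Ah cells (the 3Ah figure is inferred from the 45Ah total; real 18650 capacity varies by model):

```python
# Spec sketch for one 7s15p pack: series strings set voltage,
# parallel strings set capacity.
SERIES, PARALLEL = 7, 15
CELL_V, CELL_AH = 3.6, 3.0   # assumed nominal cell figures

pack_v = SERIES * CELL_V      # 25.2V nominal
pack_ah = PARALLEL * CELL_AH  # 45Ah
cells = SERIES * PARALLEL     # 105 cells per pack
print(round(pack_v, 1), pack_ah, cells)  # 25.2 45.0 105
```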
Two of those 7s15p batteries would fit in each of our two battery bays. You’d need a total of four to achieve the numbers shown above. They’d fit in our robot something like this:
The weight savings here are significant: 70lb. This is to be expected, but I’m surprised it’s this high. I don’t have a truck to haul this robot around in, so the lighter I can make it, the easier it will be to take it out for field testing. The reduced weight should also decrease the power consumption from the robot’s drive motors, making it more efficient, and possibly more agile.
The cost is still going to be over $1,000 though. Talking with some suppliers, I think the 18650 cells cost a little less than what Amazon will quote you, probably closer to $4.50 per cell. But you still need some fancy charging equipment, and there will be labor and material cost for building up the battery and coming up with a way to secure them in the battery bay.
I may start out using SLA batteries because the monetary risk is pretty low, about $300. They may perform better than I’m expecting. If they don’t, the lithium ion batteries appear to be a good plan B if we need more run time.