Since the last update, I’ve been quite busy with other things and have hardly had time to work on the robot. However, in the few spare days I had, I managed to make quite a few changes, test out different SLAM methods, and start setting up the ROS Navigation Stack. I’ll start with the hardware changes. I tried out two alternatives to my battery and finally replaced it with a mini UPS system. These photos show my original battery and the alternatives I tried:

The original battery, which I bought from AliExpress. It is meant for bicycle lights and for charging phones, but I used its 12V DC output to run both the motors and the RPi, through separate power regulators.
This battery runs on 6x 18650 cells. The cells are from reputable brands and work fine individually. However, after a few charge/discharge cycles, the power board (which is embedded in the cap) stopped working. Not surprised.
My first alternative: NP-F batteries. These are NP-F750 batteries with adapters that provide 8V and 12V DC outputs. I first saw them in JetsonHacks' amazing video about these batteries and how they can be used to power Jetson devices. In parallel, and using the same power regulators as before, they could run the entire system, and they're also hot-swappable. Definitely a win! But unfortunately, they don't fit into the AKROS chassis, so I'm keeping them for another project (or a future iteration of this one).
Finally, this mini UPS I bought from Amazon. It is rated at 36Wh and provides one 5V/2A output, one 12V/1A output, and one 12V/2A output. These three outputs power the motor driver, the power-hungry RPi4, and all the sensors/peripherals. It's also got a very handy on/off button and a useful battery level indicator.
This UPS is an even bigger win than the NP-F batteries, because I can charge the robot while it's running. When the battery backup is low, I can simply plug the charger in without having to disconnect the RPi4. So I don't have to frantically run to save the generated maps; I can plug it in and take my time.

Once the battery was replaced, I drove the robot around to test its capacity. I expected it to last about an hour (double that of my original battery), but it only lasts between 40-45 minutes, which is still much better than the original battery system. The new UPS is also almost half the weight of the original battery and takes up less space; this makes the robot lighter and more compact, since I could reduce the spacer heights as well.
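As a sanity check on that runtime, the 36Wh rating and the observed 40-45 minutes imply the whole system draws roughly 48-54W on average. A quick back-of-the-envelope calculation (the runtimes are measured; the wattage is derived, not measured):

```python
# Estimate average power draw from battery capacity and observed runtime.
CAPACITY_WH = 36.0  # UPS rating from the product listing

def avg_power_watts(capacity_wh: float, runtime_min: float) -> float:
    """Average draw implied by draining capacity_wh in runtime_min minutes."""
    return capacity_wh / (runtime_min / 60.0)

print(avg_power_watts(CAPACITY_WH, 45))  # 48.0 W at the 45-minute end
print(avg_power_watts(CAPACITY_WH, 40))  # 54.0 W at the 40-minute end
```

In practice, the usable capacity is lower than the rating (converter losses, cutoff voltage), so the true average draw is probably somewhat less than this.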

While driving the robot around, I made sure to also run SLAM algorithms in the background: firstly, to test the battery under full load, and secondly, to get some nice maps of my studio. I tried both gmapping and hector_slam, and played around with the parameters until I got acceptable maps from each method. However, one method was the clear winner:

Map created using Hector SLAM. This took quite some parameter tuning, and the resulting map is not bad, but it needs to be post-processed before it can be used for localization. One advantage Hector SLAM has over GMapping is that it can build maps without an odometry source. Since I do have a pretty good odometry source, though, I'll keep this method for a project without one...
Map created using GMapping, with odometry provided by the T265 camera. It is pretty much perfect, and definitely the winner over Hector SLAM.

Next, I saved these maps using map_saver and then launched the AMCL localization node with each map. I tested the AMCL node in two ways: first, by driving the robot manually and comparing the laser scans against the map; then, by moving some things around and driving the robot manually again.
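For reference, map_saver writes the occupancy grid as a PGM image plus a YAML metadata file (resolution in metres per cell, origin, occupancy thresholds). A minimal sketch of sanity-checking the map extent from that metadata; the file name, 384x384 grid size, and 0.05 m/cell resolution below are illustrative values, not my actual map:

```python
# Metadata fields as written by map_saver (values here are hypothetical).
map_meta = {
    "image": "studio_map.pgm",
    "resolution": 0.05,          # metres per cell
    "origin": [-9.6, -9.6, 0.0], # [x, y, yaw] of the lower-left corner
    "occupied_thresh": 0.65,
    "free_thresh": 0.196,
}

def map_extent_m(width_px: int, height_px: int, resolution: float) -> tuple:
    """Real-world size of the occupancy grid in metres."""
    return (width_px * resolution, height_px * resolution)

# A 384x384 grid at 0.05 m/cell covers roughly a 19.2 m square.
print(map_extent_m(384, 384, map_meta["resolution"]))
```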

The AKROS robot being tested in my studio, driven manually using a PS4 controller.
Foxglove visualization of the experiments. I've chosen to view compressed images from one of the T265's two fisheye lenses for debugging. The errors in the debug panel show when the Arduino lost its connection with the ROS master. This is a common problem that I haven't solved yet; fortunately, the Arduino reconnects automatically, and I've written recovery functions into the Arduino code.

Finally, I drove the robot to a random location, waited for it to localize correctly, then picked the robot up and placed it at another position. I repeated this multiple times, and also moved/rotated the robot while carrying it. I wanted to test whether the Intel T265’s (RIP) 3D tracking could allow AMCL to localize correctly even when it received incorrect laser data… and it does! On average, across these three tests, the localization was accurate to within ~10 cm. It should be noted, however, that I used the map created by GMapping (the better one), was driving around in a very small space (~30 m²), and the robot was never in transit for more than a few seconds (odometry/IMU drift would have taken longer to accumulate). This last test took only about 2 minutes to perform; here’s a video that shows what I did:
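To put a number like ~10 cm on localization accuracy, you can compare the estimated pose against a reference pose. A minimal sketch, assuming 2D poses as (x, y, yaw) tuples; the helper below is my own, not something from the AMCL stack:

```python
import math

def pose_error(est, ref):
    """Translation error (m) and absolute yaw error (rad) between 2D poses."""
    dx, dy = est[0] - ref[0], est[1] - ref[1]
    dyaw = est[2] - ref[2]
    # Normalize the yaw difference to (-pi, pi] before taking its magnitude.
    dyaw = math.atan2(math.sin(dyaw), math.cos(dyaw))
    return math.hypot(dx, dy), abs(dyaw)

# Example: ~0.1 m translation error, within the ~10 cm observed above.
trans_err, yaw_err = pose_error((1.06, 1.08, 0.05), (1.0, 1.0, 0.0))
print(trans_err, yaw_err)
```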

90 second localization experiment

For now, I’m more than happy with the results, although I hope to improve them even further once I implement odometry using the motor encoders. My next step is to play with Steven Macenski’s slam_toolbox, and then to focus on tuning the move_base parameters (I already have the ROS package set up). I also plan on cleaning up my launch files and directory structure (I currently have all my launch files in a single ROS package), using the Robotis TurtleBot3 repository on GitHub as a reference. This week, my focus is on slam_toolbox, and hopefully I can write another blog post about it this weekend.