30 Days of Python: Day 29 Visualizing Particle Filters

I’m making a small project every day in Python for the next 30 days (minus some vacation days). I’m hoping to learn many new packages and make a wide variety of projects, including games, computer tools, machine learning, and maybe some science. It should be a good variety and I think it will be a lot of fun.

Day 29: Visualizing Particle Filters for Robot Localization

Back in the spring I took Udacity’s Artificial Intelligence for Robotics class. It was a great class that covered a wide range of topics, all centered around the algorithms that go into an autonomous car (the instructor, Sebastian Thrun, is the guy behind Google’s and Stanford’s self-driving cars). I really enjoyed that the class had Python-based homework assignments. One assignment was to program a particle filter to perform localization for a robot. A particle filter uses several hundred “particles” to model a robot’s possible motion, and the average particle position acts as an estimate of the robot’s actual position, at least as long as the filter doesn’t diverge. Because the course had us programming in a web interface, we couldn’t do much visualization when something went wrong. Now that I’ve learned more about Python, I thought I’d give it a crack offline.

So for today’s project, I used the Python turtle package to draw the world, the robot, and the particles. You can actually watch the particles converge as they go through each motion the robot takes. Below is a GIF of the robot and particles in action. The robot first moves through a sequence of moves in blue. We have no idea where the robot is, so we initialize the particles all over the world (in red) with all possible orientations. We do, however, know what move the robot made and how far the robot was from the four landmarks (black circles).

For each step of the particle filter, we move every particle according to how the robot moved, estimate the probability that each particle is correct by comparing its measurements to the robot’s, and then resample the particle distribution. That resampling piece is key: the particles that are most likely to be right get resampled the most, and unlikely ones don’t get sampled at all. Because both the motions and measurements are noisy, the resampled particles diverge from each other. But because any that diverge too much from the actual robot die off, the clump of particles converges around the robot. In the GIF below, the particles alternate between a movement and the subsequent resampling.

Robot Localization by Particle Filter
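To make those steps concrete, here’s a rough sketch in plain Python of what one filter step (move, weight, resample) might look like. The world size, landmark positions, and noise parameters below are placeholder values for illustration, not the ones from the course assignment.

```python
import math
import random

WORLD_SIZE = 100.0  # assumed square world, wrapping at the edges
# Assumed landmark positions; the project used four landmarks.
LANDMARKS = [(20.0, 20.0), (20.0, 80.0), (80.0, 20.0), (80.0, 80.0)]

def gaussian(mu, sigma, x):
    """Probability of x under a Gaussian centered at mu."""
    return math.exp(-((mu - x) ** 2) / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def move(particle, turn, forward, turn_noise=0.05, forward_noise=0.5):
    """Apply the robot's motion (plus noise) to one particle."""
    x, y, heading = particle
    heading = (heading + turn + random.gauss(0.0, turn_noise)) % (2 * math.pi)
    dist = forward + random.gauss(0.0, forward_noise)
    x = (x + math.cos(heading) * dist) % WORLD_SIZE
    y = (y + math.sin(heading) * dist) % WORLD_SIZE
    return (x, y, heading)

def measurement_prob(particle, measurements, sense_noise=3.0):
    """How likely this particle is, given the robot's measured distances to the landmarks."""
    x, y, _ = particle
    prob = 1.0
    for (lx, ly), measured in zip(LANDMARKS, measurements):
        dist = math.hypot(x - lx, y - ly)
        prob *= gaussian(dist, sense_noise, measured)
    return prob

def resample(particles, weights):
    """Resampling wheel: likely particles get drawn many times, unlikely ones die off."""
    n = len(particles)
    new_particles = []
    index = random.randrange(n)
    beta = 0.0
    max_w = max(weights)
    for _ in range(n):
        beta += random.uniform(0, 2 * max_w)
        while beta > weights[index]:
            beta -= weights[index]
            index = (index + 1) % n
        new_particles.append(particles[index])
    return new_particles

def filter_step(particles, turn, forward, measurements):
    """One full step: move every particle, weight it, then resample."""
    moved = [move(p, turn, forward) for p in particles]
    weights = [measurement_prob(p, measurements) for p in moved]
    return resample(moved, weights)
```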

After just one motion, measurement, and resampling, the particles have already closed in quite a bit on the robot. Overall, turtle worked really well for visualizing this. I thought about static plots with matplotlib, but the animation is definitely the most important part here. I might do another project in the future centered around more complex particle filters, because they are a pretty good way to localize on a map.
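For the drawing itself, something along these lines is roughly how a frame-per-step turtle loop can be set up; the scaling, dot sizes, and colors are illustrative choices, not the exact script from the project.

```python
import turtle

SCALE = 5.0          # world units -> screen pixels (assumed)
WORLD_SIZE = 100.0   # assumed square world size, matching the filter sketch above

screen = turtle.Screen()
screen.tracer(0)                       # don't animate each pen stroke; draw whole frames
pen = turtle.Turtle(visible=False)
pen.penup()

def to_screen(x, y):
    """Center the square world on the turtle canvas."""
    return (x - WORLD_SIZE / 2) * SCALE, (y - WORLD_SIZE / 2) * SCALE

def draw_frame(robot, particles, landmarks):
    """Redraw landmarks (black), particles (red), and the robot (blue) for one filter step."""
    pen.clear()
    for lx, ly in landmarks:
        pen.goto(to_screen(lx, ly))
        pen.dot(12, "black")
    for x, y, _ in particles:
        pen.goto(to_screen(x, y))
        pen.dot(4, "red")
    rx, ry, _ = robot
    pen.goto(to_screen(rx, ry))
    pen.dot(10, "blue")
    screen.update()
```

Turning off the tracer and calling screen.update() once per frame is what makes the animation read as discrete filter steps rather than watching each individual dot get drawn.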

 
