Integrated Motion Planning in Blender

OMPL has no built-in visualization capabilities. It is purely a library of planning algorithms, leaving users to choose how and when to present the problems, robots, and solutions. Blender is an open-source application for modeling, animating, and rendering 3D scenes. Both OMPL and Blender have extensive Python APIs. Moreover, there is a project called MORSE, the Modular OpenRobots Simulation Engine, a set of scripts for Blender that enables real-time simulation of controllable robots in Blender's physics engine. It was natural to build an interface between OMPL and MORSE/Blender.
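Those Python APIs are what make the pairing workable. As a taste of OMPL's side, here is a minimal control-based planning setup in its Python bindings, closely following the demos that ship with OMPL. The toy kinematic propagator is exactly the piece the Blender interface replaces: instead of integrating equations of motion by hand, it hands each control to MORSE and reads the resulting state back from the physics engine.

```python
from math import cos, sin
from functools import partial

from ompl import base as ob
from ompl import control as oc

def isStateValid(si, state):
    # A real checker would test the robot against the workspace geometry;
    # here we only require the state to stay within bounds.
    return si.satisfiesBounds(state)

def propagate(start, control, duration, state):
    # Toy kinematic-car dynamics. In the Blender interface, this step is
    # instead a round trip to MORSE: apply the control in the physics
    # engine for `duration`, then read the resulting state back.
    state.setX(start.getX() + control[0] * duration * cos(start.getYaw()))
    state.setY(start.getY() + control[0] * duration * sin(start.getYaw()))
    state.setYaw(start.getYaw() + control[1] * duration)

# State space: a rigid body in the plane (x, y, yaw).
space = ob.SE2StateSpace()
bounds = ob.RealVectorBounds(2)
bounds.setLow(-1)
bounds.setHigh(1)
space.setBounds(bounds)

# Control space: forward speed and turning rate.
cspace = oc.RealVectorControlSpace(space, 2)
cbounds = ob.RealVectorBounds(2)
cbounds.setLow(-0.3)
cbounds.setHigh(0.3)
cspace.setBounds(cbounds)

ss = oc.SimpleSetup(cspace)
ss.setStateValidityChecker(ob.StateValidityCheckerFn(
    partial(isStateValid, ss.getSpaceInformation())))
ss.setStatePropagator(oc.StatePropagatorFn(propagate))

# Any control-sampling planner (RRT, KPIECE, EST, ...) can be swapped in here.
ss.setPlanner(oc.RRT(ss.getSpaceInformation()))

# Calling a State object yields the underlying SE2 state.
start = ob.State(space)
start().setX(-0.5); start().setY(0.0); start().setYaw(0.0)
goal = ob.State(space)
goal().setX(0.5); goal().setY(0.0); goal().setYaw(0.0)
ss.setStartAndGoalStates(start, goal, 0.05)

if ss.solve(20.0):
    # The solution is a sequence of (state, control, duration) steps.
    print(ss.getSolutionPath().printAsMatrix())
```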

The interface I created allows users to design a 3D workspace model in Blender, drop in a robot from the MORSE library, and attach any of OMPL's control-sampling planners to search for a path through the dynamic, simulated environment. The interface takes the form of a Blender add-on that saves and restores states of the system to and from the Blender engine, while talking to MORSE to apply each sampled control for a small time step, complete with physics simulation. Once a solution is found, it is imported into the Blender model as animation data, ready to be rendered. At that point, it is easy to swap out the low-poly model used for fast collision checking for a high-poly model meant for pretty pictures.

Here is a nifty video of two robots cooperating to solve a puzzle by pushing around a ball, and another of a robot that has to get a running start on a ramp to jump a gap. See my original post on the OMPL blog, as well as my commits to OMPL's public repository.
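To make the import step concrete: it amounts to keyframe insertion through Blender's bpy API. Here is a rough sketch; the solution format, object name, and frame spacing are hypothetical stand-ins, not the add-on's actual data layout.

```python
import bpy

# Hypothetical solution data: one (location, rotation_euler) sample of the
# robot's pose per propagation step, as recorded during planning.
solution = [
    ((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)),
    ((0.1, 0.0, 0.0), (0.0, 0.0, 0.05)),
    # ...
]

robot = bpy.data.objects["Robot"]  # assumed object name
frames_per_step = 2                # assumed frame spacing per time step

for i, (loc, rot) in enumerate(solution):
    frame = 1 + i * frames_per_step
    robot.location = loc
    robot.rotation_euler = rot
    # Record the pose as ordinary keyframe animation data.
    robot.keyframe_insert(data_path="location", frame=frame)
    robot.keyframe_insert(data_path="rotation_euler", frame=frame)
```

Because the result is plain keyframe animation, it survives swapping the collision geometry for a render-quality model.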