Interview with Alex Ellery
Biomimetic devices look to nature for inspiration, mimicking the way insects, plants and animals cope with difficulties. From the way plants furl their leaves to how wasps bore holes into trees, evolution has developed clever and varied ways to solve engineering problems.
Alex Ellery is the head of the Robotics Research group at the Surrey Space Centre in the United Kingdom. In this interview with Astrobiology Magazine, he explains how robotics can borrow from the strategies used by life, and discusses how these techniques may be used in future European space exploration missions.
Astrobiology Magazine (AM): Let’s talk about designing with biomimetics, and how you’re applying such designs to the upcoming ExoMars mission.
Alex Ellery (AE): I prefer the term bio-inspiration rather than biomimetics, because we’re trying to get ideas from nature rather than just copying it. The primary rationale is that biological organisms are faced with the same set of challenges and problems that engineers are. For rovers and robots, we’re trying to design autonomous intelligent agents that can survive in hostile environments. So we look at how nature has coped with that problem through evolution by natural selection, and then we can reverse engineer the natural solutions and incorporate them into our machines.
There are many different areas where you can apply this type of technology. For example, a spacecraft must be compact when you launch it, but in order to get power after launch you’ll need to unfurl large solar arrays. To efficiently package things like antennae and solar arrays, we can look at how flowers package their leaves.
Another example is drill design. Most man-made drills are rotary — they typically require two motors and you need to put a lot of force or thrust on them. But small animals like insects don’t have very much weight, so they can’t exert a lot of thrust onto their drills. So, for instance, the wood wasp uses a long tube called an ovipositor to drill into the bark of a tree so that it can lay its eggs. We’ve taken the wood wasp’s ovipositor and converted that mechanism into a novel type of drill. It’s basically a percussive drill rather than a rotary drill. We found that in the initial tests, it’s much more efficient than normal drills.
|Bees and flowers provide engineering lessons. Credit: Arizona State University|
My main interest is in robotic devices, specifically robot rovers with autonomous navigation. By studying how animals navigate, we can try to apply their methods to a planetary rover. But the majority of our work on ExoMars has to do with wheel mechanisms — to enable the rover to go over rocks and things like that, and also to control traction to minimize slippage and to conserve as much power as possible. So biomimetics is separate from our involvement with ExoMars, but it has applicability to this and other missions.
AM: What kind of mission do you think biomimetics might be applied to?
AE: That’s a difficult question. ExoMars has been undergoing an evolution. It originally started out as a large rover, but it’s been cut in half and now we have a lander as well. The scientific instrument selection hasn’t been finalized yet so we don’t know what’s going to be on board. Almost certainly a drill will be required. It’s doubtful that a bio-inspired drill will be used, because we’re too early in the development process for it to be ready for something like ExoMars.
However, we do anticipate an optic flow navigation technique will be employed on the rover. Optic flow enables obstacles to be avoided automatically. Some experiments have been done where you tie a bee to a post (how they do this, I’m not quite sure) and then you put the bee into a long box, and you move the walls on either side. If you move the walls of the box at the same speed, the bee sees that the velocity on each side matches, so it will fly the middle path. In nature a bee will always fly through the center of a window — never to one side — because it equalizes its velocity on both sides as it passes obstacles. In the experiment with the bee inside the box, if you move, say, the right wall at a faster rate than the left wall, the bee will then think it’s going to collide with the right side. So the bee will automatically veer to the left and crash into the left wall.
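The bee’s centering behavior described above can be sketched as a simple control rule: steer away from whichever side the image appears to move faster, since faster apparent motion means a closer wall. This is a hypothetical illustration, not the actual ExoMars algorithm; the function name, gain, and sign convention are assumptions.

```python
def centering_steer(flow_left: float, flow_right: float, gain: float = 0.5) -> float:
    """Steering command from the left/right optic-flow imbalance.

    flow_left and flow_right are apparent wall speeds (e.g. pixels per frame).
    A positive result means "turn right" (away from the faster left side),
    a negative result means "turn left".
    """
    return gain * (flow_left - flow_right)

# Walls moving at equal apparent speed: no correction, fly the middle path.
assert centering_steer(2.0, 2.0) == 0.0
# Right wall appears faster (so seems closer): command is negative, veer left.
assert centering_steer(1.0, 3.0) < 0.0
```

With equal flow on both sides the command is zero, which is exactly the “fly through the center of the window” behavior; speeding up one wall biases the flow and drives the turn, as in the tethered-bee experiment.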
|The flight controls exhibited by dragonflies are admired by aeronautic engineers. Credit: Samford University|
AM: That’s a dirty trick, isn’t it? It’s a good thing that People for the Ethical Treatment of Insects doesn’t exist yet…
AE: (laughs) You could see it as somewhat cruel, but this is how we learn about animal navigation.
AM: How will this be applied to the ExoMars rover?
AE: It’s not certain yet. The ExoMars rover will use two cameras to take an image of a scene, to a distance of about ten meters. It will merge those two images to generate depth, just like how two eyes give you stereoscopic vision, and then it can pinpoint each rock that it needs to avoid. On the basis of this three-dimensional map, it will plot a trajectory to get to a target. Right now, the design calls for it to then shut its eyes and drive blind. But instead of doing that, we want to use the cameras while we’re snaking around obstacles, using optic flow to make sure we don’t bump into anything.
AM: Optic flow means continuous scanning then?
AE: It means measuring the velocity of anything moving past you. We can also measure the distance to objects — the obstacle will get larger the closer you get to it, and we can measure the rate at which an obstacle is growing in size. We can use that to avoid obstacles both in front of us and to our sides.
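The expansion cue described here is often summarized as a time-to-contact estimate: how long until impact, given how big the obstacle looks and how fast it is growing. The sketch below is a minimal illustration of that idea under assumed units (pixels and frames); it is not taken from the ExoMars software.

```python
def time_to_contact(image_size: float, growth_rate: float) -> float:
    """Estimate frames until contact from image expansion.

    image_size: current apparent size of the obstacle (pixels).
    growth_rate: how fast that size is increasing (pixels per frame).
    An obstacle that is not expanding poses no imminent collision.
    """
    if growth_rate <= 0.0:
        return float("inf")
    return image_size / growth_rate

# An obstacle 40 px wide, growing 8 px per frame: about 5 frames to impact.
assert time_to_contact(40.0, 8.0) == 5.0
# Shrinking or static obstacles never trigger a collision warning.
assert time_to_contact(40.0, 0.0) == float("inf")
```

The appeal of this cue is that it needs no absolute range measurement: the ratio of size to growth rate falls out of the camera images alone, which is why it suits a rover that wants to keep its cameras open while driving.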
AM: What about designs for the future, anything else on the horizon?
|Computer rendering of a robotic rover on the edge of a depression, much like Opportunity’s perch on the edge of Endurance Crater on Mars. Navigating in terrain with deep pits, scattered boulders, and other hazards is one of the difficulties of robotic exploration.|
AE: One of the things we’re just starting to work on is a genetically-evolved neural network. We can use such networks as controllers for robots. So far they’ve only been used on very simple robots in very simple environments like T mazes. You stick a robot into a T maze, and it will avoid the walls and turn left or right.
Mars is a lot more complicated than that, but this particular technique is based on trial and error, so it’s self-learning. We train the robots with something called a “fitness function,” but we’re not too clear on how to build the optimal training regime. We want to mix and match different types of environments to get the robot to learn to be robust, so no matter what situation it finds itself in, it can still navigate.
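The trial-and-error loop behind a genetically evolved controller can be sketched in a few lines: score each candidate controller with a fitness function, keep the fittest, and mutate them to form the next generation. Everything here is an assumed toy setup (the genome is a flat list of weights, the fitness function is a stand-in for “navigated well”); it only illustrates the shape of the technique.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=50, seed=0):
    """Evolve a weight vector by mutation and selection on a fitness function."""
    rng = random.Random(seed)
    # Start with random candidate controllers (weight vectors).
    pop = [[rng.uniform(-1, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # rank by fitness score
        parents = pop[: pop_size // 4]           # keep the fittest quarter
        # Next generation: mutated copies of randomly chosen parents.
        pop = [[w + rng.gauss(0, 0.1) for w in rng.choice(parents)]
               for _ in range(pop_size)]
    return max(pop, key=fitness)

# Toy fitness function: reward weights near zero (a stand-in for a real
# score such as "distance traveled without hitting an obstacle").
best = evolve(lambda genome: -sum(w * w for w in genome))
assert sum(w * w for w in best) < 1.0  # selection pulled the weights toward zero
```

The “mix and match environments” idea mentioned above corresponds to making the fitness function average a controller’s score over several different test terrains, so that evolution rewards robustness rather than mastery of one maze.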
But spacecraft engineers are notoriously conservative, and they don’t like new things. So it’s a constant battle to try and convince the agency that what we’re doing will work and that it’s better than the technique they currently have. That’s always an uphill struggle.