I have already written a couple of posts about simulated drones, as well as a research proposal citing plenty of literature on developing unmanned aerial systems. In this article I reflect on what I think could be one way to design drones in simulation. Whether the simulated drones end up being a good enough emulation of real drones – or of what we consider real drones capable of – depends partly on how we design the drones and partly on how representative the simulation environment is of real environments.
To fully understand and appreciate the ideas presented in this article, I strongly recommend reading my two previous articles on this topic: Simulated Drone Flying Competition and Explaining the Simulated Drone Flying Competition. They establish the scope of these ideas. You should also read tutorials on genetic algorithms and, especially, on genetic programming. With that background in place, you should find this tutorial quite easy to follow.
So let’s get started. The problem I am discussing here is how to evolve controllers for simulated drones. Let us begin with a more basic question: why do we need controllers in the first place?
The answer to this question is simple. If you picture how a plane is flown, it is natural to observe that the pilot has to be able to control the plane somehow. The person in the cockpit has a number of instruments at their hands and under their feet, through which they steer the plane toward its destination, moment by moment. Without those controls, or without the skill and experience to use them, the pilot simply could not fly the plane; it would crash, with tragic results.
That, hopefully, answers why controllers are required to fly a plane.
Why do we need to evolve controllers?
Consider replacing the human pilot in an aircraft with some sort of artificial intelligence that flies the plane as well as a human being would. This is a great idea, and a central theme behind designing drones. Accomplishing it requires a background in machine learning or artificial intelligence, and evolving the controllers – rather than hand-coding them – is one such approach. That answers why we might want to evolve controllers.
Now let us answer one more important question: why evolve controllers for simulated drones rather than real ones? There are quite a few reasons, but the main one is cost. Testing, designing and evolving controllers on real drones is expensive. Most machine learning algorithms rely on trial and error: we literally have to let the algorithm err while it searches for optimal solutions. It is natural that new solutions are designed, or evolved, at the expense of bad ones along the way – and bad controllers cause a lot of crashes, making the use of real drones for controller design a very expensive expedition to undertake.
As a result, controllers for drones have to be designed in simulation. Whether the simulated controllers are good enough for deployment in real drones depends partly on the quality of the controllers themselves and partly on the ability of the simulation environment to mimic real environments. If you want controllers that perform complex tasks beyond ordinary flying, such as extinguishing fires or coordinating with other drones, you have to develop simulation environments that allow your drones to do exactly that.
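To make the idea of a simulation environment concrete, here is a minimal sketch of the kind of toy physics model a controller could interact with. Everything here is a hypothetical illustration – a real project would use a proper simulator with far richer dynamics – and the class name, state variables, and constants are my own assumptions, not part of any existing library.

```python
import math

class Drone2D:
    """Toy 2-D drone: state is (x, y, vx, vy); a controller sets thrust and tilt.

    Hypothetical, heavily simplified physics for illustration only --
    real simulation environments model far richer dynamics (drag, wind,
    rotor response, sensor noise, and so on).
    """
    GRAVITY = -9.81   # m/s^2, acting downward
    DT = 0.05         # seconds per simulation step

    def __init__(self, x=0.0, y=10.0):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0

    def step(self, thrust, tilt):
        """Advance one time step; thrust in N/kg, tilt in radians from vertical."""
        ax = thrust * math.sin(tilt)
        ay = thrust * math.cos(tilt) + self.GRAVITY
        self.vx += ax * self.DT
        self.vy += ay * self.DT
        self.x += self.vx * self.DT
        self.y += self.vy * self.DT
        return self.y > 0.0   # False once the drone has hit the ground
```

A controller is then just something that reads the state and chooses `thrust` and `tilt` each step; a controller that exactly cancels gravity (thrust 9.81, tilt 0) hovers in place, while one that outputs nothing falls and "crashes". It is this crash-and-retry loop that is cheap in simulation and ruinously expensive with real hardware.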
How, then, to evolve controllers for simulated drones? This is our final question, and to answer it I would again draw your attention to the tutorials on genetic algorithms and genetic programming. Both are population-based algorithms; the latter is considerably more powerful, as it allows whole computer programs to be evolved. Both start by generating a large population of individuals, then evolve new populations using the genetic operators of crossover and mutation. Each individual is tested for its fitness at solving the underlying problem. In our problem, a fitness score could be based on how well the evolved set of controllers lets the drone fly, coordinate, and carry out its prescribed tasks. Once every individual in the population has been assigned a fitness, a certain number of good individuals are kept and the bad ones are discarded. The good ones form the new parent population, and a new evolutionary cycle begins.
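The loop described above – generate, evaluate, select, recombine, mutate, repeat – can be sketched in a few dozen lines. This is a minimal genetic-algorithm illustration under strong simplifying assumptions: the "controller" is just a pair of numbers (a base thrust and a proportional gain) evolved to hover a 1-D drone at a target altitude, and every name and constant here is my own choice, not a reference implementation. Genetic programming would instead evolve whole program trees, but the skeleton of the evolutionary cycle is the same.

```python
import random

# Assumed toy problem: hover a 1-D drone at TARGET_ALT.
# A genome is (base_thrust, gain); thrust = base + gain * (target - altitude).
TARGET_ALT, GRAVITY, DT = 10.0, -9.81, 0.05

def fitness(genome, steps=200):
    """Negative accumulated altitude error: higher is better."""
    base, gain = genome
    y, vy, err = 5.0, 0.0, 0.0
    for _ in range(steps):
        thrust = max(0.0, base + gain * (TARGET_ALT - y))
        vy += (thrust + GRAVITY) * DT
        y += vy * DT
        err += abs(TARGET_ALT - y)
        if y <= 0.0:          # crash: heavy penalty, end of this "flight"
            err += 1000.0
            break
    return -err

def evolve(pop_size=60, generations=40, seed=1):
    rng = random.Random(seed)
    # Initial population of random genomes.
    pop = [(rng.uniform(0.0, 20.0), rng.uniform(0.0, 5.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]          # keep the good quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = (a[0], b[1])              # crossover of two parents
            if rng.random() < 0.3:            # occasional mutation
                child = (child[0] + rng.gauss(0, 1.0),
                         child[1] + rng.gauss(0, 0.5))
            children.append(child)
        pop = elite + children                # new parent population
    return max(pop, key=fitness)

best = evolve()   # the fittest hover controller found
```

The crash penalty in `fitness` is exactly the point made earlier: early random genomes crash constantly, which is harmless here because the "drone" is four lines of arithmetic rather than real hardware.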
At this stage it should be fairly intuitive that in the beginning the algorithm generates a lot of bad, naive controllers, which would cause many crashes if real drones were employed; hence the need for good simulation environments. Only after a certain number of generations does the search process begin to find better individuals. Eventually, we can hope, it finds a set of controllers with all the dexterity of an adept human pilot.
The controller can be benchmarked at this stage and deployed in real drones.
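One hedged sketch of what such benchmarking might look like: rather than trusting the single training condition, evaluate the evolved controller over many randomized scenarios and report aggregate statistics. The function below is self-contained and illustrative – the crude 1-D physics, the scenario ranges, and the scoring are all assumptions of mine, not an established benchmarking protocol.

```python
import random
import statistics

def benchmark(controller, n_trials=100, seed=0):
    """Score a controller over randomized start altitudes (lower mean is better).

    `controller` is any function mapping altitude error to thrust.
    The physics is a deliberately crude 1-D sketch, not a real flight
    simulator; a crash adds a large fixed penalty to that trial's score.
    """
    rng = random.Random(seed)
    scores = []
    for _ in range(n_trials):
        y, vy, target = rng.uniform(2.0, 18.0), 0.0, 10.0
        err = 0.0
        for _ in range(200):
            thrust = max(0.0, controller(target - y))
            vy += (thrust - 9.81) * 0.05
            y += vy * 0.05
            err += abs(target - y)
            if y <= 0.0:
                err += 1000.0   # crashed in this trial
                break
        scores.append(err)
    return statistics.mean(scores), statistics.stdev(scores)

# Example: a simple proportional controller versus doing nothing at all.
good_mean, good_sd = benchmark(lambda e: 9.81 + 0.5 * e)
bad_mean, bad_sd = benchmark(lambda e: 0.0)
```

Only if the evolved controller scores well across the whole distribution of scenarios – not just the one it was trained on – is there a case for moving it onto real hardware.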
Further reading: Incremental evolution of autonomous controllers for unmanned aerial vehicles using multi-objective genetic programming (IEEE Xplore).