For those of us who love the world of aviation, watching an aircraft take off from an aircraft carrier is quite a spectacle. During takeoff it is common to see deck operators and air traffic controllers signaling to pilots to line up on the runway or prepare for takeoff. This non-verbal communication is key to controlling traffic on deck and, of course, requires that both controller and pilot be familiar with this language. Given the increasing use of unmanned aircraft, a team from MIT is working on a system to guide drones using the gestures of the deck crew on the tarmac.
The idea of the MIT research team is for drones to be able to process the gestures of the crew on the carrier deck, and also to distinguish a crew member's gestures from the whole maelstrom of activity on the carrier. Indeed, the "background noise" caused by flight-deck activity is the main problem that the MIT team, led by Yale Song, a doctoral student in the Department of Electrical Engineering and Computer Science, his thesis advisor Professor Randall Davis, and David Demirdjian, a researcher at MIT's Artificial Intelligence Laboratory, set out to solve.
The algorithm processes images captured by a camera system similar to Kinect (they did not use Microsoft's system because the project predates its release) that can recognize three-dimensional shapes and track the movement of the arms. From there, the system captures a three-dimensional image of the crew member and removes the background, then estimates body posture and compares it against a set of predefined aircraft-marshalling patterns. The key to the system is locating the hand gestures and matching them against this "library" of body patterns, the two pieces of information that let the system "understand" the order being given.
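The pipeline described above (isolate the person from the background, extract a posture estimate, and match it against a library of predefined patterns) can be sketched in a few lines. This is a minimal illustration, not the MIT implementation: the gesture "library", the feature vectors, and the depth threshold are all invented for the example, and a nearest-neighbour match stands in for the team's actual classifier.

```python
import numpy as np

# Hypothetical gesture library: each marshalling order maps to a predefined
# pose-feature pattern (here, a made-up 4-value joint-angle vector).
# The real system works on sequences of 3-D body imagery; this is a sketch.
GESTURE_LIBRARY = {
    "line_up":  np.array([0.0, 1.2, 0.4, 1.2]),
    "hold":     np.array([1.5, 0.0, 1.5, 0.0]),
    "take_off": np.array([2.0, 2.0, 0.1, 0.1]),
}

def remove_background(depth_frame, max_depth=2.5):
    """Crude background removal on a depth image: zero out anything farther
    than max_depth metres, keeping only the person near the camera."""
    frame = depth_frame.copy()
    frame[frame > max_depth] = 0.0
    return frame

def classify_gesture(pose_features):
    """Match an observed pose-feature vector against the library by
    nearest neighbour (Euclidean distance)."""
    best_label, best_dist = None, float("inf")
    for label, pattern in GESTURE_LIBRARY.items():
        dist = float(np.linalg.norm(pose_features - pattern))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# A noisy observation that should resolve to the "line_up" pattern.
observed = np.array([0.1, 1.1, 0.5, 1.3])
label, dist = classify_gesture(observed)
print(label)  # → line_up
```

In practice the hard part, as the article notes, is the feature extraction step: producing a clean posture estimate from a cluttered deck scene is far harder than the final pattern match.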
In tests, the system was loaded with the 24 most commonly used deck-signaling patterns, and approximately 76% of orders were processed correctly. That figure gives an idea of the complexity of the problem, but it suggests the approach is well on the road to becoming perfectly feasible.