Innovative drone research opens opportunities

By Everett Dorma Posted: October 28, 2015 10:00 a.m.

(L-R) Seang Cau, Shehryar Khurshid, Muhammad Faheem and Dr. Raman Paranjape. Photo: U of R Photography

Drones are getting smarter.

Dr. Raman Paranjape, Professor of Electronic Systems Engineering, and graduate students Muhammad Faheem, Seang Cau and Shehryar Khurshid are enabling drones, or unmanned aerial vehicles (UAVs), to move autonomously and to recognize and track specific objects on their own, using algorithms such as Simultaneous Localization and Mapping (SLAM) and extended Kalman filters.
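
The article doesn't include the team's code, but the heart of an extended Kalman filter as it is typically used in SLAM is a repeating predict/update cycle. The sketch below is a generic illustration of that cycle for a drone pose plus a single landmark; the simple motion and range-bearing measurement models, and all symbols, are assumptions rather than the team's implementation.

```python
# Illustrative EKF predict/update cycle (not the U of R team's code).
# State x = [drone_x, drone_y, heading, landmark_x, landmark_y]: keeping the
# landmark in the same filter as the pose is what makes this a SLAM filter.
import numpy as np

def predict(x, P, u, Q, dt):
    """Propagate the pose with a simple unicycle motion model (assumed)."""
    v, w = u                                   # forward speed, turn rate
    th = x[2]
    x = x + np.array([v*np.cos(th)*dt, v*np.sin(th)*dt, w*dt, 0.0, 0.0])
    F = np.eye(5)                              # Jacobian of the motion model
    F[0, 2] = -v*np.sin(th)*dt
    F[1, 2] =  v*np.cos(th)*dt
    return x, F @ P @ F.T + Q

def update(x, P, z, R):
    """Correct the estimate with a range-bearing observation of the landmark."""
    dx, dy = x[3] - x[0], x[4] - x[1]
    r = np.hypot(dx, dy)
    h = np.array([r, np.arctan2(dy, dx) - x[2]])    # predicted measurement
    H = np.array([[-dx/r,    -dy/r,    0,  dx/r,    dy/r],
                  [ dy/r**2, -dx/r**2, -1, -dy/r**2, dx/r**2]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    y = z - h
    y[1] = (y[1] + np.pi) % (2*np.pi) - np.pi       # wrap the bearing error
    x = x + K @ y
    return x, (np.eye(5) - K @ H) @ P
```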
     
Depth perception is the visual ability to perceive the world in three dimensions and to judge the distance of an object. People can perceive depth because we have two eyes, each receiving slightly different visual information, which the brain combines into a three-dimensional picture of the world. A UAV, which typically has only one camera as its eye, cannot perceive depth, and without an operator guiding its movements it would soon crash.
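
As a rough illustration of the underlying geometry, the textbook two-view relation below shows how a known separation between two viewpoints (two eyes, or two positions of a single camera) turns an apparent image shift into a depth estimate; the numbers are made up for the example.

```python
# Textbook two-view depth relation (illustrative numbers, not project data):
# a point seen from two viewpoints a baseline B apart shifts by a disparity d
# in the image, and its depth follows from Z = f * B / d.
focal_px = 700.0      # camera focal length in pixels (assumed)
baseline_m = 0.10     # separation between the two viewpoints, in metres (assumed)
disparity_px = 14.0   # apparent shift of the point between the two images

depth_m = focal_px * baseline_m / disparity_px
print(f"estimated depth: {depth_m:.2f} m")   # 5.00 m
```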

UAV following Shehryar. Photo: External Relations.

“Simultaneous Localization and Mapping allows us to approximate depth perception,” says Dr. Paranjape. “As the drone moves, the onboard camera records the location of an object and measures how it appears to move in relation to the drone’s movements. By calculating the distance to various objects as it moves, the drone is able to build a three-dimensional map of the world around it and to recognize where it is located within that map.”
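
A minimal sketch of that idea, assuming the drone knows its own two positions and the direction to the same point from each, is to intersect the two sight lines; all values below are illustrative and are not taken from the project.

```python
# Minimal motion-parallax sketch (assumed values, not the team's pipeline):
# the same point is sighted from two drone positions, and intersecting the
# two bearing rays gives its location, and hence its distance, by triangulation.
import numpy as np

def triangulate(p1, b1, p2, b2):
    """Intersect rays from positions p1/p2 along bearings b1/b2 (radians)."""
    d1 = np.array([np.cos(b1), np.sin(b1)])
    d2 = np.array([np.cos(b2), np.sin(b2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t1, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t1 * d1

# The drone moved 1 m to the right and the landmark's bearing changed accordingly.
point = triangulate([0.0, 0.0], np.deg2rad(45.0),
                    [1.0, 0.0], np.deg2rad(90.0))
print(point)   # -> [1. 1.]: the landmark sits 1 m ahead of the second position
```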

Normally, to keep a UAV from crashing, a human operator must be able to see it in three-dimensional space. Given a map of the space and its initial location within that map, however, the drone can recognize its position as it moves through the space and avoid objects without a human controlling it.
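
In the simplest terms, that amounts to keeping a map of blocked space alongside the drone's own position and refusing any move into a blocked cell. The toy sketch below illustrates the idea on a hypothetical occupancy grid; the team's actual map representation is not described in the article.

```python
# Toy occupancy-grid sketch (structure and values assumed for illustration):
# the drone keeps a map of which cells are blocked plus its own cell, and a
# commanded move is only executed if the target cell is free.
occupied = {(2, 3), (2, 4), (5, 1)}      # cells the map marks as obstacles
position = (0, 0)                        # the drone's known starting cell

def try_move(position, step):
    """Return the new cell if it is free, otherwise stay put."""
    target = (position[0] + step[0], position[1] + step[1])
    return target if target not in occupied else position

position = try_move(position, (1, 0))    # moves to (1, 0)
position = try_move(position, (1, 3))    # (2, 3) is blocked, so it stays put
print(position)                          # -> (1, 0)
```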

Enabling UAVs to move independently in areas that may be too dangerous or impossible for a human operator to reach opens up new opportunities for the military, for first responders such as police and search and rescue teams, and for a variety of industrial applications.

In a bomb threat, for example, a UAV could be sent into a building to locate the bomb without the police having to send personnel into a potentially dangerous situation.

Similarly, an engineering firm could use drones to fly around a bridge or other installation to assess its structural integrity. Farmers could use a drone to fly over their fields to determine how wet the soil is and whether it is safe to operate machinery for seeding, spraying or harvesting.

As demonstrated in the accompanying video, Dr. Paranjape’s team has also been using visual mapping to recognize and track objects, including human faces.

Drone's map of Shehryar's face. Photo: External Relations.

“By mapping specific features of a face, for instance, the drone can search for, recognize and even follow an individual,” says Shehryar Khurshid. “This feature may be particularly useful in locating lost individuals or finding someone in a crowd.”
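
The article doesn't say which detector or tracker the team uses; as a stand-in for illustration only, the sketch below uses OpenCV's stock Haar-cascade face detector (bundled with the opencv-python package) and a simple proportional steering command toward the largest detected face.

```python
# Illustrative face-following loop (the Haar cascade and gain are assumptions;
# the article does not say which detector or controller the team uses).
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def yaw_command(frame, gain=0.002):
    """Steer toward the largest detected face: positive means turn right."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0.0                                # no face seen: hold heading
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    error_px = (x + w / 2) - frame.shape[1] / 2   # face offset from image centre
    return gain * error_px                        # proportional steering command

# Example: read one frame from the default camera and print the command.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(yaw_command(frame))
cap.release()
```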

The researchers have even been able to use pictures of an individual to train the drone to recognize and track the person.
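
How that training is done isn't specified here; one common, purely illustrative approach is OpenCV's LBPH face recognizer (from the opencv-contrib package), trained on a handful of labelled photos. The file names below are hypothetical.

```python
# Hypothetical training sketch: OpenCV's LBPH recognizer (opencv-contrib) stands
# in for whatever model the team actually trains; the image files are made up.
import cv2
import numpy as np

training_photos = ["shehryar_1.jpg", "shehryar_2.jpg", "shehryar_3.jpg"]
faces = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in training_photos]
labels = np.zeros(len(faces), dtype=np.int32)     # one person, so label 0

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.train(faces, labels)

# Later, on a face cropped out of the drone's video feed:
probe = cv2.imread("frame_face_crop.jpg", cv2.IMREAD_GRAYSCALE)
label, distance = recognizer.predict(probe)
print(label, distance)    # a low distance means it is likely the trained person
```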

“When we first started working with UAVs a couple of years ago, we were working with the RCMP and focused on accident reconstruction,” says Paranjape. “Although that project has ended and our focus is now on the automation of these machines, we are optimistic that we will be able to continue our relationship with the RCMP, as well as with other public safety organizations and industrial players, as there are so many potential applications for this technology.”