Frame-to-Frame ego-motion estimation for agile drones with Convolutional Neural Network

dc.contributor.advisorThill, Serge
dc.contributor.advisorCroon, Guido de (Delft University of Technology)
dc.contributor.authorBogoda Arachchige, Sameera Sandaruwan
dc.date.issued2020-04-01
dc.description.abstractOpenly available feature-based Visual Odometry algorithms have a high computational cost. This hinders their use on agile robotic platforms such as racing drones. A wide range of researchers have looked into using data-driven learning methods such as Deep Learning, instead of traditional feature-based methods, to develop a new class of visual odometry systems. These systems have mainly relied on datasets designed for low degree-of-freedom systems such as cars. Hence, even this new class of algorithms suffers from high computational cost and an inability to handle agile robotic systems. Therefore, this research looks into developing a learning-based visual odometry system, specifically designed for racing drones, that can handle high-speed agile motion at low computational cost. Keywords: Drone, Neural Networks, Visual Odometry, Ego-motionen_US
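The abstract does not detail the network itself; as a rough illustration only, the sketch below shows one common way a frame-to-frame ego-motion CNN can be set up: two consecutive grayscale frames are stacked along the channel dimension and a small convolutional regressor predicts the 6-DoF relative pose. The layer sizes, class name, and use of PyTorch are assumptions for illustration, not the thesis's actual model.

```python
# Hypothetical sketch (not the thesis's architecture): a small CNN that
# takes two stacked consecutive frames and regresses 6-DoF ego-motion.
import torch
import torch.nn as nn

class EgoMotionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Two grayscale frames stacked along the channel axis -> 2 input channels.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # 6 outputs: translation (x, y, z) and rotation (roll, pitch, yaw).
        self.head = nn.Linear(64, 6)

    def forward(self, frame_t, frame_t1):
        x = torch.cat([frame_t, frame_t1], dim=1)  # (B, 2, H, W)
        features = self.encoder(x).flatten(1)      # (B, 64)
        return self.head(features)                 # (B, 6) relative pose

# Usage example: a batch of two 128x128 frame pairs.
model = EgoMotionCNN()
pose = model(torch.rand(2, 1, 128, 128), torch.rand(2, 1, 128, 128))
print(pose.shape)  # torch.Size([2, 6])
```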
dc.identifier.urihttps://theses.ubn.ru.nl/handle/123456789/10121
dc.language.isoenen_US
dc.thesis.facultyFaculteit der Sociale Wetenschappenen_US
dc.thesis.specialisationMaster Artificial Intelligenceen_US
dc.thesis.studyprogrammeArtificial Intelligenceen_US
dc.thesis.typeMasteren_US
dc.titleFrame-to-Frame ego-motion estimation for agile drones with Convolutional Neural Networken_US
Files
Original bundle
Name: S1014012_Bogoda ArachchigaS.pdf
Size: 9.55 MB
Format: Adobe Portable Document Format