Remotely Processed Visual and Odometric SLAM

Participants: 

Omer Mano
Wayne Chang
Eric Wengrowski

Advisor: 
Prof. Kristin Dana



Objective
To design and build a robotic mapping system that collects visual and odometric data and transmits it to a remote server for localization and map reconstruction.
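The sketch below illustrates the client side of one such remote-processing link: the robot packages an odometry reading and a camera frame and ships them to the server. The host address, port, and message format here are illustrative assumptions, not the project's actual protocol.

import json
import socket
import struct

SERVER_ADDR = ("192.168.1.100", 5000)  # assumed server address and port

def send_frame(sock, left_ticks, right_ticks, jpeg_bytes):
    """Send one encoder reading plus a JPEG-encoded camera frame."""
    header = json.dumps({
        "left_ticks": left_ticks,
        "right_ticks": right_ticks,
        "image_len": len(jpeg_bytes),
    }).encode("utf-8")
    # Length-prefixed framing so the server can delimit messages.
    sock.sendall(struct.pack("!I", len(header)) + header + jpeg_bytes)

with socket.create_connection(SERVER_ADDR) as sock:
    send_frame(sock, left_ticks=1042, right_ticks=1038, jpeg_bytes=b"...")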


Background

Robot localization is essential for any autonomous mobile or mapping system. Relying solely on motor control (dead reckoning) to determine the robot's position is unreliable, because small per-step errors accumulate without bound and dominate in the steady state. SLAM (Simultaneous Localization and Mapping) is a systematic method of triangulating the robot's position from the apparent motion of visual landmarks. However, SLAM estimates typically carry significant positional uncertainty in the transient state, before landmarks have been observed enough times to be well constrained.
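A minimal sketch of why dead reckoning drifts, using differential-drive odometry from wheel encoders: each update integrates the previous pose, so every tick-count error compounds into the next estimate. The encoder resolution, wheel radius, and axle length are illustrative assumptions.

import math

TICKS_PER_REV = 360   # assumed encoder resolution
WHEEL_RADIUS = 0.03   # wheel radius in meters (assumed)
AXLE_LENGTH = 0.15    # distance between wheels in meters (assumed)

def dead_reckon(x, y, theta, d_left_ticks, d_right_ticks):
    """Integrate one encoder reading into the pose estimate (x, y, theta)."""
    # Convert tick deltas to wheel travel distances.
    dl = 2 * math.pi * WHEEL_RADIUS * d_left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * d_right_ticks / TICKS_PER_REV
    dc = (dl + dr) / 2.0              # distance traveled by the robot's center
    dtheta = (dr - dl) / AXLE_LENGTH  # change in heading
    # Midpoint integration: any error here feeds into every later update.
    x += dc * math.cos(theta + dtheta / 2.0)
    y += dc * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta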


Introduction

More robust positional awareness can be achieved through a hybrid approach that combines visual SLAM with rotary-encoder odometry. With a more reliable estimate of the robot's position, the relative distances to landmarks can be computed more accurately, and more precise maps of the robot's environment can be generated.
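A minimal sketch of the hybrid idea: propagate the pose with encoder odometry between camera frames, then blend in the visual-SLAM estimate when one arrives. The fixed blend weight is an illustrative stand-in for a proper Kalman gain, which would be derived from the two estimates' covariances.

import math

def fuse_pose(odom_pose, slam_pose, slam_weight=0.3):
    """Blend an odometry-propagated pose with a visual-SLAM pose.

    odom_pose, slam_pose: (x, y, theta) tuples in the same world frame.
    slam_weight: trust placed in the SLAM fix (assumed constant here).
    """
    w = slam_weight
    x = (1 - w) * odom_pose[0] + w * slam_pose[0]
    y = (1 - w) * odom_pose[1] + w * slam_pose[1]
    # Blend headings on the circle to avoid wraparound artifacts.
    dtheta = math.atan2(math.sin(slam_pose[2] - odom_pose[2]),
                        math.cos(slam_pose[2] - odom_pose[2]))
    theta = odom_pose[2] + w * dtheta
    return x, y, theta

This pairing is complementary: odometry is smooth and reliable over short intervals but drifts, while SLAM is drift-free over the long run but noisy in the transient, so each estimate corrects the other's weakness.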

Abstract