Software Documentation

This is the main page for software documentation. Check below for the various sub-topics.

Link to software main page: Software Homepage

General

 * PY2020 Software System Overview

Guides

 * Camera calibration guide
 * Rover operation procedures
 * Autonomous navigation

Computer Vision

 * AR tag detection system

Mapping and Autonomous Navigation
To navigate autonomously, the rover must simultaneously build a map of the environment (e.g., AR tag locations and obstacles) and estimate its own location within that map. To do this, our system uses factor-graph-based simultaneous localization and mapping (SLAM): a family of algorithms that combine all of the available sensor information and use numerical optimization to find the map and the sequence of rover poses that best match the sensor data. You can think of it as a network of springs with various stiffnesses: each measurement is a spring, and at equilibrium the vertices of the network give our best guess for the landmark locations and rover positions. Obstacles are added to the map based on hits from our lidar sensor.
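The "network of springs" picture corresponds to a weighted least-squares problem. Here is a toy 1-D sketch (not the rover's actual code) with three rover poses and one landmark: each odometry or landmark measurement is a spring whose stiffness is the inverse of its variance, and solving the resulting linear system finds the equilibrium.

```python
# Toy 1-D factor-graph SLAM sketch (illustrative only; real SLAM is nonlinear
# and uses a dedicated solver). Unknowns: poses x0, x1, x2 and landmark l.
import numpy as np

# Each factor contributes one row to A @ state = b, weighted by sqrt(stiffness).
# state = [x0, x1, x2, l]
rows, rhs, weights = [], [], []

def add_factor(coeffs, measurement, stiffness):
    rows.append(coeffs)
    rhs.append(measurement)
    weights.append(np.sqrt(stiffness))

add_factor([1, 0, 0, 0], 0.0, 100.0)   # prior: x0 ~ 0 (very stiff spring)
add_factor([-1, 1, 0, 0], 1.0, 10.0)   # odometry: x1 - x0 ~ 1.0
add_factor([0, -1, 1, 0], 1.1, 10.0)   # odometry: x2 - x1 ~ 1.1
add_factor([-1, 0, 0, 1], 3.0, 5.0)    # landmark seen from x0: l - x0 ~ 3.0
add_factor([0, 0, -1, 1], 0.8, 5.0)    # landmark seen from x2: l - x2 ~ 0.8

A = np.array(rows, float) * np.array(weights)[:, None]
b = np.array(rhs, float) * np.array(weights)
state, *_ = np.linalg.lstsq(A, b, rcond=None)  # equilibrium of the springs
x0, x1, x2, l = state
```

The measurements deliberately disagree slightly (the two landmark sightings imply different landmark positions), so the optimizer spreads the error across the factors in proportion to their stiffness, just as springs would.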

Given a map (including the current rover position and a goal position), we compute a plan (a path to the goal that does not hit obstacles) using A* ("A-star") search. Normally, the goal position is determined by GPS, but when we are close enough to see the AR tags, we instead use the camera to estimate the goal position.
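A minimal grid-based A* sketch, for reference (the rover's actual planner is more elaborate; the grid, 4-connected moves, and Manhattan heuristic here are simplifying assumptions):

```python
# Minimal A* path search on an occupancy grid: 0 = free cell, 1 = obstacle.
# Moves are 4-connected; the Manhattan distance heuristic is admissible here.
import heapq

def a_star(grid, start, goal):
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Priority queue of (f = g + h, g, cell, path-so-far).
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        f, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(open_set, (ng + h((r, c)), ng, (r, c), path + [(r, c)]))
    return None  # no obstacle-free path exists

grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],  # wall forces a detour through the right column
    [0, 0, 0, 0],
]
path = a_star(grid, (0, 0), (2, 0))
```

The heuristic is what distinguishes A* from plain Dijkstra: it biases the search toward the goal without sacrificing optimality, as long as it never overestimates the true remaining cost.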

Finally, given a plan, we choose a drive target on the plan a few meters in front of the rover, then compute a desired forward velocity and angular velocity that aims the rover at that position. After that, we convert the velocities to motor torques and send corresponding CAN packets to the motor boards.
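The "aim at the drive target" step can be sketched as a simple proportional controller. The gains and speed cap below are hypothetical placeholders, not the rover's tuned values:

```python
# Sketch of computing (forward velocity, angular velocity) toward a drive target.
# k_lin, k_ang, and max_speed are made-up illustrative gains.
import math

def drive_command(rover_x, rover_y, rover_heading, target_x, target_y,
                  k_lin=0.5, k_ang=1.5, max_speed=1.0):
    """Return (forward_velocity, angular_velocity) that steers at the target."""
    dx, dy = target_x - rover_x, target_y - rover_y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - rover_heading
    # Wrap to [-pi, pi] so the rover always turns the short way around.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    angular = k_ang * heading_error
    # Scale forward speed down when badly misaligned, so the rover
    # turns in place first instead of arcing away from the path.
    forward = min(k_lin * distance, max_speed) * max(0.0, math.cos(heading_error))
    return forward, angular

v, w = drive_command(0.0, 0.0, 0.0, 2.0, 0.0)  # target dead ahead: drive straight
```

A target directly behind the rover produces zero forward velocity and a large angular velocity, i.e. a turn in place.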

Networking/Communication
There are two main kinds of communication that the software team needs to worry about: (1) Mission Control <-> Jetson, and (2) Jetson <-> lower-level electronics. (The Jetson is the main computer on the rover.) The first kind is entirely under the software team's control (we can do whatever we want), while the second requires coordinating with the electronics team on a shared communication protocol. There is also (3) a handful of sensors that the software team manages directly without the electronics team's help (such as the lidar, GPS, and cameras), usually because they are simple USB devices that don't require special wiring.
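Agreeing on a protocol with the electronics team mostly means agreeing on a byte layout for each message. As an illustration only (this CAN ID and field layout are made up, not our actual protocol), a motor command might be packed into an 8-byte CAN data field like this:

```python
# Hypothetical example of packing a motor command into a CAN data field.
# The real message layouts are agreed with the electronics team; the ID and
# fields below are invented for illustration.
import struct

MOTOR_CMD_ARBITRATION_ID = 0x42  # hypothetical CAN arbitration ID

def pack_motor_command(left_torque, right_torque):
    """Pack two torques (N*m) as little-endian 32-bit floats: 8 bytes total,
    the maximum payload of a classic CAN frame."""
    return struct.pack("<ff", left_torque, right_torque)

def unpack_motor_command(data):
    """Inverse of pack_motor_command; what the motor board firmware would do."""
    return struct.unpack("<ff", data)

frame = pack_motor_command(1.5, -0.25)
```

Pinning down endianness, field widths, and units in a shared document like this is exactly the coordination the Jetson <-> electronics link requires.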

 * Communication Protocol: Mission Control to and from Jetson
 * Communication Protocol: Jetson to and from Electronics Boards
 * Communication Protocols: USB-Connected Sensors