This is the public website for CS588 Spring 2024. Course information will be kept up to date on Canvas (https://canvas.illinois.edu) for enrolled students.
This course will introduce students to the computational principles behind autonomous vehicles, with hands-on lab work on an actual vehicle. Sensing topics will include vision, lidar, and sonar, including state-of-the-art methods for detection, classification, and segmentation. Bayesian filtering methods will be covered in the context of both SLAM and visual tracking. Planning and control topics will cover vehicle dynamics models, state-lattice planning, sampling-based kinodynamic planning, optimal control and trajectory optimization, and some reinforcement learning. Evaluation will involve ambitious challenge projects implemented on a physical vehicle.
Time and Location
09:30AM – 10:45AM Mondays and Wednesdays
1310 Digital Computer Laboratory
Labs are held at the Highbay Lab, 60 Hazelwood Dr, Bay C, Champaign, IL 61820.
Course Staff
Instructor: Kris Hauser
kkhauser@illinois.edu
https://kkhauser.web.illinois.edu/
Office hours: 4-5PM, Tuesdays
Highbay Lab
TA: Yana Zhao
TA: Jiangwei Yu
Center for Autonomy staff software engineer: Hang Cui
Center for Autonomy lab manager: John Hart
Prerequisites
CS374, ECE484, or equivalent.
Readings
Readings will be excerpted from the following online texts:
- Hauser, K. Robotic Systems. Book draft, 2023.
- Szeliski, R. Computer Vision: Algorithms and Applications. Springer, 2022.
- O’Kane, J.M. A Gentle Introduction to ROS. 2014.
- Lynch, K.M. and Park, F.C. Modern Robotics: Mechanics, Planning, and Control. Cambridge University Press, 2017.
- Murphy, K.P. Probabilistic Machine Learning: Advanced Topics. MIT Press, 2023.
- Choset, H., Lynch, K.M., Hutchinson, S., Kantor, G., Burgard, W., Kavraki, L.E., and Thrun, S. Principles of Robot Motion: Theory, Algorithms, and Implementations. MIT Press, 2005.
- Murray, R.M., Li, Z., and Sastry, S.S. A Mathematical Introduction to Robotic Manipulation. CRC Press, 1994.
- LaValle, S.M. Planning Algorithms. Cambridge University Press, 2006.
Course organization
This course will expose students to the process of engineering a large hardware and software system, a radically different environment from that of typical university courses. Students will imagine themselves as engineers at an autonomous vehicle startup trying to build its first self-driving product.
The first half of the course is a “bootcamp” consisting of lectures and small-group homework assignments on the GEM vehicle. In the second half, groups will work on separate aspects of the autonomous vehicle software stack, contributing to a monolithic repository of that stack (a “monorepo”) that represents the best version of the system to date. They will also evaluate their work on datasets, in simulators, and on the real vehicle. The work will culminate in a final presentation outlining each group’s contributions toward the self-driving product.
Group members are expected to interact frequently with one another and with other groups, both in and out of the lab. In class, groups will regularly deliver design documents and presentations on their ideas, goals, plans, and progress. Feedback is another important component of working on a large team: individual students will give periodic peer reviews of other students’ performance, both within and outside their own group.
Software and Hardware
The hardware used in this course includes two vehicles, a GEM e2 and a GEM e4, each equipped with drive-by-wire capability and a sensor package. Detailed information on these vehicles’ hardware and driver stack can be found here. The GEM e4 is still in development this semester and is likely to be available for class use by February.
We will be using Git and GitHub for all software development. We recommend GitHub Desktop as a Git client and Visual Studio Code as an IDE, but you are free to use whatever development environment you like. Nearly all of the programming in the course will be done in Python 3.7+, and some components of the course will use ROS Noetic. Useful libraries include NumPy, SciPy, PyTorch, Shapely, CasADi, and Klampt.
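As a quick sanity check before starting development, a script along the following lines confirms that your interpreter and the libraries above are in place. This is a sketch assuming the standard distributions of these packages; adjust it to your own setup.

    # Environment sanity check: verifies the Python version and that the
    # course libraries import cleanly. Adjust to your own setup.
    import sys
    assert sys.version_info >= (3, 7), "course code assumes Python 3.7+"

    import numpy    # array math
    import scipy    # numerical routines
    import torch    # PyTorch, for learned components
    import shapely  # 2D geometric queries
    import casadi   # optimization and optimal control
    import klampt   # robot modeling and visualization

    print("Python", sys.version.split()[0], "- all course libraries import OK")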
The main GitHub repository is found here. The course branch is s2024, and your group will have its own development branch, s2024_groupX. Your team’s contributions should be tested on your team’s branch. Once you have presented your work in class and obtained approval to contribute to the class system, you may submit a pull request to the s2024 branch.
Course staff have several SSDs that can be directly connected to the GEM computer. The base image on these drives contains the drivers and software sufficient to run basic demonstrations on the GEM vehicle. You can ask course staff to wipe a drive back to the base image at any time.
Groups
Students will self-organize into groups of at least 4 and no more than 10 students, each working on a sub-area of autonomous vehicle system development. Each group will nominate a Lead who acts as the manager of the team. The Lead will handle scheduling and organizational issues, have the final say in project approval and staffing, and review group members’ contributions to the effort. Non-Leads will in turn review the Lead’s performance in managing the group. Students can change groups with approval from the group Leads involved, and the Lead role can also change hands during the semester. The instructors should be notified of any changes in group composition.
Each group will choose a thematic area and establish projects within that area. Projects should be chosen considering:
- Alignment with product goals
- Reasonable scope to be accomplished within the semester timeframe
- Specific, measurable intermediate and final goals
- Coordination with the efforts of other teams
Students will present their proposed projects through design documents and presentations to receive feedback from the rest of the class. Projects can and will change throughout the semester, and the design documents and presentations will serve as a record of how the development process evolved.
Theme areas and potential projects may include, but are not limited to:
- Hardware / dynamics: sensor calibration; steering and driving dynamics calibration; geometry calibration; tuning of trajectory tracking and actuation limits around curves; surface- and weather-conditioned dynamics models for planning; getting the GEM e4 and Pacifica operational with the software stack.
- Vision: detector training on the KITTI and Waymo datasets; 3D agent and obstacle detection; sign detection; dynamic lane detection; weather and road surface estimation.
- Estimation: improved state estimation via sensor fusion and filtering; agent motion prediction; roadgraph-conditioned agent motion prediction; determination of lead-vehicle and pass/yield relationships.
- SW infra: continuous integration (CI) testing; linters at code check-in; log management; visualization of internals with Gazebo or Foxglove; map visualization and editing; A/B testing; parallel execution.
- Simulation: road graph and scene creation; agent simulation with IDM or learned models; integration with CARLA or Gazebo; sensor simulation.
- Reasoning: road graph import from OpenStreetMap; dynamic routers that accept targets; driving logic for stop signs, stop lights, and lane changes.
- Planning: motion primitive selection planning; trajectory optimization; agent and obstacle avoidance; crosswalk / stop sign / stop light logic; lane-change planning; free-form parking lot maneuvers.
Tentative Schedule
- Week 1-2: Introduction to system engineering
Lecture topics: Course introduction, logistics overview, group formation, hardware and software infrastructure. System integration techniques, ROS. Coordinate transforms.
Deadlines: Course survey. Scheduling lab time. HW1.
- Week 3-4: Vehicle dynamics and control
Lecture topics: Vehicle models, simulation, PID control. Pure pursuit trajectory tracking (a minimal sketch follows this schedule). Trajectory representations, continuity. Software engineering best practices.
Deadlines: Safety Driver training. HW2.
- Week 5-6: 3D vision and knowledge representation
Lecture topics: Sensor and camera models. 3D scenes and geometric queries. Sensor calibration. Object recognition and segmentation. Working with vision datasets.
Deadlines: HW3. Group area identification and project brainstorming.
- Week 7-8: Motion planning and model predictive control
Lecture topics: Motion primitive planning, state lattice planning, trajectory optimization, model predictive control, real-time considerations.
Deadlines: Checkpoint 1. Design document, design review presentation. Peer reviews.
- Week 9-10: State estimation and trajectory prediction
Lecture topics: Probabilistic filtering, Kalman filter and its variants. Monte Carlo methods and particle filtering. System ID and trajectory prediction.
Deadlines: Checkpoint 2. Design document revision, progress report presentation. Peer reviews.
- Week 11-12: Learning in autonomous vehicles
Lecture topics: TBD
Deadlines: Integration Checkpoint. Progress report presentation. Peer reviews.
- Week 13-14: Handling uncertainty in autonomous vehicles
Lecture topics: TBD
Deadlines: Checkpoint 4. Design review presentation. Peer reviews.
- Week 15: Wrap up
Lecture topics: TBD
Deadlines: Final pitch presentation. Code contribution review. Peer reviews.
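For a concrete taste of the weeks 3-4 material, below is a minimal pure pursuit steering step in Python; it also exercises the 2D coordinate transforms from weeks 1-2. This is a sketch only: the lookahead distance and wheelbase values are illustrative placeholders rather than the GEM’s calibrated parameters, and a real tracker would also handle speed control and path interpolation.

    import numpy as np

    def pure_pursuit_steer(pose, path, lookahead=4.0, wheelbase=1.75):
        # pose = (x, y, heading) in the map frame; path = Nx2 array of waypoints.
        # lookahead (m) and wheelbase (m) are illustrative values, not GEM specs.
        x, y, theta = pose
        # Pick the first waypoint at least `lookahead` meters away,
        # falling back to the final waypoint near the end of the path.
        dists = np.hypot(path[:, 0] - x, path[:, 1] - y)
        ahead = np.nonzero(dists >= lookahead)[0]
        tx, ty = path[ahead[0]] if ahead.size else path[-1]
        # Express the target point in the vehicle frame (a 2D coordinate transform).
        dx, dy = tx - x, ty - y
        local_y = -np.sin(theta) * dx + np.cos(theta) * dy
        d = np.hypot(dx, dy)
        # Pure pursuit law for a bicycle model: curvature = 2*y_local/d^2,
        # steering angle = atan(wheelbase * curvature).
        return np.arctan(2.0 * wheelbase * local_y / max(d * d, 1e-6))

Called at a fixed rate with the latest state estimate and waypoint path, a step like this forms the core of a basic trajectory tracking loop.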