mp1: Camera-based lane detection module in bird's-eye view. http://publish.illinois.edu/safe-autonomy/files/2022/02/ECE484_MP1_SP2022.pdf
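The core of a bird's-eye-view module is a perspective (homography) transform that maps the camera's trapezoidal view of the road onto a top-down rectangle. The sketch below is illustrative, not MP1's actual code: it solves for the homography from four hand-picked point pairs with the standard DLT linear system (in practice OpenCV's `getPerspectiveTransform`/`warpPerspective` would do this), and the corner coordinates are made-up examples.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H mapping src -> dst (4 point pairs)
    via the direct linear transform: stack two equations per pair and take
    the null-space vector from the SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, x, y):
    """Map one pixel through the homography (divide out the scale)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical calibration: road trapezoid in the image -> top-down rectangle.
src = [(200, 720), (1100, 720), (595, 450), (685, 450)]
dst = [(300, 720), (980, 720), (300, 0), (980, 0)]
H = homography_from_points(src, dst)
```

Once `H` is known, warping every pixel (or just the detected lane pixels) with it yields the bird's-eye view in which lane lines are near-parallel and easy to fit.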
mp2: Waypoint-tracking controller. http://publish.illinois.edu/safe-autonomy/files/2022/02/ECE484_MP2_SP2022.pdf
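A common waypoint-tracking controller for a car-like vehicle is pure pursuit: steer so the rear axle follows an arc through the lookahead waypoint. The sketch below is a generic pure-pursuit law, not necessarily the controller MP2 specifies; the wheelbase value is an assumed placeholder.

```python
import math

def pure_pursuit_steer(x, y, yaw, waypoint, wheelbase=1.75):
    """Pure-pursuit steering command.

    alpha = bearing to the waypoint in the vehicle frame,
    delta = atan(2 * L * sin(alpha) / lookahead_distance).
    wheelbase is an illustrative value, not a real vehicle parameter.
    """
    dx = waypoint[0] - x
    dy = waypoint[1] - y
    ld = math.hypot(dx, dy)              # lookahead distance to the waypoint
    alpha = math.atan2(dy, dx) - yaw     # heading error toward the waypoint
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)
```

A waypoint dead ahead yields zero steering; waypoints to the left/right of the heading yield positive/negative steering angles, which is the sign convention assumed here.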
mp3: Localize the robot inside a building using a particle filter. http://publish.illinois.edu/safe-autonomy/files/2022/03/ECE484_MP3_SP2022-3.pdf
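A particle filter repeats three steps: predict each particle through the motion model, reweight by the measurement likelihood, and resample in proportion to the weights. The sketch below is a minimal 1-D version for illustration only; MP3's filter is 2-D with a lidar/map measurement model, and the noise values here are assumptions.

```python
import math
import random

def particle_filter_step(particles, control, measurement, motion_noise=0.1):
    """One predict/update/resample cycle for 1-D localization.

    particles   : list of floats (hypothesized positions)
    control     : commanded displacement this step
    measurement : (measured_position, sensor_std) -- stand-in for matching
                  a sensor scan against the building map
    """
    meas, std = measurement
    # Predict: propagate each particle through the motion model plus noise.
    moved = [p + control + random.gauss(0.0, motion_noise) for p in particles]
    # Update: weight each particle by a Gaussian measurement likelihood.
    weights = [math.exp(-0.5 * ((p - meas) / std) ** 2) for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a fresh particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(moved))
```

Iterating this step concentrates the particle cloud around the true position as measurements accumulate.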
mp4: Design a basic autonomous-agent pipeline with perception, decision-and-planning, and low-level control modules, so that the agent can perform lane tracking on a race track while avoiding obstacles (other stationary vehicles). http://publish.illinois.edu/safe-autonomy/files/2022/03/ECE484_MP4_SP2022.pdf
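Structurally, such a pipeline is a loop of perceive → decide → control. The skeleton below only illustrates that data flow; every name, threshold, and gain is a hypothetical stand-in, not MP4's actual interface (there, perception would be the MP1 lane detector and control the MP2 tracker).

```python
from dataclasses import dataclass

@dataclass
class LaneEstimate:
    lateral_offset: float   # meters from lane center (illustrative convention)
    obstacle_ahead: bool    # obstacle detected in the current lane

def perceive(sensor_frame):
    """Perception stub: stands in for lane + obstacle detection."""
    return LaneEstimate(*sensor_frame)

def decide(est):
    """Decision/planning: change lanes around a blocking obstacle,
    otherwise keep tracking the current lane."""
    return "lane_change" if est.obstacle_ahead else "lane_keep"

def control(est, behavior, k_p=0.8):
    """Low-level control: proportional steering on lateral offset; the
    target is biased to the adjacent lane during a lane change.
    Gain and lane width are placeholder values."""
    target = 3.5 if behavior == "lane_change" else 0.0
    return k_p * (target - est.lateral_offset)

def pipeline_step(sensor_frame):
    est = perceive(sensor_frame)
    behavior = decide(est)
    return behavior, control(est, behavior)
```

Keeping the three stages behind narrow interfaces like this lets each module (detector, planner, controller) be developed and swapped independently.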
Final project: self-driving GEM vehicle
1. Analyzed radar, lidar, camera, and GPS signals to achieve robust vision-based lane tracking and intelligent stopping in a real-world environment.
2. Designed a combination of image-processing filters so the vehicle follows the lane under different weather and traffic conditions. Utilized a modified YOLOv3 fused with lidar data to detect stop signs, traffic lights, and pedestrians so the vehicle acts safely.
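"Combination of filters" typically means OR-ing several binary masks, e.g. a brightness filter for lane paint with a gradient filter for lane edges, so that at least one filter fires in each lighting condition. The sketch below is a generic illustration of that idea, not the project's tuned pipeline; both thresholds are made-up values.

```python
import numpy as np

def lane_mask(gray, bright_thresh=200, grad_thresh=40):
    """Combine a brightness filter (bright lane paint) with a horizontal
    intensity-gradient filter (lane edges); a pixel passes if either fires.
    Thresholds are illustrative placeholders."""
    bright = gray >= bright_thresh
    # Central-difference horizontal gradient (borders left at zero).
    gx = np.zeros(gray.shape, dtype=float)
    gx[:, 1:-1] = np.abs(gray[:, 2:].astype(float) - gray[:, :-2].astype(float)) / 2.0
    edges = gx >= grad_thresh
    return bright | edges
```

In varied weather, the gradient filter still catches lane edges when glare or shadow defeats the fixed brightness threshold, which is why the masks are combined rather than used alone.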