Polaris GEM e4 Electric Self-Driving Vehicle
The vehicle's autonomy software stack spans the following modules:
- Sensing & Preprocessing
- Perception
- Localization & Mapping (SLAM/V-SLAM)
- Planning
- Dynamics & Control
- Vehicle Interface & PACMod2
- Simulation
From January to March 2024, I led the integration of sensors on this electric self-driving vehicle, including LiDAR, camera, GNSS/IMU, and 4D automotive radar, and configured each sensor through its vendor SDK. The vehicle's software stack supports both ROS Noetic (ROS 1) and ROS 2 Humble.
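During sensor bring-up, a small ROS 2 node is a handy sanity check that every driver is actually publishing. The sketch below is illustrative only: the topic names (`/lidar/points`, `/camera/image_raw`, `/gnss/fix`, `/imu/data`, `/radar/points`) are placeholders, not the topics configured on the GEM e4, which depend on each vendor's driver.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, Imu, NavSatFix, PointCloud2


class SensorCheck(Node):
    """Logs a throttled message whenever each sensor topic produces data."""

    def __init__(self):
        super().__init__('sensor_check')
        # Topic names below are placeholders; the real names come from each
        # vendor's driver configuration.
        topics = [
            (PointCloud2, '/lidar/points', 'lidar'),
            (Image, '/camera/image_raw', 'camera'),
            (NavSatFix, '/gnss/fix', 'gnss'),
            (Imu, '/imu/data', 'imu'),
            (PointCloud2, '/radar/points', 'radar'),
        ]
        self._subs = [
            self.create_subscription(msg, topic, self._make_cb(name), 10)
            for msg, topic, name in topics
        ]

    def _make_cb(self, name):
        def cb(_msg):
            # Throttle so the console is not flooded at full sensor rates.
            self.get_logger().info(f'{name}: receiving data',
                                   throttle_duration_sec=5.0)
        return cb


def main():
    rclpy.init()
    rclpy.spin(SensorCheck())


if __name__ == '__main__':
    main()
```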
The ultimate goal of my work at the Center for Autonomy is to develop an autonomous vehicle with Level 3/4 capabilities and a corresponding software stack. I plan to begin by deploying the open-source Autoware.Auto software on the vehicle, then further develop, customize, and refine key modules such as perception, localization and mapping, planning, and control. Stay tuned for future updates!
3D Object Detection and Tracking with TI Automotive 4D Imaging Radar
My Ph.D. thesis focuses on using TI FMCW (frequency-modulated continuous-wave) millimeter-wave imaging radar evaluation modules to capture raw radar cube data and perform real-time signal processing for 3D object detection and tracking.
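To make the processing chain concrete, here is a minimal offline sketch of the standard FMCW pipeline: a range FFT along fast time, a Doppler FFT along slow time, non-coherent integration across receive antennas, and a 1D cell-averaging CFAR detector. The cube layout, window choice, and CFAR parameters are assumptions for illustration; the real-time implementation on the TI evaluation modules is considerably more involved, and angle estimation, clustering, and tracking are omitted here.

```python
import numpy as np


def process_radar_cube(cube):
    """Minimal FMCW chain on a raw radar cube.

    Assumed cube layout: (num_chirps, num_rx, num_adc_samples); the actual
    layout from the TI EVMs depends on the capture configuration.
    """
    # 1) Range FFT along fast time (ADC samples), windowed to reduce leakage.
    win = np.hanning(cube.shape[-1])
    rng = np.fft.fft(cube * win, axis=-1)
    # 2) Doppler FFT along slow time (chirps); shift zero velocity to center.
    rd = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)
    # 3) Non-coherent integration across RX antennas -> range-Doppler power map.
    rd_map = (np.abs(rd) ** 2).sum(axis=1)
    return rd, rd_map


def ca_cfar_1d(power, guard=2, train=8, scale=4.0):
    """1D cell-averaging CFAR; returns a boolean detection mask.

    guard/train/scale are illustrative values, not tuned parameters.
    """
    n = power.shape[0]
    mask = np.zeros(n, dtype=bool)
    for cut in range(guard + train, n - guard - train):
        # Average the training cells on both sides, skipping the guard cells.
        lead = power[cut - guard - train:cut - guard]
        trail = power[cut + guard + 1:cut + guard + 1 + train]
        noise = (lead.sum() + trail.sum()) / (2 * train)
        mask[cut] = power[cut] > scale * noise
    return mask
```

Running `ca_cfar_1d` over each Doppler row of `rd_map` yields range-Doppler detections, which would then feed angle-of-arrival estimation across the RX channels and a tracker.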
Real-Time Control Module for HIL
I developed a real-time control module (analogous to a simplified dSPACE MicroAutoBox or Speedgoat target machine) based on TI C2000 microcontrollers for hardware-in-the-loop (HIL) applications in electric vehicles, communicating over both CAN and Ethernet. The architecture assigns one CPU/CLA pair to low-level control tasks such as steering actuation and motor control of the throttle and brake pedals. The other CPU/CLA pair runs higher-level control algorithms, including path-tracking methods such as pure pursuit and Stanley, as well as optimal-control techniques such as LQR and MPC.
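For reference, the pure pursuit law mentioned above computes the steering angle as delta = atan(2 L sin(alpha) / l_d), where L is the wheelbase, alpha the heading error to a lookahead point on the path, and l_d the lookahead distance. The Python sketch below illustrates the geometry only; on the C2000 target the equivalent logic runs as C on the CPU/CLA, and the wheelbase and lookahead values here are placeholders, not calibrated vehicle parameters.

```python
import numpy as np


def pure_pursuit_steer(pose, path, lookahead=3.0, wheelbase=1.75):
    """Pure pursuit steering command.

    pose: (x, y, yaw) of the rear axle in the map frame.
    path: (N, 2) array of waypoints.
    lookahead and wheelbase are placeholder values in meters.
    """
    x, y, yaw = pose
    # Pick the first waypoint at least one lookahead distance away;
    # fall back to the last waypoint near the end of the path.
    dist = np.hypot(path[:, 0] - x, path[:, 1] - y)
    ahead = np.where(dist >= lookahead)[0]
    tx, ty = path[ahead[0]] if ahead.size else path[-1]
    # Heading error from the vehicle's yaw to the target point.
    alpha = np.arctan2(ty - y, tx - x) - yaw
    # Pure pursuit law: delta = atan(2 L sin(alpha) / l_d).
    return np.arctan2(2.0 * wheelbase * np.sin(alpha), lookahead)
```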