Unscented Kalman Filter

This project is the final project in Udacity's sensor fusion nanodegree. This project implements an Unscented Kalman Filter to estimate the state of multiple cars on a highway using noisy lidar and radar measurements. The project obtains RMSE values that are lower than the tolerance outlined in the project specification below.

Highway Demo

Before

The following animated GIF shows the highway simulation before the Unscented Kalman Filter was implemented. The RMSE values are shown in red, indicating that they exceed their respective threshold values.

After

The following animated GIF shows the highway simulation after implementing the Unscented Kalman Filter. No RMSE values are shown in red, indicating that they all stay below their respective threshold values.

The above GIF is an extract of a few seconds. This MP4 file shows the full 30-second execution of the program.

Accuracy

The px, py, vx, vy output coordinates must have an RMSE <= [0.30, 0.16, 0.95, 0.70] after running for longer than 1 second.

The simulation collects the position and velocity values that the UKF algorithm outputs and compares them to the ground truth data. The px, py, vx, and vy RMSE values should be less than or equal to [0.30, 0.16, 0.95, 0.70] after the simulator has run for longer than 1 second. The simulator also displays an error message if the RMSE values surpass the thresholds.
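For reference, the RMSE is computed per component over all timesteps by comparing the accumulated state estimates against the ground-truth values. The following is a minimal sketch of that calculation using Eigen; it is illustrative and not the project's own implementation:

    #include <vector>
    #include "Eigen/Dense"

    // Illustrative helper: element-wise root mean squared error between the
    // accumulated [px, py, vx, vy] estimates and the ground-truth vectors.
    Eigen::VectorXd CalculateRMSE(const std::vector<Eigen::VectorXd> &estimations,
                                  const std::vector<Eigen::VectorXd> &ground_truth)
    {
        Eigen::VectorXd rmse = Eigen::VectorXd::Zero(4);

        if (estimations.empty() || estimations.size() != ground_truth.size())
        {
            return rmse;
        }

        // Accumulate the squared residuals over all timesteps
        for (size_t i = 0; i < estimations.size(); ++i)
        {
            Eigen::VectorXd residual = estimations[i] - ground_truth[i];
            residual = residual.array() * residual.array();
            rmse += residual;
        }

        // Take the mean, then the element-wise square root
        rmse = rmse / static_cast<double>(estimations.size());
        rmse = rmse.array().sqrt();
        return rmse;
    }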

I added a simple private function in highway.h to write the RMSE results to a CSV file, allowing a detailed performance analysis:

    void WriteToFile(VectorXd rmse_data)
    {
        const string filename = "../../results/ukf_performance_results.csv";
        const string separator = ", ";
        ofstream results_file;
        results_file.open(filename, ios_base::app);
        results_file << rmse_data[0] << separator << rmse_data[1] << separator << rmse_data[2] << separator << rmse_data[3] << endl;
    }

I then imported the CSV data into a spreadsheet and used that to create the graphs used in this analysis.

The following graph shows the output coordinates before implementing the UKF functions. In this case, the values are clearly well above the RMSE thresholds.

The following graph shows the effect of implementing the UKF functions: the output coordinates stay well below the RMSE thresholds after 1 second.

And here are the two graphs again, for side-by-side comparison:

Before implementing UKF functions | With UKF functions

Comparing results when only one sensor is used

As expected, sensor fusion gives better results than using only one sensor type. For comparison, the following two graphs show the results when using only radar or only lidar.

There are two flags in ukf.cpp that can be toggled to enable or disable the sensors:

    // LiDAR measurements will be ignored if this is false
    use_laser_ = true;

    // RADAR measurements will be ignored if this is false
    use_radar_ = true;
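A natural place to honour these flags is at the top of ProcessMeasurement(), returning early when the incoming sensor type is disabled. The following is a sketch; the MeasurementPackage field names (sensor_type_ and the LASER/RADAR enum values) follow the Udacity starter code, and the exact placement of these checks in ukf.cpp may differ:

    void UKF::ProcessMeasurement(MeasurementPackage measurement_package)
    {
        // Ignore measurements from a sensor that has been switched off
        if (measurement_package.sensor_type_ == MeasurementPackage::LASER && !use_laser_)
        {
            return;
        }
        if (measurement_package.sensor_type_ == MeasurementPackage::RADAR && !use_radar_)
        {
            return;
        }

        // ... prediction and update steps follow ...
    }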

Comparing results when only using Radar

By the time 3 seconds have passed, all of the RMSE values have exceeded their thresholds.

Comparing results when only using Lidar

Lidar alone performs much better than radar alone. Vx is consistently above its RMSE threshold, but all of the other RMSE values are below their respective thresholds. However, the performance of lidar alone is still not as good as both sensors together.

Code for the Unscented Kalman Filter

The Unscented Kalman Filter is implemented in the class UKF. The header file for UKF is ukf.h. The following extract shows the public interface of the UKF class:

class UKF
{
public:
    /**
     * Constructor
     */
    UKF();

    /**
     * Destructor
     */
    virtual ~UKF();

    /**
     * ProcessMeasurement
     * @param measurement_package The latest measurement data of either radar or laser
     */
    void ProcessMeasurement(MeasurementPackage measurement_package);

    /**
     * Prediction Predicts sigma points, the state, and the state covariance
     * matrix
     * @param delta_t Time between k and k+1 in s
     */
    void Prediction(double delta_t);

    /**
     * Updates the state and the state covariance matrix using a laser measurement
     * @param measurement_package The measurement at k+1
     */
    void UpdateLidar(const MeasurementPackage & measurement_package);

    /**
     * Updates the state and the state covariance matrix using a radar measurement
     * @param measurement_package The measurement at k+1
     */
    void UpdateRadar(const MeasurementPackage & measurement_package);

    /**
     *
     * @return state vector: [pos1 pos2 vel_abs yaw_angle yaw_rate] in SI units and rad
     */
    const Eigen::VectorXd & State() const;

    ...
};
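As a hedged usage sketch, a caller constructs a UKF, packages each sensor reading into a MeasurementPackage, and hands it to ProcessMeasurement(); the fused estimate is then available through State(). The MeasurementPackage field names below (sensor_type_, timestamp_, raw_measurements_) follow the Udacity starter code, and the measurement values are placeholders:

    UKF ukf;

    // Package a single lidar reading (example values)
    MeasurementPackage meas_package;
    meas_package.sensor_type_ = MeasurementPackage::LASER;
    meas_package.timestamp_ = 0;                       // microseconds
    meas_package.raw_measurements_ = Eigen::VectorXd(2);
    meas_package.raw_measurements_ << 5.0, 1.5;        // measured px, py in meters

    // Run one predict/update cycle
    ukf.ProcessMeasurement(meas_package);

    // Read back the estimated state: [pos1 pos2 vel_abs yaw_angle yaw_rate]
    const Eigen::VectorXd &state = ukf.State();
    double px = state(0);
    double py = state(1);
    double v  = state(2);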

Internally, the UKF implementation uses a number of private functions to help with the prediction workload:

...

private:
    Eigen::MatrixXd AugmentedSigmaPoints() const;

    void PredictSigmaPoints(Eigen::MatrixXd *Xsig_pred, double delta_t, const Eigen::MatrixXd & Xsig_aug);

    void PredictMeanAndCovariance();

...
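A plausible sketch of how Prediction() could chain these helpers is shown below; the predicted sigma point matrix name Xsig_pred_ is an assumption, and the actual control flow in ukf.cpp may differ:

    void UKF::Prediction(double delta_t)
    {
        // 1. Generate the augmented sigma points from the current state,
        //    covariance, and process-noise parameters.
        Eigen::MatrixXd Xsig_aug = AugmentedSigmaPoints();

        // 2. Propagate each sigma point through the process model over delta_t.
        PredictSigmaPoints(&Xsig_pred_, delta_t, Xsig_aug);

        // 3. Recombine the predicted sigma points into the predicted state
        //    mean and covariance.
        PredictMeanAndCovariance();
    }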

The main() function in main.cpp constructs the highway and launches the viewer.

int main()
{
    pcl::visualization::PCLVisualizer::Ptr viewer(new pcl::visualization::PCLVisualizer("3D Viewer"));
    viewer->setBackgroundColor(0, 0, 0);
    
    // set camera position and angle
    viewer->initCameraParameters();
    float x_pos = 0;
    viewer->setCameraPosition(x_pos - 26, 0, 15.0, x_pos + 25, 0, 0, 0, 0, 1);
    
    Highway highway(viewer);
    
    //initHighway(viewer);
    
    int frame_per_sec = 30;
    int sec_interval = 10;
    int frame_count = 0;
    int time_us = 0;
    
    double ego_velocity = 25;
    
    while(frame_count < (frame_per_sec * sec_interval))
    {
        viewer->removeAllPointClouds();
        viewer->removeAllShapes();
        
        //stepHighway(egoVelocity,time_us, frame_per_sec, viewer);
        highway.StepHighway(ego_velocity, time_us, frame_per_sec, viewer);
        viewer->spinOnce(1000 / frame_per_sec);
        frame_count++;
        time_us = 1000000 * frame_count / frame_per_sec;
    }
}

The Highway constructor creates the highway, including the simulated cars. The Highway constructor also creates an Unscented Kalman Filter for each car, e.g.,:

    UKF ukf1;
    car1.setUKF(ukf1);

The call to highway.StepHighway() advances the simulation by one frame: the traffic cars move, new noisy lidar and radar measurements are generated for each car, and each car's UKF processes those measurements.
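A simplified sketch of what such a step might do per car is shown below. The helper names SimulateLidar and SimulateRadar, the traffic container, and the car.ukf member access are hypothetical; the project's actual measurement generation lives in tools.cpp:

    // Hypothetical per-frame loop: for each traffic car, simulate noisy lidar
    // and radar measurements and feed them to that car's UKF.
    for (Car &car : traffic)
    {
        MeasurementPackage lidar_measurement = SimulateLidar(car, time_us);   // hypothetical helper
        car.ukf.ProcessMeasurement(lidar_measurement);

        MeasurementPackage radar_measurement = SimulateRadar(car, time_us);   // hypothetical helper
        car.ukf.ProcessMeasurement(radar_measurement);

        // Render the lidar sphere, the radar line, and the UKF estimate for this car
    }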

Building and running the project

The main program can be built and run by doing the following from the project top directory.

  1. mkdir build
  2. cd build
  3. cmake ..
  4. make
  5. ./ukf_highway

The code in main.cpp uses highway.h to create a straight 3-lane highway environment with 3 traffic cars and the main ego car at the center.

The viewer scene is centered around the ego car, and the coordinate system is relative to the ego car as well. The ego car is green while the other traffic cars are blue. The traffic cars accelerate and alter their steering to change lanes. Each of the traffic cars has its own UKF object, which is updated during every time step.

The red spheres above the cars represent the (x,y) lidar detections, and the purple lines show the radar measurements with the velocity magnitude along the detected angle. The Z axis is not taken into account for tracking, so tracking happens only along the X and Y axes.
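To make the radar description concrete, the radar update works in a [rho, phi, rho_dot] measurement space: range, bearing, and the velocity component along the detected angle. The following is a sketch of the standard mapping from the state vector into that space; it is illustrative and not an extract from ukf.cpp:

    #include <cmath>
    #include "Eigen/Dense"

    // Illustrative mapping of a state [pos1 pos2 vel_abs yaw_angle yaw_rate]
    // into radar measurement space [rho, phi, rho_dot].
    Eigen::VectorXd RadarMeasurementModel(const Eigen::VectorXd &state)
    {
        const double px  = state(0);
        const double py  = state(1);
        const double v   = state(2);
        const double yaw = state(3);

        const double vx = v * std::cos(yaw);
        const double vy = v * std::sin(yaw);

        const double rho = std::sqrt(px * px + py * py);                        // range
        const double phi = std::atan2(py, px);                                  // bearing
        const double rho_dot = (rho > 1e-6) ? (px * vx + py * vy) / rho : 0.0;  // range rate

        Eigen::VectorXd z(3);
        z << rho, phi, rho_dot;
        return z;
    }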

Dependencies

The project uses CMake and make for building, Eigen for matrix operations, and the Point Cloud Library (PCL) for visualization.

Basic Build Instructions

  1. Clone this repo.
  2. Make a build directory: mkdir build && cd build
  3. Compile: cmake .. && make
  4. Run it: ./ukf_highway

Code Style

This project uses Google's C++ style guide.

Generating Additional Data

If you'd like to generate your own radar and lidar data, modify the code in highway.h to alter the cars. Also check out tools.cpp to change how measurements are taken; for instance, lidar markers could be the (x,y) centers of bounding boxes obtained by scanning the PCD environment and performing clustering. This is similar to what was done in the Sensor Fusion Lidar Obstacle Detection project.
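As a rough illustration of that idea, the following sketch uses PCL's Euclidean clustering to turn a point cloud into a set of (x,y) cluster centroids that could serve as lidar markers. The tolerance and size limits are placeholder values, and this is not code from the project:

    #include <utility>
    #include <vector>
    #include <pcl/point_types.h>
    #include <pcl/point_cloud.h>
    #include <pcl/search/kdtree.h>
    #include <pcl/segmentation/extract_clusters.h>
    #include <pcl/common/centroid.h>

    // Cluster the scene cloud and use each cluster's centroid as an (x, y) marker.
    std::vector<std::pair<float, float>> ClusterCentroids(
        const pcl::PointCloud<pcl::PointXYZ>::Ptr &cloud)
    {
        pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
        tree->setInputCloud(cloud);

        std::vector<pcl::PointIndices> cluster_indices;
        pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
        ec.setClusterTolerance(0.5);   // meters (placeholder)
        ec.setMinClusterSize(10);      // placeholder
        ec.setMaxClusterSize(1000);    // placeholder
        ec.setSearchMethod(tree);
        ec.setInputCloud(cloud);
        ec.extract(cluster_indices);

        std::vector<std::pair<float, float>> markers;
        for (const pcl::PointIndices &indices : cluster_indices)
        {
            Eigen::Vector4f centroid;
            pcl::compute3DCentroid(*cloud, indices, centroid);
            markers.emplace_back(centroid[0], centroid[1]);   // keep (x, y) only
        }
        return markers;
    }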

