Team LastMinute's Capstone Project for Udacity Self-Driving Car Nanodegree
Team LastMinute

Team members:

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.

Project Description

Node implementation

Waypoint Updater

In order for the car to safely and successfully navigate the track, the car must come to a stop when reaching red lights. The waypoint updater node helps accomplish this through adjustments to the target velocities at the waypoints leading up to a red light.

The node takes in the track waypoints, continuously updated data about the car's pose and velocity, and updates on the state of upcoming traffic lights. When an upcoming red light is detected, a set number of waypoints before (25) and after (2) the stop line have their target velocities adjusted so that the car will come to a complete stop at the traffic light stop line. Each waypoint from the first adjusted waypoint up to the stop line is given a linearly decreasing target velocity, so that the stop-line waypoint has a velocity of 0.

Through this waypoint update and the vehicle's controller, the car is able to come to a complete stop at the traffic light stop point.
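
The velocity ramp described above can be sketched as follows (the function name and exact ramp shape are illustrative, not the node's actual API):

```python
def decelerate_waypoints(base_velocities, stop_idx, n_before=25, n_after=2):
    """Linearly ramp target velocities down to zero at the stop-line
    waypoint, and zero out a couple of waypoints past it."""
    velocities = list(base_velocities)
    start = max(stop_idx - n_before, 0)
    v0 = velocities[start]
    span = max(stop_idx - start, 1)
    for i in range(start, stop_idx + 1):
        # Fraction of the ramp remaining at waypoint i (1.0 at the
        # start of the ramp, 0.0 at the stop line)
        frac = (stop_idx - i) / float(span)
        velocities[i] = min(velocities[i], v0 * frac)
    for i in range(stop_idx + 1, min(stop_idx + 1 + n_after, len(velocities))):
        velocities[i] = 0.0
    return velocities
```

Taking the minimum against the original target velocity ensures the ramp only ever slows the car down, never speeds it up.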

DBW

The Drive by Wire (DBW) node takes in commands from upstream services in the form of ROS Twist commands. The commands are essentially the desired car velocity (in m/s) in the axial (or forward) direction of the car, and the desired angular velocity (in rad/s) to assist in steering.

The commands are interpreted by the DBW node to output:

  • Throttle setting (between 0 and 1)
  • Steering angle (in radians, 0 is straight)
  • Braking (in Newton meters, must be positive)

These three output values are transmitted to the vehicle's control hardware and are actuated by either the simulator or the self-driving car.
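
The output ranges above can be enforced with a simple clamp; this is an illustrative sketch, and max_steer is an assumed steering limit in radians, not a value from the project:

```python
def clamp_outputs(throttle, steer, brake_nm, max_steer=8.2):
    """Clamp DBW outputs to their valid ranges before publishing."""
    throttle = min(max(throttle, 0.0), 1.0)        # throttle in [0, 1]
    steer = min(max(steer, -max_steer), max_steer)  # steering in radians
    brake_nm = max(brake_nm, 0.0)                   # brake torque must be positive
    return throttle, steer, brake_nm
```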

Low Pass Filter

The sensed current velocity had a significant amount of high-frequency noise in simulator version 1.2. This required the sensed velocity to be passed through a low pass filter, which introduces a slight lag but effectively eliminates the noise. The low pass filter used the following settings:

  • hz = 0.0001 # cutoff frequency of 0.0001 Hz
  • tau = 1. / (hz * 2 * pi) ≈ 1592
  • s = 50.

It should be noted that the simulator version 1.3 seems to have much lower noise from the sensors, but the low pass filter above works for this version also.
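
A first-order low pass filter with the settings above can be sketched as follows, in the style of the project's lowpass.py (the exact update rule and class shape here are assumptions):

```python
from math import pi

class LowPassFilter:
    """Exponential smoothing: blends each new sample with the
    previous filtered value according to tau and the sample time."""
    def __init__(self, tau, ts):
        self.a = 1.0 / (tau / ts + 1.0)          # weight on the new sample
        self.b = (tau / ts) / (tau / ts + 1.0)   # weight on the old value
        self.last_val = 0.0
        self.ready = False

    def filt(self, val):
        if self.ready:
            val = self.a * val + self.b * self.last_val
        else:
            # First sample passes through unfiltered
            self.ready = True
        self.last_val = val
        return val

hz = 0.0001
tau = 1.0 / (hz * 2 * pi)        # ≈ 1592
velocity_filter = LowPassFilter(tau, ts=50.0)
```

With such a large tau relative to the sample time, the filter weights the running value heavily, which is what suppresses the high-frequency velocity noise.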

Throttle PID Controller

The throttle is controlled using a PID system. The desired speed-to-throttle response was modelled with a transfer function given by:

0.0106 / (s^2 + 0.309 s + 0.0006874)

The transfer function was used in a feedback control analysis to determine the best coefficients for the PID system as follows:

  • kP = 0.724
  • kI = 0.002036
  • kD = 0.0
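
A minimal PID step using the gains above can be sketched as follows (the class shape and the clamp of the output to the [0, 1] throttle range are assumptions in the style of the project's pid.py):

```python
class PID:
    """Basic PID controller: proportional, integral, and derivative
    terms on the velocity error, clamped to the output range."""
    def __init__(self, kp, ki, kd, mn=0.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.mn, self.mx = mn, mx
        self.int_val = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        self.int_val += error * dt
        deriv = (error - self.last_error) / dt
        self.last_error = error
        val = self.kp * error + self.ki * self.int_val + self.kd * deriv
        return min(max(val, self.mn), self.mx)

throttle_pid = PID(kp=0.724, ki=0.002036, kd=0.0)
```
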

Steering Controller

The steering is controlled with a project-supplied yaw controller Python file that initializes with properties of the car, including the wheel base, steering ratio, and minimum and maximum steering angles and accelerations. This controller is similar to direct proportional control, where the steering angle is adjusted according to the difference between the desired angle and the current steering angle.
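
The yaw-controller behaviour can be sketched as follows; the wheel_base and steer_ratio values are the simulator defaults and are assumptions here, as is the function shape:

```python
from math import atan

def get_steering(linear_v, angular_v, current_v,
                 wheel_base=2.8498, steer_ratio=14.8):
    """Convert a commanded yaw rate into a steering-wheel angle by
    steering along the implied turning radius."""
    if abs(current_v) < 0.1 or abs(angular_v) < 1e-6:
        return 0.0
    # Scale the commanded yaw rate to the current speed
    angular_v = current_v * angular_v / linear_v
    radius = current_v / angular_v
    # Bicycle-model steering angle, scaled by the steering ratio
    return atan(wheel_base / radius) * steer_ratio
```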

Braking Controller

The braking command is expected to be in units of torque (Nm). The controller will engage the brake only if the throttle is below a minimum threshold (here 0.1) and the desired velocity is lower than the current velocity. This prevents the brakes from being applied at the same time as the throttle, which may cause adverse driving behaviour and unnecessary wear on the brake pads.

The braking force required was calculated using Newton's second law: the known mass of the vehicle, plus the mass of fuel in the gas tank (converted to kg), multiplied by the desired deceleration.

The braking torque was then calculated as the braking force multiplied by the wheel radius. This braking torque must be positive and was transmitted as a braking command to the vehicle actuators.
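
The calculation above can be sketched as follows; the vehicle mass, fuel capacity, wheel radius, and gas density values are the simulator defaults and are assumed here:

```python
GAS_DENSITY = 2.858  # kg per US gallon

def brake_torque(decel, vehicle_mass=1736.35, fuel_capacity=13.5,
                 wheel_radius=0.2413):
    """Newton's second law: force = total mass * deceleration,
    then torque = force * wheel radius."""
    total_mass = vehicle_mass + fuel_capacity * GAS_DENSITY
    force = total_mass * abs(decel)
    return force * wheel_radius  # Nm, always positive
```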

Performance of the braking and acceleration limits could be further tuned in the future. It is likely that all controllers would need to be tuned for the Udacity Self Driving Car - Carla, as the simulator parameters may not reflect reality.

Traffic Light Detection

When approaching a traffic light, the vehicle needs to recognize the color of the light and act accordingly. The traffic light detection node uses a neural network classifier to determine if there is a light, what state it is in, and then chooses the correct stopping position for the light from a predetermined list of positions.

In order to do this, it takes in continuously updated feeds of the vehicle's pose and images from the car's cameras. This is combined with known information about the waypoints along the road and the stopping positions for each stop light. With this information, the traffic light detection node tries to find the closest traffic light stop point in view; if it finds one, it classifies the light's state and returns both the nearest stop point and the traffic light state.
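
A simplified sketch of the stop-point search, using straight-line distance and hypothetical names (the actual node works with waypoint indices and the car's heading):

```python
import math

def closest_stop_line(car_xy, stop_lines, max_range=100.0):
    """Return the index of the nearest stop-line position within
    range of the car, or None if no stop line is close enough."""
    best, best_d = None, max_range
    for i, (sx, sy) in enumerate(stop_lines):
        d = math.hypot(sx - car_xy[0], sy - car_xy[1])
        if d < best_d:
            best, best_d = i, d
    return best
```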

Additional Notes on Traffic Light Detection

Native Installation

  • Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.

  • If using a Virtual Machine to install Ubuntu, use the following configuration as a minimum:

    • 2 CPU
    • 2 GB system memory
    • 25 GB of free hard drive space

    The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.

  • Follow these instructions to install ROS

  • Dataspeed DBW

  • Download the Udacity Simulator.

Docker Installation

Install Docker

Build the docker container

docker build . -t capstone

Run the docker file

docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone

Usage

  1. Clone the project repository
git clone https://github.com/udacity/CarND-Capstone.git
  2. Install python dependencies
cd CarND-Capstone
pip install -r requirements.txt
  3. Make and run styx
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
  4. Run the simulator

Real world testing

  1. Download the training bag that was recorded on the Udacity self-driving car (a bag demonstrating the correct predictions in autonomous mode can be found here)
  2. Unzip the file
unzip traffic_light_bag_files.zip
  3. Play the bag file
rosbag play -l traffic_light_bag_files/loop_with_traffic_light.bag
  4. Launch your project in site mode
cd CarND-Capstone/ros
roslaunch launch/site.launch
  5. Confirm that traffic light detection works on real life images
