Autonomous Driving - Capstone Project

For my capstone project, I worked with a team to drive a miniature robot autonomously and avoid obstacles. I named the team "LightSpeed" and created our logo (on the right). Near the end of the project, we competed in a hackathon hosted at ASU and organized by Amazon. The team worked very hard and persisted through difficulties during the hackathon, and in the end, it paid off: our team clocked the fastest lap time around the track and took first place. We enjoyed applying machine learning to the robot, and it was incredibly rewarding to show it off and see our hard work in action.

Autonomous Driving

The first part of the challenge was to drive the robot autonomously through a miniature race track. The team with the best lap time around the track won the most points for this category. The bot had a front-facing camera, so to drive it, our team used computer vision to analyze the video feed and steer the bot to stay within the track boundaries and finish the course.

The team first took pictures of the track from the bot's camera and marked the x and y coordinates the bot should steer toward; each labeled coordinate pair serves as a steering direction for that frame. We then used transfer learning to build a robust convolutional neural network that predicts these x and y coordinates. Taking a pre-trained ResNet-18 and fine-tuning it on our images was much faster, and produced a more robust model, than training a CNN from scratch.
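Concretely, the setup looked roughly like the PyTorch sketch below. The data loading and image preprocessing are omitted, and the hyperparameters shown are illustrative rather than the values we actually used:

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Load a ResNet-18 pre-trained on ImageNet and swap its 1000-class
# head for a 2-unit layer that regresses the (x, y) steering target
# annotated on each training image.
model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 2)
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()  # regression loss on the (x, y) coordinates

def train_epoch(loader):
    """One pass over a DataLoader yielding (image_batch, xy_batch)."""
    model.train()
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        optimizer.zero_grad()
        preds = model(images)          # shape: (batch, 2)
        loss = loss_fn(preds, targets)
        loss.backward()
        optimizer.step()
```

Because only the final layer starts from random weights, the network converges in a handful of epochs even on a small, hand-labeled dataset.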

Once the model was trained, the robot was ready to race through the track autonomously. We repeatedly tested it and tuned some steering and speed parameters to achieve the fastest time. During the competition, our robot lapped the track in 9.27 seconds and took first place.
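The tuning mostly came down to a base speed and a steering gain adjusted between test laps. A rough sketch of the control loop, assuming NVIDIA's jetbot library and the model from the sketch above (the constants and the `preprocess` helper are placeholders, not our race values):

```python
import torch
from jetbot import Robot, Camera  # NVIDIA's JetBot library

robot = Robot()
camera = Camera.instance(width=224, height=224)

# Hand-tuned knobs, adjusted between test laps (placeholder values).
SPEED = 0.4          # base throttle applied to both motors
STEERING_GAIN = 0.8  # how hard to turn toward the predicted x

def drive_step():
    """One control step: predict a steering target and set the motors."""
    frame = preprocess(camera.value)  # same preprocessing as training (hypothetical helper)
    with torch.no_grad():
        x, y = model(frame).squeeze().tolist()  # model from the training sketch
    steering = STEERING_GAIN * x     # x in [-1, 1]; negative steers left
    # Differential drive: speed up one side and slow the other to turn,
    # clamped to the motors' valid [-1, 1] range.
    robot.left_motor.value = max(-1.0, min(1.0, SPEED + steering))
    robot.right_motor.value = max(-1.0, min(1.0, SPEED - steering))

while True:
    drive_step()
```

Raising SPEED shaved seconds off the lap but made the bot more likely to overshoot corners, so the two parameters had to be tuned together.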

Showboat

Our team also created what we called the JetBot Terminator. The robot was an assassin trained to recognize its targets' faces and "terminate" them. Once the robot received an SMS message with the name of its target, it would spin in a circle to find it. Once found, it would move in, terminate the subject, and return to its original position to wait for its next target. Watch the video of me presenting :)

This application used Flask, ngrok, and Twilio to set up a server on the robot that receives SMS messages. Once a message arrived, the bot spun in a circle until the camera was pointed at its target. We used transfer learning again to train an AlexNet CNN to classify the faces of targets. The rest of the application is just simple motor controls to move the bot forward and backward.
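The SMS path looked roughly like the Flask sketch below. `KNOWN_TARGETS` and `seek_and_terminate` are hypothetical stand-ins for the face-class lookup and the spin/classify/advance routine; ngrok's role is simply to expose the local port at a public URL registered as the Twilio number's SMS webhook:

```python
from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

# Hypothetical mapping from target name to the AlexNet class index it
# was trained on; the real app would load this alongside the weights.
KNOWN_TARGETS = {'alice': 0, 'bob': 1}

@app.route('/sms', methods=['POST'])
def incoming_sms():
    # Twilio delivers each inbound SMS as a form-encoded POST;
    # the message text arrives in the 'Body' field.
    name = request.form.get('Body', '').strip().lower()
    reply = MessagingResponse()
    if name in KNOWN_TARGETS:
        seek_and_terminate(KNOWN_TARGETS[name])  # spin, classify, advance, return
        reply.message(f'Target {name} terminated.')
    else:
        reply.message(f'Unknown target: {name}')
    return str(reply)  # TwiML response sent back as the SMS reply

if __name__ == '__main__':
    # ngrok forwards a public URL to this port, which is configured
    # as the SMS webhook for the Twilio phone number.
    app.run(host='0.0.0.0', port=5000)
```

Running the server on the robot itself meant the seek routine could drive the motors directly from the webhook handler, with no separate messaging layer.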

Acknowledgements

Thanks to Professor Yinong Chen for working with our team throughout the capstone project.

What I've Learned

  1. AWS - RoboMaker, SageMaker, Greengrass, and Cloud9 are services I used to deploy ROS applications, run GPU-accelerated training in the cloud, and develop in a cloud-based IDE.
  2. Robotics - the Jetson Nano uses an ARM64 architecture, so all programs that run on it need to support that architecture.
  3. Docker - we needed to bundle and build our ROS application inside a Docker container for it to work with the ARM64 architecture of the Jetson Nano.
  4. Teamwork - I figured out efficient teamwork methods by leading our team of four throughout the project.

Two Articles about the Hackathon