Udacity Releases Self-Driving Car Simulator Source Code

Image Credit: GitHub.

Udacity this week released the source code to its self-driving car simulator. The simulator was originally built to teach its Self-Driving Car Engineer Nanodegree students how to use deep learning to clone driving behavior.

In the simulator, users steer a car around a track to collect “image data and steering angles to train a neural network,” according to the project overview. Students then train, validate, and test a model that drives the car autonomously around the track using Keras, a high-level neural networks library written in Python that can run on top of Google’s TensorFlow or Theano, two open source deep learning frameworks. The Unity game development platform is needed to load the project’s assets.
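The behavioral-cloning workflow described above can be sketched as a small Keras regression model that maps camera images to steering angles. This is a minimal illustration, assuming TensorFlow's bundled Keras; the layer sizes and input shape are illustrative assumptions, not Udacity's reference model.

```python
# Minimal behavioral-cloning sketch: a small convolutional network that
# maps camera images to a single steering angle (regression).
# Architecture details here are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_model(input_shape=(66, 200, 3)):
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        # Normalize pixel values to roughly [-0.5, 0.5]
        layers.Rescaling(1.0 / 255, offset=-0.5),
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(36, 5, strides=2, activation="relu"),
        layers.Conv2D(48, 5, strides=2, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(1),  # predicted steering angle
    ])
    # Mean squared error: predicting a steering angle is regression
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_model()
# In practice, X holds simulator camera frames and y the recorded
# steering angles; placeholder arrays stand in for collected data here.
X = np.zeros((4, 66, 200, 3), dtype=np.float32)
y = np.zeros((4,), dtype=np.float32)
model.fit(X, y, epochs=1, verbose=0)
pred = model.predict(X, verbose=0)
print(pred.shape)  # one steering angle per input image
```

After training, the saved model would be queried frame by frame to drive the simulated car autonomously.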

When Udacity launched its Self-Driving Car Engineer Nanodegree last September, CEO Sebastian Thrun said the end goal was to open source the software for anybody to use. Since most self-driving software is developed in virtual environments, the repository serves as a resource for individuals and organizations to develop their own scenes in Unity or test out their own software — including higher ed institutions that have been ramping up their own research efforts.
