Uber and Nuro begin employee testing of a Lucid Gravity robotaxi in San Francisco

April 13, 2026 · 5 min read

Learn how to create autonomous vehicle simulations using NVIDIA's DRIVE AGX platform, similar to what Uber and Nuro are testing in San Francisco with the Lucid Gravity robotaxi.

Introduction

In this tutorial, you'll learn how to work with autonomous driving simulation technology using NVIDIA's DRIVE AGX platform, similar to what Uber and Nuro are testing in San Francisco. We'll focus on creating a basic simulation environment for autonomous vehicle development using NVIDIA's Isaac SDK and Python. This hands-on approach will give you practical experience with the core concepts behind robotaxis like the Lucid Gravity.

Prerequisites

  • NVIDIA DRIVE AGX platform or access to NVIDIA's simulation environment
  • Python 3.7 or higher
  • NVIDIA Isaac SDK installed
  • Basic understanding of autonomous vehicle concepts and ROS (Robot Operating System)
  • Access to a simulation environment (either local or cloud-based)

Step-by-Step Instructions

1. Set Up Your Development Environment

First, we need to prepare our development environment with the necessary tools. The NVIDIA DRIVE AGX platform provides the foundation for autonomous vehicle simulation, similar to what Uber and Nuro are using in their testing.

# Install the Python dependency used by the snippets below. Note that
# NVIDIA's Isaac SDK and DRIVE tooling are distributed through NVIDIA's
# developer program and NGC rather than plain pip, so follow NVIDIA's
# own install guide for the SDK itself.
pip install numpy

Why: This sets up the Python dependencies used throughout the tutorial. NVIDIA's simulation stack itself, the backbone of systems like the Lucid Gravity robotaxi, is installed separately through NVIDIA's own tooling.
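Before moving on, it can help to sanity-check the environment. A minimal sketch that only verifies the interpreter version and the modules these snippets import, not NVIDIA's SDK itself:

```python
import importlib.util
import sys

# Confirm the interpreter meets the Python 3.7+ prerequisite
assert sys.version_info >= (3, 7), "Python 3.7 or higher is required"

# Check for the modules the later snippets import; extend this tuple
# with whatever your NVIDIA simulation install actually provides
for module in ("numpy",):
    found = importlib.util.find_spec(module) is not None
    print(f"{module}: {'found' if found else 'MISSING'}")
```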

2. Create a Basic Autonomous Vehicle Simulation

Next, we'll create a simple simulation environment that mimics the autonomous driving capabilities we see in real robotaxis. This will include setting up a vehicle model and basic sensor simulation.

import numpy as np

# The real Isaac Gym API acquires a simulation handle with
# `from isaacgym import gymapi; gym = gymapi.acquire_gym()`. To keep
# this snippet runnable anywhere, we stand in with a plain dictionary.
vehicle = {
    'position': [0, 0, 0],   # x, y, z in metres
    'velocity': [0, 0, 0],   # m/s along each axis
    'sensors': ['lidar', 'camera', 'radar']
}

print("Vehicle simulation initialized with sensors:", vehicle['sensors'])

Why: This creates the fundamental simulation structure that mirrors how autonomous vehicles process sensor data in real-world scenarios like those in San Francisco's streets.

3. Implement Sensor Data Simulation

Autonomous vehicles rely heavily on sensor data from multiple sources. We'll simulate the sensor inputs that the Lucid Gravity would collect during operation.

def simulate_sensor_data(vehicle):
    # Simulate LiDAR data
    lidar_data = np.random.rand(1000, 3) * 100  # 1000 points, 3D coordinates
    
    # Simulate camera data
    camera_data = np.random.rand(480, 640, 3)  # 480x640 RGB image
    
    # Simulate radar data
    radar_data = np.random.rand(50, 2) * 50  # 50 detections, range and velocity
    
    return {
        'lidar': lidar_data,
        'camera': camera_data,
        'radar': radar_data
    }

# Generate sensor data for our vehicle
sensor_data = simulate_sensor_data(vehicle)
print("Sensor data generated for", len(sensor_data), "sensors")

Why: This simulates the multi-modal sensor fusion that NVIDIA's DRIVE AGX system uses to process data from LiDAR, cameras, and radar - just like the Lucid Gravity vehicle.
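Downstream modules rarely consume raw point clouds directly; a common first step is to rasterize them into an occupancy grid. A minimal sketch, assuming a flat ground plane and a coarse grid whose cell size and extent are invented for this example (they match the 100 m random point cloud above, not any DRIVE AGX setting):

```python
import numpy as np

def lidar_to_occupancy_grid(points, cell_size=5.0, extent=100.0):
    """Bin 3D points into a 2D (x, y) occupancy grid of boolean cells."""
    n = int(extent / cell_size)
    grid = np.zeros((n, n), dtype=bool)
    xy = points[:, :2]
    # Keep only points that fall inside the grid extent
    mask = np.all((xy >= 0) & (xy < extent), axis=1)
    idx = (xy[mask] / cell_size).astype(int)
    grid[idx[:, 0], idx[:, 1]] = True
    return grid

# Synthetic cloud shaped like the simulated LiDAR data above
points = np.random.rand(1000, 3) * 100
grid = lidar_to_occupancy_grid(points)
print("Occupied cells:", int(grid.sum()), "of", grid.size)
```

A planner can then treat occupied cells as obstacles; real pipelines add ground removal, ego-motion compensation, and temporal filtering before this step.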

4. Implement Basic Path Planning

Now we'll implement a simple path planning algorithm that could be part of the autonomous driving system. This is crucial for robotaxis to navigate urban environments safely.

def simple_path_planning(start, goal, obstacles):
    """Greedy grid walk toward the goal (not a real planner).

    `obstacles` is accepted but ignored in this toy version; a
    production planner would search around them.
    """
    path = []
    current = start

    while current != goal:
        path.append(current)
        # Step one cell toward the goal on each axis
        if current[0] < goal[0]:
            current = (current[0] + 1, current[1])
        elif current[0] > goal[0]:
            current = (current[0] - 1, current[1])

        if current[1] < goal[1]:
            current = (current[0], current[1] + 1)
        elif current[1] > goal[1]:
            current = (current[0], current[1] - 1)

    path.append(goal)
    return path

# Plan a path from start to goal
start_pos = (0, 0)
goal_pos = (10, 10)
obstacles = [(5, 5), (6, 6)]
path = simple_path_planning(start_pos, goal_pos, obstacles)
print("Planned path:", path)

Why: This demonstrates the core navigation logic that autonomous vehicles must implement. Real systems like those in the Lucid Gravity would use more sophisticated algorithms, but this gives you the foundation.
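To see what "more sophisticated" looks like in miniature, here is a compact textbook A* search on the same 11×11 grid; the grid bounds match the (0, 0) → (10, 10) example, and none of this is DRIVE AGX code:

```python
import heapq

def astar(start, goal, obstacles, size=11):
    """A* on a 4-connected grid with a Manhattan-distance heuristic."""
    blocked = set(obstacles)
    open_set = [(0, start)]          # (f-score, node) min-heap
    g = {start: 0}                   # best known cost from start
    came_from = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            # Walk the parent links back to the start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < size and 0 <= nxt[1] < size):
                continue
            if nxt in blocked:
                continue
            new_g = g[cur] + 1
            if new_g < g.get(nxt, float("inf")):
                g[nxt] = new_g
                came_from[nxt] = cur
                h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])
                heapq.heappush(open_set, (new_g + h, nxt))
    return None                      # no route exists

path = astar((0, 0), (10, 10), [(5, 5), (6, 6)])
print("A* path length:", len(path))
```

Unlike the greedy walk, this version actually detours around the blocked cells and still returns a shortest 4-connected path.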

5. Integrate with NVIDIA DRIVE AGX Simulation

Finally, we'll sketch how our simulation would connect to NVIDIA's DRIVE AGX platform, the class of hardware reportedly behind the real robotaxi testing.

# NOTE: `isaac.connect()` below is a placeholder, not a real Isaac SDK
# call -- Isaac applications are normally built as graphs of codelets and
# launched through NVIDIA's own tooling. Treat this block as pseudocode
# for "attach to a running DRIVE AGX simulation".
try:
    import isaac  # hypothetical binding; substitute your actual SDK setup
    isaac.connect('drive_agx_simulation')
    print("Successfully connected to DRIVE AGX simulation")
except Exception as e:
    print("Could not connect to a DRIVE AGX simulation:", e)

# Vehicle control commands (steering in radians, throttle/brake in [0, 1])
vehicle_control = {
    'steering': 0.0,
    'throttle': 0.0,
    'brake': 0.0
}

def update_vehicle_state(vehicle_data, control):
    """Toy kinematics: nudge the pose by the current control inputs."""
    new_position = [
        vehicle_data['position'][0] + control['throttle'] * 0.1,
        vehicle_data['position'][1] + control['steering'] * 0.05
    ]
    return {'position': new_position}

print("Vehicle control initialized")

Why: This shows where a local simulation would hand off to the platform Uber and Nuro are reportedly using, and how the theoretical concepts map onto a real autonomous driving stack.
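If you want the "simple physics" to behave a little more like a car, a kinematic bicycle model is the usual next step. A minimal sketch; the 2.9 m wheelbase and the throttle-to-acceleration mapping are invented placeholders, not Lucid Gravity parameters:

```python
import math

def bicycle_step(state, throttle, steering, dt=0.1, wheelbase=2.9):
    """One step of a kinematic bicycle model.

    state = (x, y, heading, speed); steering is the front-wheel angle
    in radians, throttle is treated directly as acceleration in m/s^2.
    """
    x, y, heading, speed = state
    speed += throttle * dt                      # crude acceleration model
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += (speed / wheelbase) * math.tan(steering) * dt
    return (x, y, heading, speed)

state = (0.0, 0.0, 0.0, 5.0)   # start at origin, 5 m/s, facing +x
for _ in range(10):            # one second of gentle left steering
    state = bicycle_step(state, throttle=1.0, steering=0.1)
print(f"Pose after 1 s: x={state[0]:.2f}, y={state[1]:.2f}")
```

Positive steering now curves the trajectory instead of translating it sideways, which is what the dictionary-based update above glosses over.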

6. Run a Complete Simulation Test

Let's run a complete simulation that combines all our components to mimic what happens in a real robotaxi test like those in San Francisco.

def run_complete_simulation():
    # Initialize vehicle
    vehicle = {
        'position': [0, 0, 0],
        'velocity': [0, 0, 0],
        'sensors': ['lidar', 'camera', 'radar']
    }
    
    # Generate sensor data
    sensor_data = simulate_sensor_data(vehicle)
    print("Generated sensor data")
    
    # Plan path
    path = simple_path_planning((0, 0), (10, 10), [])
    print("Path planned successfully")
    
    # Simulate vehicle movement
    for i, point in enumerate(path):
        if i < len(path) - 1:
            # Naive steering: proportional to the x offset of the next waypoint
            control = {
                'steering': (path[i + 1][0] - point[0]) * 0.1,
                'throttle': 0.5
            }
            vehicle['position'] = list(point)  # 2D waypoint replaces the 3D pose
            print(f"Vehicle at position {point}, control {control}")
    
    print("Simulation completed successfully")

# Run the complete simulation
run_complete_simulation()

Why: This comprehensive test demonstrates how all components work together in an autonomous vehicle system, similar to how Uber and Nuro test their robotaxis in real-world conditions.

Summary

In this tutorial, you've set up and run a basic autonomous vehicle simulation inspired by NVIDIA's DRIVE AGX platform. You've created a simulation environment, generated synthetic sensor data, implemented basic path planning, and sketched how those pieces would hand off to the DRIVE AGX system. These are the core components that let robotaxis like the Lucid Gravity operate safely in urban environments like San Francisco. This is a heavily simplified version of what Uber and Nuro are testing, but it gives you practical experience with the foundational technologies behind autonomous driving systems.

Source: TNW Neural
