Universal Robots and Scale AI launch the UR AI Trainer


March 16, 2026 · 4 min read

Learn to set up and use the UR AI Trainer for capturing robot training data, integrating force, motion, and visual sensors for imitation learning.

Introduction

The UR AI Trainer, developed by Universal Robots and Scale AI, represents a significant leap in making AI training accessible on real factory hardware. This tutorial will guide you through setting up and using the UR AI Trainer to capture and process robot training data for imitation learning. You'll learn how to interface with the robot's force, motion, and visual sensors to generate high-fidelity training datasets that can be used to teach robots complex tasks.

Prerequisites

  • Basic understanding of robotics and Python programming
  • Access to a Universal Robots UR5e or UR10e robot (or simulation environment)
  • Python 3.8+ installed on your system
  • Required Python packages: urx, numpy, opencv-python, scaleai
  • Basic knowledge of ROS (Robot Operating System) or simulation environments like Gazebo
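Before diving in, you can sanity-check your environment with a short script. This is a minimal sketch; note that the opencv-python package is imported as cv2, and the `missing_packages` helper is illustrative, not part of any of the listed libraries.

```python
import importlib.util
import sys

def missing_packages(names):
    """Return the subset of module names that cannot be imported."""
    return [name for name in names if importlib.util.find_spec(name) is None]

# Python 3.8+ is required for this tutorial
assert sys.version_info >= (3, 8), "Python 3.8 or newer is required"

# opencv-python installs under the module name cv2
required = ["urx", "numpy", "cv2", "scaleai"]
print("Missing packages:", missing_packages(required))
```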

Step-by-Step Instructions

Step 1: Setting Up the UR AI Trainer Environment

Install Required Dependencies

First, we need to install all necessary Python packages for the UR AI Trainer. The urx library allows us to communicate with Universal Robots, while scaleai provides the data capture and processing tools.

pip install urx numpy opencv-python scaleai

Why: These libraries provide the core functionality needed to interface with the robot hardware and process the data captured during training.

Step 2: Establishing Robot Communication

Connect to the UR Robot

We'll establish a connection to the robot using the URX library. This connection will allow us to send commands and receive sensor data.

import urx

# Connect to the robot at its IP address (replace with your robot's IP)
robot = urx.Robot("192.168.1.100")

# Verify the connection by reading the current joint positions
print("Connected to robot. Current joints:", robot.getj())

Why: Establishing a connection is crucial for any robot interaction. The URX library provides a Python interface to communicate with Universal Robots' URScript protocol.
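Robot controllers sometimes refuse the first connection attempt, for example while the controller is still booting. The small retry wrapper below is a sketch, written under the assumption that urx.Robot raises an exception on a failed connection; the `make_connection` argument would be something like `lambda: urx.Robot("192.168.1.100")`.

```python
import time

def connect_with_retry(make_connection, attempts=3, delay=2.0):
    """Call make_connection up to `attempts` times, pausing `delay` seconds
    between failures, and return the first successful connection."""
    last_err = None
    for attempt in range(attempts):
        try:
            return make_connection()
        except Exception as err:
            last_err = err
            time.sleep(delay)
    raise last_err
```

Wrapping the connection this way also gives you one place to log failures or abort cleanly instead of scattering try/except blocks through the script.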

Step 3: Capturing Force and Motion Data

Set Up Data Logging

The UR AI Trainer captures force and motion data from the robot's sensors. We'll set up a logging function to record this data during robot operation.

import time

# Initialize data storage
force_data = []
motion_data = []
timestamps = []

def capture_sensor_data(robot):
    """Record one sample of force and joint data with a timestamp."""
    force = robot.get_force()   # magnitude of the TCP force vector (N)
    joints = robot.getj()       # current joint positions (rad)

    force_data.append(force)
    motion_data.append(joints)
    timestamps.append(time.time())

    return force, joints

# Capture data for 10 seconds at roughly 10 Hz
start_time = time.time()
while time.time() - start_time < 10:
    capture_sensor_data(robot)
    time.sleep(0.1)

Why: Force and motion data are essential for understanding how the robot interacts with its environment. This data forms the basis for imitation learning algorithms.
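The loop above drifts slightly, because time.sleep(0.1) ignores the time spent reading the sensors. If your application needs an even sampling rate, a drift-compensated loop schedules each sample against an absolute deadline instead. This is a sketch; `read_fn` stands in for a capture call such as capture_sensor_data.

```python
import time

def sample_at_rate(read_fn, rate_hz, n_samples):
    """Call read_fn n_samples times at rate_hz, compensating for the time
    each read takes by sleeping until an absolute deadline."""
    period = 1.0 / rate_hz
    samples = []
    start = time.monotonic()
    for i in range(n_samples):
        # record the nominal sample time alongside the reading
        samples.append((i * period, read_fn()))
        deadline = start + (i + 1) * period
        time.sleep(max(0.0, deadline - time.monotonic()))
    return samples
```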

Step 4: Integrating Visual Data Capture

Set Up Camera Integration

Visual data is crucial for the UR AI Trainer's capabilities. We'll integrate camera data capture using OpenCV.

import cv2
import time

# Initialize camera (device 0, assuming a USB camera)
camera = cv2.VideoCapture(0)
if not camera.isOpened():
    raise RuntimeError("Could not open camera")

# Capture images
image_data = []

for i in range(50):  # Capture 50 images at ~10 Hz
    ret, frame = camera.read()
    if ret:
        image_data.append(frame)
        cv2.imwrite(f"frame_{i}.jpg", frame)
    time.sleep(0.1)

# Release camera
camera.release()

Why: Visual data provides context for the robot's actions and helps in creating more comprehensive training datasets for imitation learning.
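To align frames with force and motion samples later, it helps to record a wall-clock timestamp for each image as it is captured. The sketch below factors the loop into a testable helper; `grab_frame` and `save_frame` are stand-ins for camera.read and cv2.imwrite.

```python
import time

def capture_timestamped_frames(grab_frame, save_frame, n_frames, interval_s):
    """Capture n_frames images, saving each one and recording the
    wall-clock time it was grabbed for later stream alignment."""
    records = []
    for i in range(n_frames):
        ok, frame = grab_frame()
        stamp = time.time()
        if ok:
            filename = f"frame_{i}.jpg"
            save_frame(filename, frame)
            records.append({"file": filename, "timestamp": stamp})
        time.sleep(interval_s)
    return records
```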

Step 5: Creating Training Dataset Structure

Organize Collected Data

Now we'll structure the captured data into a format suitable for AI training. This involves aligning force, motion, and visual data timestamps.

import json

# Create dataset structure (convert NumPy values to plain Python
# types first so the dataset is JSON-serializable)
training_data = {
    "force_data": [float(f) for f in force_data],
    "motion_data": [list(j) for j in motion_data],
    "image_data": [f"frame_{i}.jpg" for i in range(len(image_data))],
    "timestamps": [],  # populate with the capture times recorded during logging
}

# Save dataset
with open('ur_ai_training_dataset.json', 'w') as f:
    json.dump(training_data, f, indent=2)

print("Dataset saved successfully")

Why: Properly structured data is essential for machine learning models. The alignment of different data types ensures that the AI can learn the relationship between visual cues, motion, and force.
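When each stream carries timestamps, the alignment mentioned above can be done by matching every image timestamp to the nearest force/motion sample. The sketch below uses a binary search and assumes both timestamp lists are sorted in ascending order; the function name is illustrative, not part of any library used here.

```python
import bisect

def align_nearest(ref_ts, stream_ts):
    """For each reference timestamp, return the index of the nearest
    timestamp in stream_ts (both lists sorted ascending)."""
    indices = []
    for t in ref_ts:
        j = bisect.bisect_left(stream_ts, t)
        if j == 0:
            indices.append(0)
        elif j == len(stream_ts):
            indices.append(len(stream_ts) - 1)
        else:
            # pick whichever neighbor is closer in time
            before, after = stream_ts[j - 1], stream_ts[j]
            indices.append(j if after - t < t - before else j - 1)
    return indices
```

For example, image timestamps [0.0, 0.12, 0.31] matched against sensor timestamps [0.0, 0.1, 0.2, 0.3] map to sensor samples 0, 1, and 3.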

Step 6: Processing Data with Scale AI Tools

Upload and Process Dataset

Finally, we'll use Scale AI's tools to process our dataset for training. This involves uploading the data and preparing it for AI model training.

from scaleai import ScaleAI

# Initialize Scale AI client
scale = ScaleAI(api_key="your_api_key")

# Upload dataset
upload_response = scale.upload_dataset(
    dataset_path="ur_ai_training_dataset.json",
    dataset_name="UR_AI_Training_Data"
)

print("Dataset uploaded:", upload_response)

# Process dataset for training
processing_response = scale.process_dataset(
    dataset_id=upload_response["dataset_id"],
    task_type="imitation_learning"
)

print("Dataset processed:", processing_response)

Why: Scale AI's platform automates the preprocessing of training data, making it ready for AI model training. This step streamlines the workflow from data capture to model training.

Summary

In this tutorial, you've learned how to set up and use the UR AI Trainer for capturing robot training data. You established robot communication, captured force and motion data, integrated visual data, structured your dataset, and processed it with Scale AI's tools. By capturing high-fidelity training data directly on production hardware, this workflow helps bridge the gap between research labs and factory floors and makes the UR AI Trainer a powerful tool for developing sophisticated robotic applications.

Source: TNW Neural
