Rose City Robotics Common Platform

📚 View Complete Documentation - Comprehensive guides for setup, operation, and development

A Fork of the PARTS Common Platform


Side view with LiDAR sensor


This project implements a complete robotics platform built on the Pololu Romi Chassis Kit, featuring a custom carrier PCB that integrates all components into a compact, efficient design. The platform combines hardware, firmware, and software to create a fully functional autonomous robot.



Hardware Components:

  • Pololu Romi Chassis Kit - Robust base platform with encoders and caster
  • Custom PARTS CRP Board - Integrated carrier PCB with all connections
  • Teensy 4.0 Microcontroller - High-performance ARM Cortex-M7 processor
  • TB9051FTG Motor Drivers - Dual brushed DC motor control
  • MPU-9250 9DOF IMU - Nine-axis motion sensing for navigation
  • Raspberry Pi with AI HAT - Main computer running ROS2 and onboard autonomous vision inference at 26 TOPS
  • Power Management - 5V regulator and battery voltage monitoring
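To illustrate the battery voltage monitoring mentioned above, here is a minimal sketch of scaling a raw ADC count back up to pack voltage. The divider ratio, ADC resolution, and reference voltage are placeholder assumptions for illustration, not the CRP board's actual design values.

```python
# Hedged sketch: converting a raw ADC reading into battery voltage.
# All three constants below are illustrative assumptions, not the
# CRP board's measured values.
ADC_MAX = 1023        # 10-bit ADC full-scale count (assumed)
V_REF = 3.3           # ADC reference voltage in volts (assumed)
DIVIDER_RATIO = 3.0   # (R1 + R2) / R2 of the sense divider (assumed)

def battery_voltage(adc_reading: int) -> float:
    """Scale a raw ADC count back up to the battery pack voltage."""
    return adc_reading / ADC_MAX * V_REF * DIVIDER_RATIO

print(round(battery_voltage(620), 2))  # roughly 6 V under these assumptions
```

The same arithmetic would live in the Teensy firmware in practice; Python is used here only to keep the sketch self-contained.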

Mobile base platform


Current Implementation: The platform now features a complete ROS2 integration with micro-ROS running on the Teensy 4.0, enabling real-time communication between the Raspberry Pi and the robot's sensors and actuators. The system supports autonomous navigation, sensor fusion, and closed-loop control, making it an ideal platform for robotics education and development in the Portland Area Robotics Society.
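The closed-loop control mentioned above ultimately reduces to differential-drive kinematics: a body twist (linear velocity v, angular velocity w) such as a ROS2 cmd_vel is converted into left and right wheel speeds. A minimal sketch, assuming placeholder geometry (the wheel radius and track width below are illustrative, not measured from the Romi chassis):

```python
# Hedged sketch of differential-drive kinematics: convert a body
# twist (v, w) into left/right wheel angular velocities.
WHEEL_RADIUS = 0.035   # metres (assumed placeholder)
TRACK_WIDTH = 0.14     # metres between wheel centres (assumed placeholder)

def twist_to_wheel_speeds(v: float, w: float) -> tuple[float, float]:
    """Return (left, right) wheel speeds in rad/s for a body twist.

    v: forward velocity in m/s; w: yaw rate in rad/s.
    """
    v_left = v - w * TRACK_WIDTH / 2.0    # left wheel rim speed, m/s
    v_right = v + w * TRACK_WIDTH / 2.0   # right wheel rim speed, m/s
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure forward motion drives both wheels at the same speed;
# pure rotation drives them in opposite directions.
twist_to_wheel_speeds(0.35, 0.0)
twist_to_wheel_speeds(0.0, 1.0)
```

On the robot, the micro-ROS firmware would apply this conversion and close the loop with the wheel encoders.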

(back to top)

Built With

  • C++
  • C
  • Python 3
  • PlatformIO
  • Arduino
  • KiCad EDA
  • ROS2

(back to top)

Getting Started

Direct Connection to Raspberry Pi

Connect your micro HDMI cable to your Pi and monitor, and a keyboard to the USB-A port.

OR

🖥️ On the development computer, connect via SSH:

```shell
ssh rcr@192.168.1.n
```

Replace n with your robot number.

Ubuntu Login on Raspberry Pi

  • User: rcr
  • Password: siliconforest

📚 Documentation

📖 Complete Documentation - Comprehensive guides for setup, operation, and development

Key Documentation Sections

  • 🚀 Getting Started - Complete setup from hardware assembly to first robot movement
  • ⚙️ Hardware Assembly - Step-by-step soldering and assembly instructions
  • 💻 Software Setup - Host configuration, ROS2 installation, and Teensy programming
  • 🎮 Robot Operations - Startup, shutdown, teleoperation, and navigation procedures
  • 🧪 Testing & Validation - Smoke tests and system verification
  • 🔧 Troubleshooting - Common issues and recovery procedures

Quick Start Highlights

  • Hardware: Custom PARTS CRP board with integrated motor drivers and Teensy 4.0
  • Software: ROS2 with micro-ROS for real-time communication
  • Sensors: LiDAR, IMU, and camera support for autonomous navigation
  • Programming: Arduino CLI for Teensy firmware development
  • Testing: Basic smoke test to verify hardware functionality

(back to top)

🤖 Imitation Learning Training

The platform includes a complete PyTorch training pipeline for robot maze navigation using imitation learning with a transformer-based architecture. For detailed documentation, see scripts/imitation_learning/src/imitation_learning/README.md.

Transfer Learning from No-Z Model Checkpoint

If you have a previously trained model without the Z variable (the CVAE style variable) and want to initialize a new Z-enabled model from those weights, you can use transfer learning:

```shell
# Navigate to the imitation learning directory
cd scripts/imitation_learning/src/imitation_learning

# Train with transfer learning from a no-Z model checkpoint
python train.py \
  --config config.yaml \
  --transfer_from_noz /path/to/best_model_without_z.pth
```

What gets transferred:

  • ✅ Vision encoder weights
  • ✅ Temporal encoder weights
  • ✅ Action decoder weights
  • ✅ Action queries and action head weights
  • 🆕 New CVAE components (action_encoder, z_projection) are randomly initialized
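The general mechanism behind this kind of partial transfer can be sketched in a few lines of PyTorch: load the old checkpoint with strict=False so that only parameters whose names and shapes match are copied, while new components stay randomly initialized. The tiny model classes below are illustrative stand-ins, not the actual architecture in train.py.

```python
import torch
import torch.nn as nn

# Hedged sketch of partial weight transfer, mirroring the idea behind
# --transfer_from_noz. Module names here are illustrative only.
class NoZModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision = nn.Linear(8, 4)     # stands in for the vision encoder
        self.decoder = nn.Linear(4, 2)    # stands in for the action decoder

class ZModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision = nn.Linear(8, 4)
        self.decoder = nn.Linear(4, 2)
        self.z_proj = nn.Linear(3, 4)     # new CVAE component, no checkpoint weights

source = NoZModel()
target = ZModel()

# strict=False copies matching keys and reports the rest instead of raising.
missing, unexpected = target.load_state_dict(source.state_dict(), strict=False)

assert torch.equal(target.vision.weight, source.vision.weight)  # transferred
print(missing)  # the z_proj.* keys remain randomly initialized
```

In the real pipeline the matching happens across the vision, temporal, and decoder submodules listed above, with the CVAE pieces left to train from scratch.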

Optional: Freeze layers during transfer learning

You can freeze specific layers while training the new CVAE components by enabling transfer learning in your config:

```yaml
# In config.yaml
transfer_learning:
  enabled: true
  freeze_vision: true       # Freeze vision encoder
  freeze_encoder: false     # Keep temporal encoder trainable
  freeze_decoder: false     # Keep action decoder trainable
```

Example: Freeze vision encoder only

```shell
python train.py \
  --config config.yaml \
  --transfer_from_noz checkpoints/best_model_noz.pth
```

With freeze_vision: true in the config, the vision encoder will remain frozen while the new CVAE components and other layers are trained.
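Freezing of this kind is conventionally implemented by disabling gradients on the chosen submodule, so the optimizer only updates the remaining parameters. A minimal sketch, with illustrative module names (the actual flag handling lives in train.py and may differ):

```python
import torch.nn as nn

# Hedged sketch of what a freeze_vision-style flag typically does:
# turn off requires_grad on the frozen submodule's parameters.
model = nn.ModuleDict({
    "vision": nn.Linear(8, 4),    # to be frozen
    "decoder": nn.Linear(4, 2),   # stays trainable
})

for p in model["vision"].parameters():
    p.requires_grad_(False)  # frozen: excluded from gradient updates

# Only non-frozen parameters should be handed to the optimizer.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the decoder's weight and bias remain
```

Passing only the trainable parameters to the optimizer (or relying on requires_grad being False) keeps the frozen encoder's weights fixed throughout training.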

For more training options and examples, see the imitation learning README.

(back to top)

Assembly Images

Mobile base platform - unassembled components

Mobile base platform

Assembled chassis with controller

Mobile base platform assembly

Custom PCB showing integrated motor drivers and controller connections

Custom PCB with ports and controller

Completed chassis - top view with Raspberry Pi AI HAT+

Assembled Platform - Top View

Raspberry Pi 5 with AI HAT+

Raspberry Pi 5 with AI HAT+

Side view with LiDAR sensor mounted

Side view with LiDAR sensor

(back to top)

See the open issues for a full list of proposed features (and known issues).

(back to top)

Thank you PARTS

Our work is based on the Common Platform by the Portland Area Robotics Society (PARTS).

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License and the Solderpad Hardware License v2.1. See LICENSE.txt for more information.

(back to top)
