Altair Silent

Autonomous drone system for precision window navigation using LiDAR-based SLAM and computer vision

License: MIT | ROS2 | Python


Overview

Mission Demo

Altair Silent is a fully autonomous drone navigation system designed to detect, approach, and navigate through indoor window openings using LiDAR-based perception and advanced motion planning. The system combines real-time SLAM mapping, Nav2 path planning, PID-based altitude control, obstacle avoidance, and multi-stage window detection to enable safe autonomous flight in GPS-denied environments.

Built on ROS2 Humble, the project integrates Google Cartographer for mapping, ArduPilot for flight control, and custom perception algorithms for robust corner detection and window traversal. The entire software stack runs on a Raspberry Pi-based drone platform with a 360° LiDAR scanner and communicates with ArduPilot over MAVLink.

Technical Highlights:

  • Real-time 2D SLAM with loop closure optimization
  • Multi-phase window detection (horizontal + vertical scanning)
  • Lifecycle-managed controller architecture with composable modules
  • Sector-based obstacle avoidance with repulsive force fields
  • Autonomous exploration with corner-based room navigation
  • Nav2 integration with custom behavior trees and recovery actions

Why This Project Exists

Altair Silent was developed by Team Altair for the Teknofest 2025 International UAV Competition, one of the world's largest aerospace and technology festivals held in Türkiye. The competition challenges teams to design fully autonomous drones capable of navigating complex indoor environments, including:

  • Flying through narrow rectangular openings (windows)
  • Mapping unknown spaces without GPS
  • Avoiding dynamic and static obstacles
  • Executing precise altitude and position control
  • Completing autonomous exploration missions

The Problem It Solves

Traditional drone navigation systems rely heavily on GPS, which becomes unavailable in indoor or GPS-denied environments. Detecting and safely traversing narrow openings like windows requires:

  1. Robust LiDAR-based perception to detect corners and edges in 3D space
  2. Real-time mapping to build navigable maps without prior knowledge
  3. Precision control to align with window openings within centimeter accuracy
  4. Intelligent planning to calculate safe approach trajectories and execute multi-stage maneuvers

Altair Silent addresses these challenges through:

  • Corner detection algorithms that identify window boundaries using range discontinuities
  • Vertical altitude scanning to measure window height dynamically
  • PID-controlled altitude and lateral velocity for stable flight
  • Integration with Nav2 for autonomous waypoint navigation
  • Modular controller architecture enabling composable flight behaviors

Who Benefits

  • Robotics researchers developing autonomous indoor navigation systems
  • Competition teams participating in UAV challenges requiring precision flight
  • Drone developers building perception and control systems for GPS-denied environments
  • ROS2 community seeking reference implementations of lifecycle-managed aerial systems

Altair Team Projects for Teknofest 2025:


Features

Mapping & Localization

  • Google Cartographer SLAM with 2D LiDAR and IMU fusion
  • Real-time occupancy grid generation at 5cm resolution
  • Loop closure optimization with pose graph constraints
  • 30m LiDAR range with 2.5cm voxel filtering
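The voxel filtering step above can be illustrated with a simple sketch: points are bucketed into fixed-size voxels and each occupied voxel is reduced to one representative point. This is a simplified stand-in for Cartographer's voxel filter, not its actual implementation.

```python
from collections import defaultdict

def voxel_filter(points, voxel_size=0.025):
    """Downsample 2D points: keep the centroid of each occupied voxel.

    Illustrative sketch only; Cartographer's real filter differs in details.
    """
    buckets = defaultdict(list)
    for x, y in points:
        key = (int(x // voxel_size), int(y // voxel_size))
        buckets[key].append((x, y))
    return [
        (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))
        for pts in buckets.values()
    ]

# Two near-coincident points collapse into one representative
dense = [(0.000, 0.000), (0.001, 0.001), (1.000, 1.000)]
print(len(voxel_filter(dense)))  # 2
```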

Navigation & Planning

  • Nav2 stack with A* global planner and DWB local controller
  • Layered costmaps: static map, voxel layer, inflation layer
  • Behavior tree-based navigation with recovery actions (spin, backup, wait)
  • 0.25 m/s max velocity with acceleration limits for safe flight
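The velocity limit above would typically live in the DWB controller section of nav2_params.yaml. A hypothetical fragment is shown below: the parameter names come from the standard DWB plugin, the 0.25 m/s value from this README, and the acceleration value is an assumption for illustration only.

```yaml
controller_server:
  ros__parameters:
    FollowPath:
      plugin: "dwb_core::DWBLocalPlanner"
      max_vel_x: 0.25   # matches the 0.25 m/s limit above
      acc_lim_x: 0.5    # example acceleration limit (assumed value)
```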

Window Detection & Traversal

  • LiDAR corner detection identifying range discontinuities (>0.5m threshold)
  • Multi-stage detection:
    1. Horizontal scan: detect left/right corners, calculate approach waypoints
    2. Vertical scan: measure window height by altitude sweeping
    3. Center alignment: position drone at optimal traversal altitude
  • Waypoint generation perpendicular to window plane
  • Autonomous exploration after window passage
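The waypoint-generation step can be sketched geometrically: given the two detected corner positions, approach and exit waypoints lie on the line through the window midpoint, perpendicular to the corner-to-corner segment. This is a geometric sketch under that assumption, not the project's actual code.

```python
import math

def window_waypoints(left, right, offset=1.0):
    """Return (approach, exit) points `offset` metres either side of the
    window midpoint, along the unit normal to the corner-to-corner segment.

    `left`/`right` are (x, y) corner positions; illustrative only.
    """
    mx, my = (left[0] + right[0]) / 2, (left[1] + right[1]) / 2
    dx, dy = right[0] - left[0], right[1] - left[1]
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length  # unit normal to the window plane
    return ((mx - nx * offset, my - ny * offset),
            (mx + nx * offset, my + ny * offset))

# Window corners 1 m apart along the y-axis -> waypoints offset along x
approach, exit_pt = window_waypoints((2.0, -0.5), (2.0, 0.5), offset=1.0)
print(approach, exit_pt)  # (3.0, 0.0) (1.0, 0.0)
```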

Flight Control

  • Lifecycle-managed controllers (altitude, Nav2, obstacle avoidance, window)
  • PID-based altitude control with low-pass filtering
  • Sector-based obstacle avoidance (4-quadrant detection)
  • Velocity command fusion at 20Hz
  • MAVROS integration for ArduPilot communication over MAVLink
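The sector-based avoidance above can be sketched as follows, under the assumption that the nearest reading in each 90° sector contributes a repulsive velocity away from the obstacle, scaled by how far it intrudes into a safety radius. This is an illustration of the technique, not the project's actual controller.

```python
import math

def repulsive_velocity(scan, safe_dist=1.0, gain=0.5):
    """Compute a repulsive (vx, vy) from (angle_rad, range_m) LiDAR
    readings, splitting the scan into four 90-degree sectors.

    Each sector's closest reading inside `safe_dist` pushes the drone
    away from the obstacle. Illustrative sketch only.
    """
    # nearest obstacle per quadrant: 0=front, 1=left, 2=back, 3=right
    nearest = {}
    for angle, rng in scan:
        sector = int(((angle + math.pi / 4) % (2 * math.pi)) // (math.pi / 2))
        if rng < nearest.get(sector, (float("inf"), 0.0))[0]:
            nearest[sector] = (rng, angle)
    vx = vy = 0.0
    for rng, angle in nearest.values():
        if rng < safe_dist:
            push = gain * (safe_dist - rng) / safe_dist
            vx -= push * math.cos(angle)  # push away from the obstacle
            vy -= push * math.sin(angle)
    return vx, vy

# Obstacle 0.5 m directly ahead -> commanded velocity points backwards
vx, vy = repulsive_velocity([(0.0, 0.5)])
print(round(vx, 3), round(vy, 3))  # -0.25 0.0
```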

System Integration

  • Hardware agnostic design with simulation support (Gazebo Harmonic)
  • Serial/UDP MAVLink connection with auto-configuration
  • Custom ROS2 messages and services for window operations
  • Parameter-based runtime configuration
  • RViz visualization for markers and costmaps

Architecture

System Components

┌──────────────────────────────────────────────────────────────┐
│                      Altair Silent Stack                     │
├──────────────────────────────────────────────────────────────┤
│ silent_bringup          → System integration & launch files  │
│ silent_controllers      → Flight control (altitude, nav2,    │
│                           obstacle avoidance, window pass)   │
│ silent_window_detector  → Window detection & mission planner │
│ silent_explorer         → Autonomous corner-based exploration│
│ silent_slam             → Cartographer 2D SLAM integration   │
│ silent_nav2             → Nav2 navigation stack config       │
│ silent_msgs             → Custom ROS2 message/service defs   │
│ silent_utils            → Geometry & state management utils  │
│ silent_description      → Robot URDF models & transforms     │
├──────────────────────────────────────────────────────────────┤
│ External: MAVROS, Nav2, Cartographer, ArduPilot, Gazebo      │
└──────────────────────────────────────────────────────────────┘

Mission Workflow

  1. Takeoff → Drone lifts to initial altitude (1.4m)
  2. Width Detection → LiDAR scans detect horizontal window corners
  3. Approach Outside → Navigate to waypoint facing window
  4. Height Scan → Sweep altitude up/down to measure window height
  5. Center Alignment → Move to optimal altitude for traversal
  6. Window Pass (Outbound) → Navigate through window opening
  7. Exploration → Autonomous corner detection and room mapping
  8. Return Pass (Inbound) → Navigate back through window
  9. Return to Launch → Return to starting position
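The workflow above is essentially a linear state machine. A minimal sketch is shown below; the state names are illustrative, not the project's actual identifiers.

```python
from enum import Enum, auto

class MissionState(Enum):
    TAKEOFF = auto()
    WIDTH_DETECTION = auto()
    APPROACH_OUTSIDE = auto()
    HEIGHT_SCAN = auto()
    CENTER_ALIGNMENT = auto()
    WINDOW_PASS_OUT = auto()
    EXPLORATION = auto()
    WINDOW_PASS_IN = auto()
    RETURN_TO_LAUNCH = auto()

ORDER = list(MissionState)

def next_state(state):
    """Advance to the next mission phase; the final phase is terminal."""
    i = ORDER.index(state)
    return ORDER[i + 1] if i + 1 < len(ORDER) else state

print(next_state(MissionState.TAKEOFF).name)  # WIDTH_DETECTION
```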

Installation

Prerequisites

  • NixOS or Nix package manager (for reproducible development environment)
  • NVIDIA GPU + drivers (for Gazebo simulation with GPU acceleration)
  • Docker (optional, for containerized simulation)

Build from Source

1. Enter Development Environment

nix develop . --accept-flake-config

This will automatically provision:

  • ROS2 Humble
  • Google Cartographer
  • Nav2 navigation stack
  • MAVROS + MAVROS Extras
  • RPLIDAR/SLLIDAR ROS2 drivers
  • Python 3.12 with NumPy

2. Build the Workspace

colcon build

3. Source the Environment

source install/local_setup.bash

Hardware Requirements

Flight Controller:

  • ArduPilot-compatible board (Pixhawk, Cube, etc.)
  • Firmware: ArduCopter 4.3+

Onboard Computer:

  • Raspberry Pi 4/5 (8GB recommended)
  • Ubuntu 22.04 or compatible OS
  • Serial UART enabled (GPIO 14/15)

Sensors:

  • 360° 2D LiDAR scanner (RPLIDAR A1/A2, YDLIDAR X2/X4, etc.)
  • IMU (provided by flight controller)

Usage

Simulation with Gazebo (Docker)

1. Allow X11 Access

xhost +SI:localuser:root

2. Build Gazebo Docker Image

sudo docker build .docker/ardupilot-sim/. -t gazebo_ros

3. Launch Simulation

sudo docker run --rm -it \
  --device nvidia.com/gpu=all \
  -v /tmp/.X11-unix:/tmp/.X11-unix:rw \
  -v $(pwd)/.gazebo:/root/buildings \
  -e DISPLAY=$DISPLAY \
  gazebo_ros

This launches Gazebo Harmonic with ArduPilot SITL and custom world models. When the MAVProxy console comes up, run:

setorigin 1 1 0

Launch Complete System

1. Launch and Configure MAVROS

ros2 launch silent_bringup mavros.launch fcu_url:=tcp://172.17.0.2:5770@

ros2 param set /mavros/mavros use_sim_time true

You may need to change the container IP address (172.17.0.2) to match your Docker network.

This starts:

  • The MAVROS connection to ArduPilot
  • Simulation-time configuration for the /mavros/mavros node

2. Launch SLAM Tools

ros2 launch silent_nav2 nav2_tf.launch.py use_sim_time:=True use_rviz:=False

This starts:

  • Robot Description Publisher
  • Cartographer
  • Navigation Stack 2

3. Launch Control System

ros2 launch silent_bringup silent_system.launch.py use_sim_time:=True

This starts:

  • Vehicle controllers (altitude, obstacle avoidance, Nav2, window)
  • Vehicle state watcher (handles takeoff/landing)
  • Window detector nodes (width, height, planner)
  • Explorer node for autonomous exploration

Start Autonomous Mission

Run the commands below inside the MAVProxy console:

mode GUIDED
arm throttle

Visualization with RViz

ros2 launch silent_nav2 nav2.launch.py use_rviz:=True use_sim_time:=True  # Includes RViz config

NOTE: RViz is not included in flake.nix; you must install it manually.

View:

  • Real-time costmaps (local + global)
  • SLAM-generated occupancy grid
  • Planned paths and trajectories
  • Window detection markers
  • Obstacle avoidance zones

Configuration

Key Parameter Files

File                          Package                   Purpose
nav2_params.yaml              silent_nav2               Nav2 stack configuration (planner, controller, costmaps)
drone_2d.lua                  silent_slam               Cartographer SLAM parameters
apm_config.yaml               silent_bringup            MAVROS/ArduPilot connection settings
controller_params.yaml        silent_controllers        PID gains, filter coefficients, timeouts
window_detector_params.yaml   silent_window_detector    Detection thresholds, safety margins

Example: Tuning Altitude Controller

Edit src/silent_controllers/config/controller_params.yaml:

vehicle_controller:
  ros__parameters:
    ac_initial_altitude: 1.45  # Initial target altitude (m)
    ac_kp: 0.4                 # Proportional gain
    ac_ki: 0.065               # Integral gain
    ac_kd: 0.0                 # Derivative gain
    ac_alpha: 0.55             # Low-pass filter coefficient
    ac_output_min: -1.0        # Min vertical velocity (m/s)
    ac_output_max: 1.0         # Max vertical velocity (m/s)
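To see how these parameters fit together, here is a generic PID loop with an exponential low-pass filter on the altitude measurement and output clamped to the velocity limits. The parameter names mirror controller_params.yaml, but the loop details are an assumption, not necessarily the project's exact implementation.

```python
class AltitudePID:
    """PID altitude controller with a low-pass-filtered measurement.

    Sketch only: parameter names follow controller_params.yaml, the
    control-loop structure is a generic PID assumed for illustration.
    """

    def __init__(self, kp=0.4, ki=0.065, kd=0.0, alpha=0.55,
                 out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd, self.alpha = kp, ki, kd, alpha
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None
        self.filtered = None

    def update(self, target, measured, dt):
        # Exponential low-pass filter on the altitude measurement
        self.filtered = (measured if self.filtered is None
                         else self.alpha * measured + (1 - self.alpha) * self.filtered)
        error = target - self.filtered
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        # Clamp to the configured vertical velocity limits
        return max(self.out_min, min(self.out_max, out))

pid = AltitudePID()
print(round(pid.update(target=1.45, measured=0.0, dt=0.05), 3))  # positive climb command
```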

Example: Adjusting Window Detection Safety

Edit src/silent_window_detector/config/window_detector_params.yaml:

width_detector:
  ros__parameters:
    safety_corner_range: 0.5   # Min range change for corner (m)
    safety_area: 1.0           # Min clearance for passage (m)
    targets_length: 1.0        # Distance of waypoints from window (m)
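Here safety_corner_range is the minimum jump between consecutive range readings for a corner candidate. The discontinuity check can be sketched as below; this is an illustration of the idea, not the project's actual detector.

```python
def find_corner_indices(ranges, threshold=0.5):
    """Return indices where consecutive LiDAR ranges jump by more than
    `threshold` metres, i.e. candidate window-edge corners. Sketch only.
    """
    return [
        i for i in range(1, len(ranges))
        if abs(ranges[i] - ranges[i - 1]) > threshold
    ]

# Wall at ~2 m with a window (far readings) in the middle of the scan
scan = [2.0, 2.0, 2.1, 6.0, 6.1, 6.0, 2.1, 2.0]
print(find_corner_indices(scan))  # [3, 6]
```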

Hardware Configuration

Enable UART on Raspberry Pi:

Edit /boot/firmware/config.txt:

enable_uart=1
dtoverlay=disable-bt

Set permissions:

sudo usermod -aG dialout $USER
sudo chmod 666 /dev/serial0

MAVROS Connection:

Edit src/silent_bringup/params/apm_config.yaml:

fcu_url: /dev/serial0:57600  # Serial connection
# or
fcu_url: udp://:14550@       # UDP connection for simulation

Project Structure

altair-silent/
├── .docker/                    # Docker simulation environment
│   └── ardupilot-sim/          # ArduPilot SITL + Gazebo Harmonic
├── .gazebo/                    # Custom Gazebo world models (Ev, Ev2, EvReal)
├── src/
│   ├── silent_bringup/         # System launch files & MAVROS config
│   ├── silent_controllers/     # Flight control modules (5+ controllers)
│   ├── silent_description/     # URDF robot models
│   ├── silent_explorer/        # Autonomous exploration with corner detection
│   ├── silent_msgs/            # Custom ROS2 messages/services
│   ├── silent_nav2/            # Nav2 configuration & behavior trees
│   ├── silent_slam/            # Cartographer 2D SLAM integration
│   ├── silent_utils/           # Shared utilities (geometry, state factories)
│   └── silent_window_detector/ # Window detection pipeline (3 nodes)
├── flake.nix                   # Nix development environment
├── LICENSE                     # MIT License
└── README.md                   # This file

~5,850 lines of Python code across 8 packages (excluding tests).


Contributing

Contributions are welcome!

Development Workflow

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/your-feature
  3. Make changes following the code style guidelines in AGENTS.md
  4. Commit changes: git commit -m "feat: add your feature"
  5. Push to branch: git push origin feature/your-feature
  6. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details


Acknowledgments

Team Altair - Teknofest 2025

Core Developers:

  • VKWHM - Project lead, system architecture, control systems
  • Wueenzy - LiDAR perception, corner detection algorithms, exploration logic

Special Thanks

  • ROS2 Community - For the Humble distribution and comprehensive documentation
  • ArduPilot Team - For the robust open-source flight controller firmware
  • Google Cartographer Team - For the excellent SLAM implementation
  • Nav2 Contributors - For the flexible navigation stack
  • Teknofest Organizers - For hosting the International UAV Competition and inspiring innovation

Contact & Links


Built with ❤️ by Team Altair for Teknofest 2025
