A Python visualization tool for designing and testing robot search algorithms for the University Rover Challenge (URC) competition.
In the URC competition, you're given approximate target coordinates (within a 10m radius) and must search that area. This tool helps you:
- Visualize different search algorithms in real-time
- See coverage maps showing searched vs unsearched areas
- Test with realistic camera specifications (Intel RealSense D435)
- Optimize your search patterns before field testing
Features:

- Dual visualization: 2D top-down and 3D perspective views
- Grid-based coverage: Black squares (unseen) turn green (seen) as robot explores
- Realistic camera FOV: 87° × 58° field of view, 3m ideal range
- 10m search radius: Matches URC competition specs
- Pluggable algorithms: Easy to add your own search functions
- Coverage metrics: Track percentage of area covered in real-time
Install the dependencies and run the tool:

```shell
pip install -r requirements.txt
python main.py
```

Select a search algorithm when prompted and watch it visualize!
Search algorithms are simple Python functions. Example:

```python
def my_search(robot: SearchRobot, time_step: float):
    """My custom search algorithm."""
    robot.move_forward(robot.speed * time_step)
    robot.turn(5)  # Turn 5 degrees each step
```

See search_algorithms.py for detailed examples and documentation.
Included algorithms:

- Spiral Search - Expanding spiral from center
- Lawnmower Search - Back-and-forth rows
- Expanding Square - Growing square pattern
- Random Walk - Randomized exploration
- Star Pattern - Radial rays from center
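A minimal spiral can be sketched with the movement API described below. The `SearchRobot` stub here is illustrative only (the real class lives in robot.py), and the bundled `spiral_search` in search_algorithms.py may use a different scheme:

```python
import numpy as np

class SearchRobot:
    """Illustrative stand-in for the tool's SearchRobot class."""
    def __init__(self, speed=1.0):
        self.x = self.y = 0.0
        self.heading = 0.0  # degrees; 0 = North, clockwise-positive
        self.speed = speed

    def move_forward(self, distance):
        # With 0 deg = North (+y) and clockwise-positive headings,
        # the direction vector is (sin(h), cos(h)).
        rad = np.radians(self.heading)
        self.x += distance * np.sin(rad)
        self.y += distance * np.cos(rad)

    def turn(self, angle):
        self.heading = (self.heading + angle) % 360

def spiral_search_sketch(robot, time_step):
    """Expanding spiral: turn slightly less each step so the radius grows."""
    if not hasattr(robot, '_spiral_turn'):
        robot._spiral_turn = 20.0  # initial turn per step, degrees
    robot.move_forward(robot.speed * time_step)
    robot.turn(robot._spiral_turn)
    robot._spiral_turn = max(2.0, robot._spiral_turn * 0.995)  # widen slowly
```

The decay factor and turn limits are tuning knobs: a slower decay gives tighter, more overlapping loops; the 2° floor caps the spiral's maximum radius.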
Movement commands:

- `robot.move_forward(distance)` - Move forward
- `robot.turn(angle)` - Turn (positive = clockwise)
- `robot.set_heading(angle)` - Set absolute heading

Robot attributes:

- `robot.x, robot.y` - Current position
- `robot.heading` - Current heading (0° = North)
- `robot.camera_range` - Camera range (3m)
- `robot.fov_horizontal` - Field of view (87°)
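The heading convention (0° = North, clockwise-positive) differs from the usual math convention (0° = East, counter-clockwise). A small helper, shown here as an illustrative sketch rather than part of the tool's API, makes the conversion explicit:

```python
import numpy as np

def heading_to_vector(heading_deg):
    """Unit direction vector for a heading where 0 deg = North (+y)
    and positive angles rotate clockwise (90 deg = East, +x)."""
    rad = np.radians(heading_deg)
    return np.array([np.sin(rad), np.cos(rad)])

print(np.round(heading_to_vector(0)))   # [0. 1.] -> North
print(np.round(heading_to_vector(90)))  # [1. 0.] -> East
```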
Edit config.py to adjust:
- Search radius (default: 10m)
- Grid size (default: 0.5m squares)
- Robot speed (default: 1.0 m/s)
- Camera specifications
- Visualization settings
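A configuration module covering the defaults above might look like the following sketch. Only `UPDATE_INTERVAL` is named elsewhere in this README; the other constant names are illustrative and may differ from the real config.py:

```python
# Illustrative config constants matching the documented defaults;
# the actual names in config.py may differ.
SEARCH_RADIUS = 10.0           # m, URC search area radius
GRID_SIZE = 0.5                # m, coverage grid cell edge
ROBOT_SPEED = 1.0              # m/s
CAMERA_FOV_HORIZONTAL = 87.0   # degrees (Intel RealSense D435)
CAMERA_FOV_VERTICAL = 58.0     # degrees
CAMERA_RANGE = 3.0             # m, ideal depth range
UPDATE_INTERVAL = 50           # ms between visualization frames
```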
Intel RealSense D435
- Horizontal FOV: 87°
- Vertical FOV: 58°
- Ideal range: 0.3m - 3m
- The visualization uses these real specs for an accurate field-of-view simulation
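These specs determine the ground swath the camera covers, which is useful when choosing row spacing for the lawnmower pattern. As a flat-ground approximation (ignoring camera tilt and terrain), the view width at distance r is 2·r·tan(FOV/2):

```python
import math

def footprint_width(fov_deg, range_m):
    """Width of the camera's view at a given distance
    (flat-ground approximation, ignoring tilt)."""
    return 2 * range_m * math.tan(math.radians(fov_deg / 2))

print(round(footprint_width(87, 3.0), 2))  # horizontal swath at 3 m, ~5.69 m
```

This suggests lawnmower rows spaced somewhat under the ~5.7 m swath to leave overlap between passes.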
2D view:

- Black squares: Unseen areas
- Green squares: Areas the robot has seen
- Red circle: Robot position
- White arrow: Robot heading
- Cyan wedge: Camera field of view
- Yellow line: Robot's path
3D view:

- Black/flat squares: Unseen (height: 0.05m)
- Green/raised squares: Seen (height: 0.1m)
- Red cone: Robot with heading indicator
- Cyan cylinder: Search boundary
```
visualization/
├── main.py               # Entry point with example algorithms
├── robot.py              # Robot class with movement and control
├── environment.py        # Search environment and coverage tracking
├── visualizer.py         # 2D and 3D visualization
├── config.py             # Configuration parameters
├── search_algorithms.py  # Algorithm examples and documentation
├── requirements.txt      # Python dependencies
└── README.md             # This file
```
Tips for writing search algorithms:

- Test incrementally: Start simple, add complexity
- Use the camera specs: 3m range, 87° FOV
- Optimize coverage: Minimize overlap, maximize new areas
- Consider efficiency: Time matters in competition
- Handle boundaries: Stay within 10m radius
- Track state: Use robot attributes to store algorithm state
```python
# In main.py, add your algorithm function:
def my_custom_search(robot: SearchRobot, time_step: float):
    """Your algorithm description."""
    # Your logic here
    robot.move_forward(1.0)

# Then add it to the ALGORITHMS dictionary:
ALGORITHMS = {
    '1': ('Spiral Search', spiral_search),
    '2': ('My Custom Search', my_custom_search),  # Add here
    # ...
}
```

The visualization displays:
- Coverage %: Percentage of search area seen
- Position: Current robot coordinates
- Heading: Current robot direction
- Path: Visual trail of robot movement
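The coverage metric follows directly from the grid: the percentage of cells inside the 10m circle that have been seen. The sketch below shows one way to compute it (the actual implementation in environment.py may differ):

```python
import numpy as np

def coverage_percent(seen, grid_size=0.5, radius=10.0):
    """Percentage of grid cells inside the search circle marked seen.

    seen: 2D boolean array covering the square [-radius, radius]^2,
    one entry per grid cell (40x40 with the default 0.5 m cells).
    """
    n = seen.shape[0]
    # Coordinates of each cell center
    coords = -radius + grid_size / 2 + grid_size * np.arange(n)
    xx, yy = np.meshgrid(coords, coords)
    inside = xx**2 + yy**2 <= radius**2  # mask of cells within the circle
    return 100.0 * np.count_nonzero(seen & inside) / np.count_nonzero(inside)
```

Counting only cells inside the circle matters: with a square array, up to ~21% of cells lie outside the 10m boundary and should not dilute the percentage.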
To keep per-algorithm state between calls, stash it on the robot:

```python
def stateful_search(robot: SearchRobot, time_step: float):
    if not hasattr(robot, '_my_state'):
        robot._my_state = {'phase': 1, 'counter': 0}
    state = robot._my_state
    # Use state in your algorithm
```

In your algorithm, you can check boundaries:

```python
distance_from_center = np.sqrt(robot.x**2 + robot.y**2)
if distance_from_center > 9:  # Near edge
    # Point back toward the center (0 deg = North convention)
    robot.set_heading(np.degrees(np.arctan2(-robot.x, -robot.y)))
```

Troubleshooting:

- Visualization is slow: Reduce UPDATE_INTERVAL in config.py or lower the grid resolution
- Robot leaves the boundary: Add boundary checking in your algorithm
- Coverage seems wrong: Check that the camera FOV calculations match your expectations
This tool is specifically designed for URC search tasks where:
- You receive approximate target coordinates
- Search area is within 10m radius
- You need efficient coverage patterns
- Time optimization matters
- Camera specifications are critical
Good luck with your URC competition! 🚀