I am a passionate Robotics and Full-Stack Software Engineer with 4+ years of experience building production-grade systems in regulated, high-reliability environments. I specialize in autonomous mobile robots, robotic manipulators, and multi-robot systems in ROS2, along with end-to-end software development across frontend, backend, and AI/ML domains.
- 📞 Phone: +91 7981960932
- 📧 Email: madhubabugopisetti27@gmail.com
- 💼 LinkedIn: madhubabu-gopisetti
- Built a differential-drive mobile robot in ROS2 Jazzy and Gazebo using URDF/XACRO.
- Implemented SLAM using slam_toolbox, AMCL for localization, and the full Nav2 navigation stack.
- Result: A complete classical autonomous navigation stack — drive, map, localize, and navigate to goals reliably.
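At the base of a stack like this sits differential-drive kinematics: the Nav2 controller publishes a body twist, and a `diff_drive`-style controller turns it into wheel speeds. A minimal sketch of that conversion, using placeholder wheel geometry rather than the NEXUS-AMR's actual parameters:

```python
# Differential-drive kinematics sketch. WHEEL_RADIUS and WHEEL_SEPARATION
# are illustrative values, not the real robot's dimensions.

WHEEL_RADIUS = 0.05       # meters (assumed)
WHEEL_SEPARATION = 0.30   # meters (assumed)

def twist_to_wheel_speeds(linear_x: float, angular_z: float):
    """Convert a body twist (m/s, rad/s) into left/right wheel
    angular velocities (rad/s), as a diff-drive controller would."""
    v_left = (linear_x - angular_z * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    v_right = (linear_x + angular_z * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    return v_left, v_right
```

A pure forward twist yields equal wheel speeds; a pure rotation yields equal and opposite ones.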
- Extended NEXUS-AMR with camera and LiDAR sensors and OpenCV-based perception nodes for docking target detection.
- Built autonomous control nodes that convert vision output into motion commands, removing the need for keyboard teleoperation.
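The core of a vision-to-motion node like this is simple: find the target centroid in a binary detection mask, then steer proportionally to its horizontal offset from image center. A sketch of that idea with placeholder image width and gain (not the project's tuned values):

```python
# Centroid detection + proportional steering sketch.
IMG_WIDTH = 640
K_ANGULAR = 0.005  # rad/s per pixel of error (assumed gain)

def mask_centroid_x(mask):
    """Mean column index of nonzero pixels in a 2D binary mask;
    None if no target pixels are present."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    return sum(xs) / len(xs) if xs else None

def steering_command(cx):
    """Angular velocity that centers the target; 0 when nothing is seen."""
    if cx is None:
        return 0.0
    return K_ANGULAR * (IMG_WIDTH / 2.0 - cx)
```

In the real node the mask would come from an OpenCV threshold or detection step, and the command would be published as a Twist.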
NEXUS-ADS-INTEL (ML Automated)
- Replaced rule-based vision with YOLO object detection and LiDAR feature extraction.
- Integrated sensor fusion (Camera + LiDAR) and trained Logistic Regression and Random Forest models for autonomous navigation decisions.
- Built dataset collection, training, and real-time inference pipelines fully inside ROS2.
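The fusion-plus-inference step can be reduced to: concatenate per-sensor feature vectors, then score them with the trained classifier. A logistic-regression sketch with made-up weights (the real model's weights come from the training pipeline):

```python
import math

# Early fusion + logistic-regression decision sketch.
# WEIGHTS and BIAS are hypothetical, for illustration only.
WEIGHTS = [1.2, -0.8, 2.0, -1.5]   # one weight per fused feature
BIAS = -0.3

def fuse(camera_feats, lidar_feats):
    """Simple early fusion: concatenate per-sensor feature vectors."""
    return camera_feats + lidar_feats

def decide(features):
    """sigmoid(w.x + b) > 0.5 -> take the 'proceed' action."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z)) > 0.5
```

A Random Forest would replace `decide` with an ensemble vote, but the pipeline shape (collect, train, fuse, infer) stays the same.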
ROBOT: Real Hardware Implementation
- Assembled a real differential-drive robot using Raspberry Pi 5 and RP2040 MCU.
- Designed full power and motor control stack (BTS7960 drivers, GB37 encoded motors), integrated YDLIDAR G2, and deployed full SLAM + Nav2 stack on physical hardware.
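On real hardware the SLAM + Nav2 stack needs odometry from the encoded motors. A sketch of the standard tick-integration step, with placeholder resolution and geometry (per-robot calibration would replace these constants):

```python
import math

# Wheel-encoder odometry sketch: integrate tick deltas into a 2D pose.
TICKS_PER_REV = 1000      # assumed encoder resolution
WHEEL_RADIUS = 0.05       # m (assumed)
WHEEL_SEPARATION = 0.30   # m (assumed)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Advance (x, y, theta) by one encoder sample using the
    midpoint-heading approximation."""
    per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * per_tick
    d_right = d_ticks_right * per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

In a split Pi 5 / RP2040 design, the MCU typically counts ticks and streams deltas to the Pi, which runs this integration and publishes odometry for AMCL and Nav2.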
- Built a 6-DOF robotic arm in ROS2 using URDF/XACRO and integrated with ros2_control for joint-level control in Gazebo and RViz.
- Added wrist-mounted camera and built a visual servoing pipeline for object detection, centroid extraction, and closed-loop end-effector alignment.
NEXUS-AGS-INTEL (ML Automated)
- Integrated YOLO-based perception for object detection/classification and built a full ML pipeline inside ROS2 for dynamic grasp behavior.
- Designed the full spider robot in Blender, built URDF/XACRO model, and implemented gait scripts in ROS2 + Gazebo for stable standing and walking.
SPIDER ROBOT: Real Hardware
- Built a real multi-legged robot using SG90 servos, PCA9685 PWM controller, and ESP8266 MCU.
- Implemented servo calibration, joint zeroing, and motion sequencing for coordinated gait control.
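The servo side of this stack boils down to mapping a joint angle to the 12-bit on-time count the PCA9685 expects at 50 Hz. A sketch with typical SG90 pulse-width endpoints; in practice the calibration step replaces these with per-servo offsets:

```python
# Angle -> PCA9685 count sketch. Pulse-width endpoints are typical SG90
# values (assumed), refined per servo during calibration.
PWM_FREQ_HZ = 50
MIN_PULSE_US = 500     # ~0 degrees (assumed)
MAX_PULSE_US = 2500    # ~180 degrees (assumed)

def angle_to_counts(angle_deg):
    """Convert a servo angle to the PCA9685 12-bit 'off' count (0-4095)."""
    angle = max(0.0, min(180.0, angle_deg))
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle / 180.0
    period_us = 1_000_000 / PWM_FREQ_HZ   # 20,000 us per 50 Hz cycle
    return round(pulse_us / period_us * 4096)
```

Gait sequencing then becomes a matter of streaming timed sequences of these counts to the legs, with joint-zero offsets applied per servo.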
(Simulation-Based Autonomous Mobile Manipulation)
- Integrated NEXUS-ADS and NEXUS-AGS into a single coordinated ROS2 platform using Nav2 for mobile base autonomy and ros2_control for manipulator control.
- Full mission pipeline: navigate → align → deploy arm → pick object → navigate → place object.
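The mission pipeline above can be sketched as a small state machine. The state names mirror the pipeline; the transition logic is a simplified stand-in for the Nav2 and ros2_control action results that would drive it in ROS2:

```python
# Mission sequencer sketch: advance on success, retry on failure.
MISSION = ["NAVIGATE_TO_PICK", "ALIGN", "DEPLOY_ARM", "PICK",
           "NAVIGATE_TO_PLACE", "PLACE", "DONE"]

def next_state(state, step_succeeded):
    """Return the next mission state given the current step's outcome."""
    if not step_succeeded or state == "DONE":
        return state
    return MISSION[MISSION.index(state) + 1]
```

In the integrated platform, each state would wrap an action client (NavigateToPose for the base, a trajectory or gripper action for the arm) and feed its result flag into the sequencer.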