
# Madhu Babu Gopisetti - Robotics & Full-Stack Software Engineer

## About Me

I am a passionate Robotics Software Engineer and Full-Stack Engineer with 4+ years of experience building production-grade systems in regulated, high-reliability environments. I specialize in autonomous mobile robots, robotic manipulators, multi-robot systems using ROS2, and end-to-end software development across frontend, backend, and AI/ML domains.

## 📫 Contact Information

## 🚀 Projects

### 🤖 Autonomous Mobile Robot — NEXUS Series

#### NEXUS-AMR (Keyboard Control)

- Built a differential-drive mobile robot in ROS2 Jazzy and Gazebo using URDF/XACRO.
- Implemented SLAM using slam_toolbox, AMCL for localization, and the full Nav2 navigation stack.
- Result: a complete classical autonomous navigation stack — drive, map, localize, and navigate to goals reliably.

**Technology Stack:** ROS2 · Gazebo · Nav2 · SLAM Toolbox · AMCL · URDF/XACRO · RViz2 · Python
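Under the hood, a differential-drive base turns each commanded body twist (linear velocity `v`, angular velocity `w`) into left and right wheel speeds. A minimal sketch of that conversion, using illustrative wheel dimensions rather than the robot's real ones:

```python
# Differential-drive kinematics: map a body twist (v, w) to wheel angular
# velocities -- the conversion a diff-drive base performs before commanding
# its motors. Both dimensions below are illustrative assumptions.
WHEEL_RADIUS = 0.05  # metres (assumed)
TRACK_WIDTH = 0.30   # metres between wheel centres (assumed)

def twist_to_wheel_speeds(v: float, w: float) -> tuple[float, float]:
    """Return (left, right) wheel angular velocities in rad/s."""
    v_left = v - w * TRACK_WIDTH / 2.0   # linear speed of the left wheel
    v_right = v + w * TRACK_WIDTH / 2.0  # linear speed of the right wheel
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure rotation: the wheels spin at equal speeds in opposite directions.
left, right = twist_to_wheel_speeds(0.0, 1.0)
```

In simulation a Gazebo diff-drive plugin typically performs this mapping from the `/cmd_vel` topic; the same arithmetic applies on hardware.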


#### NEXUS-ADS (OpenCV Automated)

- Extended NEXUS-AMR with camera and LiDAR sensors and OpenCV-based perception nodes for docking target detection.
- Built autonomous control nodes to convert vision output into motion commands — no keyboard required.

**Technology Stack:** ROS2 · OpenCV · LiDAR · Nav2 · cv_bridge · Python
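The heart of a docking perception node is reducing a detected target region to a centroid and a pixel offset from the image centre. A self-contained sketch using a binary NumPy mask; in an OpenCV pipeline, `cv2.moments` on a colour-thresholded image would produce the same centroid:

```python
import numpy as np

def mask_centroid(mask: np.ndarray):
    """Return the (x, y) centroid of nonzero pixels, or None if none exist."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # target not visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic frame: a 20x20 target blob to the right of the image centre.
mask = np.zeros((100, 100), dtype=np.uint8)
mask[40:60, 70:90] = 1

cx, cy = mask_centroid(mask)
offset = cx - mask.shape[1] / 2.0  # positive offset: steer toward the right
```

The control node then turns this pixel offset into a steering command.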


#### NEXUS-ADS-INTEL (ML Automated)

- Replaced rule-based vision with YOLO object detection and LiDAR feature extraction.
- Integrated sensor fusion (camera + LiDAR) and trained Logistic Regression and Random Forest models for autonomous navigation decisions.
- Built dataset collection, training, and real-time inference pipelines fully inside ROS2.

**Technology Stack:** ROS2 · YOLO · scikit-learn · NumPy · pandas · LiDAR · Python
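The decision stage can be pictured as a classifier over fused features: a camera-derived target offset concatenated with a LiDAR-derived minimum range, mapped to a go/stop action. A toy scikit-learn sketch; the feature names and the six-sample dataset are illustrative, not the project's real data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fused feature rows: [target_offset_px (camera), min_range_m (LiDAR)].
X = np.array([
    [ 30.0, 2.5], [ 10.0, 3.0], [-20.0, 2.8],  # path clear     -> go (1)
    [  5.0, 0.3], [-15.0, 0.4], [ 25.0, 0.2],  # close obstacle -> stop (0)
])
y = np.array([1, 1, 1, 0, 0, 0])

clf = LogisticRegression().fit(X, y)

# At inference time each fused LiDAR+camera measurement becomes one row.
action = clf.predict([[0.0, 0.25]])[0]  # obstacle 0.25 m directly ahead
```

In the real pipeline the same `fit`/`predict` pattern runs inside ROS2 nodes, with recorded drives providing the training rows.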


#### ROBOT: Real Hardware Implementation

- Assembled a real differential-drive robot using a Raspberry Pi 5 and an RP2040 MCU.
- Designed the full power and motor control stack (BTS7960 drivers, GB37 motors with encoders), integrated a YDLIDAR G2, and deployed the full SLAM + Nav2 stack on physical hardware.

**Technology Stack:** ROS2 · Raspberry Pi · Arduino · YDLIDAR G2 · Nav2 · SLAM Toolbox · Ubuntu · Python
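On hardware, pose estimation starts from encoder ticks: each control cycle converts left/right tick deltas into wheel travel and integrates the pose. A sketch of that odometry update; the ticks-per-revolution and geometry are illustrative values, not the real robot's calibration:

```python
import math

TICKS_PER_REV = 1320   # quadrature counts per wheel revolution (assumed)
WHEEL_RADIUS = 0.0335  # metres (assumed)
TRACK_WIDTH = 0.22     # metres (assumed)

def update_pose(x, y, theta, d_left_ticks, d_right_ticks):
    """Integrate one odometry step from encoder tick deltas."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV  # metres per tick
    d_left = d_left_ticks * per_tick
    d_right = d_right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0          # distance moved by the base
    d_theta = (d_right - d_left) / TRACK_WIDTH   # heading change
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# One full revolution on both wheels: straight-line motion along +x.
x, y, theta = update_pose(0.0, 0.0, 0.0, 1320, 1320)
```

The resulting pose feeds the `/odom` frame that SLAM and Nav2 consume.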


### 🦾 Autonomous Grasping System — NEXUS Manipulator Series

#### NEXUS-MM (Keyboard Control)

- Built a 6-DOF robotic arm in ROS2 using URDF/XACRO and integrated it with ros2_control for joint-level control in Gazebo and RViz.

**Technology Stack:** ROS2 · ros2_control · Gazebo · RViz2 · URDF/XACRO · MoveIt · Python
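Joint-level control of an arm comes down to sampling smooth setpoints between waypoints: a trajectory controller interpolates each joint from its current to its target position over a duration. A linear-interpolation sketch (real ros2_control controllers typically use splines); the waypoints are illustrative:

```python
# Joint-space trajectory interpolation for a 6-DOF arm: sample the joint
# vector at time t between a start and end waypoint. Linear here for
# clarity; trajectory controllers usually fit smoother splines.
def interpolate(q_start, q_end, t, duration):
    """Linearly interpolate a joint vector at time t in [0, duration]."""
    s = min(max(t / duration, 0.0), 1.0)  # clamp progress to [0, 1]
    return [a + s * (b - a) for a, b in zip(q_start, q_end)]

home = [0.0] * 6                            # all joints at zero
target = [0.5, -0.3, 0.8, 0.0, 0.6, -0.2]   # illustrative goal (rad)
midpoint = interpolate(home, target, 1.0, 2.0)
```

Each sampled vector would be published as one point of a joint trajectory for the controller to track.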


#### NEXUS-AGS (OpenCV Automated)

- Added a wrist-mounted camera and built a visual servoing pipeline for object detection, centroid extraction, and closed-loop end-effector alignment.

**Technology Stack:** ROS2 · OpenCV · ros2_control · cv_bridge · Gazebo · Python
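Closed-loop alignment in a visual-servoing pipeline is essentially a proportional controller on pixel error: each frame, the end-effector is nudged so the object centroid moves toward the image centre. A sketch with an idealised one-pixel-per-command-unit plant model and an illustrative gain:

```python
def servo_step(cx: float, image_cx: float, gain: float = 0.2) -> float:
    """One control iteration: correction proportional to the pixel error."""
    return gain * (image_cx - cx)

# Simulate the loop: the (idealised) plant shifts the centroid by the command.
cx, image_cx = 200.0, 320.0   # centroid starts 120 px left of centre
for _ in range(30):
    cx += servo_step(cx, image_cx)

error = image_cx - cx  # residual error decays geometrically (factor 1 - gain)
```

In the real pipeline the command would move the end-effector, and the next camera frame closes the loop.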


#### NEXUS-AGS-INTEL (ML Automated)

- Integrated YOLO-based perception for object detection/classification and built a full ML pipeline inside ROS2 for dynamic grasp behavior.

**Technology Stack:** ROS2 · YOLO · PyTorch · scikit-learn · ros2_control · Python
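Downstream of YOLO, the grasp logic has to choose one detection and turn its bounding box into a grasp point. A sketch of that selection step; the detection tuples (class, confidence, box corners) are illustrative stand-ins for real inference output:

```python
# Pick the highest-confidence detection of the requested class and derive a
# grasp point from its bounding-box centre. Detections are illustrative.
def pick_grasp_target(detections, wanted_class):
    """Return the (cx, cy) centre of the best box for wanted_class, or None."""
    best = None
    for cls, conf, (x1, y1, x2, y2) in detections:
        if cls == wanted_class and (best is None or conf > best[0]):
            best = (conf, ((x1 + x2) / 2.0, (y1 + y2) / 2.0))
    return None if best is None else best[1]

dets = [
    ("cup", 0.91, (100, 120, 160, 200)),
    ("cup", 0.55, (300, 90, 340, 150)),
    ("box", 0.88, (200, 200, 260, 260)),
]
grasp_xy = pick_grasp_target(dets, "cup")
```

The pixel grasp point would then be projected into the arm's frame before planning the reach.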


### 🕷️ Legged Robot — NEXUS Spider

#### NEXUS SPIDER (Simulation)

- Designed the full spider robot in Blender, built the URDF/XACRO model, and implemented gait scripts in ROS2 + Gazebo for stable standing and walking.

**Technology Stack:** ROS2 · Blender · Gazebo · RViz2 · URDF/XACRO · ros2_control · Python
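A basic walking script boils down to a phase-offset oscillator: each leg's swing joint follows the same sinusoid shifted in phase, so diagonal pairs move together. A sketch with illustrative amplitude, frequency, and phases (a trot-style pattern for four legs):

```python
import math

# Per-leg phase offsets: diagonal pairs (0, 3) and (1, 2) move in unison.
LEG_PHASES = [0.0, math.pi, math.pi, 0.0]

def leg_angles(t, amplitude=0.3, freq=1.0):
    """Swing-joint angle (rad) for each leg at time t (seconds)."""
    return [amplitude * math.sin(2 * math.pi * freq * t + p)
            for p in LEG_PHASES]

# Quarter period: the two diagonal pairs sit at opposite extremes.
angles = leg_angles(0.25)
```

Sampling this at the control rate and publishing the angles as joint commands yields the coordinated stepping motion.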


#### SPIDER ROBOT: Real Hardware

- Built a real multi-legged robot using SG90 servos, a PCA9685 PWM controller, and an ESP8266 MCU.
- Implemented servo calibration, joint zeroing, and motion sequencing for coordinated gait control.

Technology Stack:

embedded-c    arduino    ESP8266    PCA9685    cpp
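Servo calibration on a PCA9685 comes down to mapping joint angles to the chip's 12-bit on-time counts at a 50 Hz update rate. The firmware is C, but the arithmetic is the same in any language; a Python sketch with typical (assumed) SG90 pulse endpoints that would be tuned per servo during calibration:

```python
PWM_FREQ_HZ = 50
PERIOD_US = 1_000_000 // PWM_FREQ_HZ      # 20,000 us per PWM cycle
MIN_PULSE_US, MAX_PULSE_US = 500, 2400    # ~0 deg .. ~180 deg (assumed)

def angle_to_counts(angle_deg: float) -> int:
    """Map 0..180 degrees to the PCA9685's 0..4095 on-time counts."""
    angle_deg = min(max(angle_deg, 0.0), 180.0)  # clamp to servo range
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle_deg / 180.0
    return round(pulse_us * 4096 / PERIOD_US)

center = angle_to_counts(90.0)  # neutral position used during joint zeroing
```

Joint zeroing then amounts to finding, per servo, the small count offset at which the leg is physically straight.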


### 🚀 NEXUS FLEET — Full Mobile Manipulation System

(Simulation-Based Autonomous Mobile Manipulation)

- Integrated NEXUS-ADS and NEXUS-AGS into a single coordinated ROS2 platform using Nav2 for mobile base autonomy and ros2_control for manipulator control.
- Full mission pipeline: navigate → align → deploy arm → pick object → navigate → place object.

**Technology Stack:** ROS2 · Nav2 · ros2_control · MoveIt · Gazebo · OpenCV · YOLO · SLAM Toolbox · Python
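The mission pipeline above is naturally a small sequencer: run each stage's action in order and stop on the first failure. A sketch where `execute_stage` stands in for the real Nav2, vision, and manipulation calls (the stage names are illustrative):

```python
# Stage order mirrors the mission pipeline:
# navigate -> align -> deploy arm -> pick -> navigate -> place.
STAGES = ["NAVIGATE_TO_OBJECT", "ALIGN", "DEPLOY_ARM",
          "PICK", "NAVIGATE_TO_GOAL", "PLACE"]

def run_mission(execute_stage):
    """Run stages in order; report the first stage that fails."""
    for stage in STAGES:
        if not execute_stage(stage):
            return ("FAILED", stage)
    return ("DONE", None)

log = []
# Stand-in executor: record each stage and report success.
status = run_mission(lambda s: log.append(s) or True)
```

In the real system each stage wraps an action call (a Nav2 goal, a servoing routine, a grasp sequence) and the sequencer reacts to its result.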


## 🧰 Technologies & Tools

### 🤖 Robotics

ROS2 · ROS1 · Gazebo · MoveIt · Nav2 · OpenCV · Blender

### 💻 Programming

Python · C++ · JavaScript · TypeScript · C

### 🌐 Frontend

Angular · React · Next.js · Vue.js

### ⚙️ Backend & DevOps

Node.js · Docker · Jenkins · Linux · MongoDB · MySQL

### 🧠 AI / Machine Learning

PyTorch · scikit-learn · pandas · YOLO


## 📌 Pinned Repositories

1. **Pi_Li_AMR** (Python): A real autonomous mobile robot implemented using ROS2 Jazzy on Raspberry Pi 5, integrating RP2040-based motor control, encoder odometry, YDLiDAR SLAM, and Nav2 for goal-based navigation, demonstrat…
2. **ROS2_NEXUS_ADS** (Python): Autonomous Docking System capable of map-based navigation and vision-guided docking using closed-loop control.
3. **ROS2_NEXUS_FLEET** (Python): Autonomous mobile manipulation system that coordinates navigation, object pickup, and docking by integrating a mobile base with a robotic arm through perception-driven, closed-loop control.
4. **ROS2_NEXUS_ADS_INTEL** (Python): Autonomous mobile robot navigation with machine-learning-based perception, sensor fusion, decision-level control, and obstacle avoidance.
5. **ROS2_NEXUS_AGS_INTEL** (Python): Autonomous robotic manipulation with machine-learning-based object detection and vision-guided control.
6. **ROS2_NEXUS_SPIDER** (Python): Autonomous Quadruped System — legged robot simulation that implements joint-space control, coordinated limb motion, and basic locomotion behaviors using closed-loop control.