I specialize in SLAM, localization, and sensor fusion for large-scale autonomous mobile robots. My work spans from NASA lunar rover programs at Bosch to co-building CTDI's Enzo AMR, which scaled to 125+ robots across production warehouses in the U.S. and Europe.
About
I'm a Robotics Systems Engineer who has spent the last few years deep in the trenches of autonomous mobile robot development, from early-stage R&D to full-scale production deployment.
Most recently, I co-led the development of CTDI's Enzo AMR platform from the ground up, scaling it to 125+ robots across multiple facilities in the U.S. and Europe. I owned the full localization and mapping pipeline, including LiDAR-based SLAM, scan matching, AMCL, sensor fusion, and drift correction, stabilizing operations across dynamic warehouse environments of 170,000+ sq ft.
Before that, I worked at Bosch on NASA lunar rover programs, developing smart docking systems and vision-based navigation with high pose estimation accuracy.
I care about the hard problems: long-aisle drift, localization instability, sensor timing synchronization, and making robots work consistently in messy, real-world environments, not just in simulation.
Built AMR systems from concept to 125+ robots in production across multiple continents.
Developed smart docking and vision-based navigation systems for NASA lunar rover programs at Bosch.
End-to-end ownership across SLAM, calibration, fleet management, PLC integration, and field deployment.
Solving drift, instability, and timing issues in 170K+ sq ft warehouse environments.
Featured
Work featured by Beckhoff Automation, Beckhoff YouTube, and Northeastern University.
CTDI developed the Enzo AMR in-house using Beckhoff's TwinCAT platform, going from a working prototype in two weeks to a fully autonomous production robot in 60 days. The platform runs a hybrid control architecture with TwinCAT/BSD and Linux via virtualization, enabling seamless communication between ROS2 and TwinCAT for real-time SLAM navigation and QR code guidance.
The Enzo features modular "toppers" for shelves, lifters, tuggers, and conveyors, forming a single adaptable platform for goods-to-person operations across CTDI's 100 global facilities. I'm named in the article as part of the core team that brought this system to life.
Read the full article
Beckhoff released a video feature on the Enzo AMR program showcasing the robot in action across warehouse operations. The video covers the full system architecture, from the Beckhoff C6032 IPC running TwinCAT to the ROS2 integration for SLAM-based navigation, modular topper system, and fleet deployment.
I'm featured in this video as part of the core engineering team that designed, built, and deployed the Enzo platform from prototype to production.
Watch on YouTube

Featured in Northeastern University's official LinkedIn post during National Robotics Week. I demonstrated a simulated self-driving car to President Aoun, showcasing how the vehicle collects and processes real-time data from Boston's streets for autonomous navigation.
The project involved building a full perception and data collection pipeline for autonomous driving, integrating sensor data from cameras and LiDAR to map urban environments. This work was part of the robotics program at Northeastern, where I developed hands-on experience with autonomous vehicle systems, computer vision, and real-time sensor processing.
View the post
Experience
From lunar rovers to warehouse-scale AMR fleets.
Co-led the development, stabilization, and deployment of the Enzo AMR platform from scratch. Scaled to 125+ robots across multiple U.S. and European facilities. Owned the full localization and mapping pipeline including LiDAR-based SLAM, scan matching, AMCL, sensor fusion (IMU + encoders + LiDAR), and drift correction. Stabilized localization across 170K+ sq ft warehouses, achieving 85%+ reliability in long-aisle conditions.
Built system-wide NTP time synchronization across LiDAR, Linux, BSD, and PLC subsystems. Co-developed Fleet Manager applications (WPF/.NET/C#), implemented Lua scripting for task orchestration, and integrated with ROS2 for real-time fleet communication.
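The core idea behind the drift correction described above can be illustrated with a toy complementary filter: a high-rate dead-reckoned heading (IMU + encoder integration) is gently pulled toward an absolute correction, such as a scan-match result against the map. This is a minimal sketch only, not the production pipeline; the function name and the gain `alpha` are illustrative.

```python
import math

def fuse_heading(gyro_heading, correction_heading, alpha=0.98):
    """Complementary filter: trust the high-rate integrated heading
    short-term, and pull toward an absolute correction long-term."""
    # Wrap the innovation to (-pi, pi] so the blend takes the short
    # way around the circle rather than spinning the long way.
    err = math.atan2(math.sin(correction_heading - gyro_heading),
                     math.cos(correction_heading - gyro_heading))
    return gyro_heading + (1 - alpha) * err
```

With `alpha = 0.98`, only 2% of the residual is applied per update, so occasional bad scan matches barely perturb the estimate while persistent drift is steadily corrected.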
Contributed to NASA lunar rover programs, developing smart docking and vision-based navigation systems. Achieved high pose estimation accuracy and improved sensor fusion and odometry performance for autonomous navigation in extraterrestrial environments.
Projects
Academic and personal work in robotics, perception, and autonomous systems.

Designed and built an affordable mobile robot platform capable of performing SLAM in indoor environments. Focused on achieving reliable mapping and localization while keeping hardware costs minimal.
Implemented a frontier-based exploration strategy enabling a robot to autonomously navigate and map unknown environments, selecting the next best frontier to maximize coverage efficiency.
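To give a flavor of the selection step, here is a minimal sketch assuming a NumPy occupancy grid with 0 = free, 1 = occupied, -1 = unknown. It uses a greedy nearest-frontier policy for simplicity; a coverage-maximizing variant like the one described above would score frontier clusters instead of individual cells.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) free cells bordering at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

def next_best_frontier(grid, robot_rc):
    """Greedy policy: pick the frontier cell nearest to the robot."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None  # no unknown space reachable: exploration complete
    return min(frontiers,
               key=lambda f: (f[0] - robot_rc[0])**2 + (f[1] - robot_rc[1])**2)
```

The robot repeatedly navigates to the returned cell, updates the map from new sensor data, and re-queries until `next_best_frontier` returns `None`.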


Built a pipeline that reconstructs 3D point clouds from 2D image sequences. Performs feature detection, matching, camera pose estimation, and triangulation to create dense 3D representations.
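The triangulation step at the end of such a pipeline can be sketched as a standard linear (DLT) triangulation from two views. The projection matrices below are illustrative; a real pipeline would recover them from feature matches and refine the result with bundle adjustment.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords."""
    # Each observation contributes two rows of the homogeneous system A X = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize to Euclidean coordinates
```

Running this per matched feature pair, across all image pairs with estimated poses, yields the point cloud that the dense-reconstruction stage then refines.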
Skills
Proficiency across the full robotics stack, from low-level sensor integration to fleet-scale deployment.
Contact
I'm interested in roles focused on autonomous vehicles (ADAS), AMR systems, and anything involving perception, localization, navigation, and scalable robotics architecture. If you're working on hard problems in autonomy, let's talk.