Autonomous Systems Engineer

Perception-driven autonomy, mapped and validated before deployment.

I design multi-sensor perception, SLAM, and simulation-first validation pipelines that let autonomous robots ship faster and more safely. Recent work centers on NVIDIA Isaac Sim digital twins and CAD-to-URDF asset pipelines for ROS 2 navigation and high-fidelity validation in complex real-world environments.

ROS 2 · C++ · Python · Gazebo · Isaac Sim · CUDA
Dany Rashwan's Portrait
Simulation-first validation: Reduced field testing cycles and improved autonomy release confidence.
Multi-sensor perception: LiDAR + RGB-D fusion for robust scene understanding in dynamic spaces.
ROS 2 native stacks: Nav2, behavior trees, and cloud-ready CI pipelines.

Impact Highlights

Metrics summarized from the case studies below (see “How we measured this” on each).

  • 60% less field-testing time via CI-driven simulation validation
  • 95%+ of regressions caught before hardware testing
  • 80% faster environment generation for digital twins
  • 99.9% collision-free operation in mixed human-robot deployments

Skill Signal

Each capability below is mapped to its impact, toolchain, and the work that proves it.

Focus

Full-stack autonomy delivery

Orchestrating perception, mapping, simulation, and validation to ship reliable robot behavior.

Perception · Mapping · Simulation · Autonomy · Validation
Depth

Impact

Reduced field validation cycles by shifting autonomy testing into simulation-first workflows.

Toolchain

  • ROS 2
  • Nav2
  • NVIDIA Isaac Sim
  • CAD-to-URDF
  • OpenUSD
  • OpenCV
  • PCL
  • Docker

Evidence

  • Real-to-Sim Pipeline
  • Perception-Driven Navigation
  • Simulation Validation CI
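
As a flavor of the Simulation Validation CI item above, here is a minimal, hypothetical pytest smoke test that launches a headless navigation scenario and gates the build on its exit status; the package name, launch file, arguments, and timeout are placeholders, not the actual project code.

    import subprocess

    def test_nav_scenario_headless():
        # Launch a headless simulation scenario; the launch file is assumed to
        # exit non-zero on a missed waypoint or a collision event.
        result = subprocess.run(
            ["ros2", "launch", "sim_validation", "nav_scenario.launch.py",
             "headless:=true", "scenario:=warehouse_loop"],
            capture_output=True,
            text=True,
            timeout=600,  # fail the CI job if the scenario hangs
        )
        assert result.returncode == 0, result.stdout[-2000:]

In practice a job like this would run inside the same Docker image used for headless simulation, so regressions surface before any hardware test.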

Featured Case Studies

Demo clip available on request. Images shown are AI-generated stand-ins because I can't share the original customer footage.

Additional Projects

Survival Code Vault

Feb 7, 2026 – Present

Encrypted, end-to-end autonomy codebases that run from MCU-class hardware to GPU workstations, demonstrating portable, deterministic systems design with optional accelerators and verified demos.

Autonomous Hospital Transport Robot

Jan 2024 – Dec 2024

Senior design project delivering hospital supplies with a ROS 2-based navigation stack, SLAM, and onboard perception.

Mood Music

Sep 12, 2023 – Oct 24, 2023

Web app that recommends Spotify playlists based on detected or selected mood signals.

Additional Engineering Systems (Private + Public)

2020 – Present

A collection of internal autonomy and perception systems spanning GPU-accelerated 3D mapping, ROS 2 middleware integration, SLAM experimentation, embedded robotics deployment, applied AI tooling, and full-stack engineering prototypes. Many repositories are maintained privately by default, with selected public work available on GitHub.

  • Real-time 3D reconstruction and volumetric mapping pipelines
  • ROS 2 stereo perception and navigation middleware
  • Simulation-first SLAM and localization evaluation workflows
  • Applied LLM and automation toolchains for autonomy research
  • Embedded-to-GPU scalable robotics software systems
  • Private deployments with code available upon request

Additional repositories are private by default and available for technical review upon request.

Core Competencies

Perception

Multi-sensor fusion across LiDAR, RGB-D, and IMU for robust scene understanding.

Mapping & SLAM

High-fidelity maps and localization with EKF-based state estimation and continuous refinement.
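
For a flavor of what "EKF-based state estimation" means in practice, below is a minimal predict/update sketch for a 2D constant-velocity state; the motion model and noise values are illustrative, not the deployed filter.

    import numpy as np

    def ekf_predict(x, P, dt, q=0.1):
        # Constant-velocity motion model for state [x, y, vx, vy]; with a
        # linear model the Jacobian equals the transition matrix F.
        F = np.array([[1.0, 0.0, dt, 0.0],
                      [0.0, 1.0, 0.0, dt],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        Q = q * np.eye(4)                    # process noise (placeholder)
        return F @ x, F @ P @ F.T + Q

    def ekf_update(x, P, z, r=0.05):
        # Position-only measurement z = [x, y]; H is the measurement Jacobian.
        H = np.array([[1.0, 0.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0, 0.0]])
        R = r * np.eye(2)                    # measurement noise (placeholder)
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        return x + K @ y, (np.eye(4) - K @ H) @ P

    # One predict/update cycle with a fused position fix (e.g., from scan matching).
    x, P = np.zeros(4), np.eye(4)
    x, P = ekf_predict(x, P, dt=0.05)
    x, P = ekf_update(x, P, z=np.array([0.1, -0.02]))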

Simulation & Digital Twins

NVIDIA Isaac Sim digital twins with CAD-to-URDF and USD asset pipelines built from real-world scans to close the sim-to-real gap.
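
As a toy illustration of the CAD-to-URDF idea (far simpler than the production pipeline), the sketch below wraps an exported mesh in a minimal URDF link using only the Python standard library; the mesh path, mass, and inertia values are placeholders.

    import xml.etree.ElementTree as ET

    def mesh_to_urdf_link(name, mesh_uri, mass=1.0):
        # Minimal URDF link: a visual mesh plus placeholder inertial values.
        link = ET.Element("link", name=name)
        visual = ET.SubElement(link, "visual")
        geometry = ET.SubElement(visual, "geometry")
        ET.SubElement(geometry, "mesh", filename=mesh_uri)
        inertial = ET.SubElement(link, "inertial")
        ET.SubElement(inertial, "mass", value=str(mass))
        ET.SubElement(inertial, "inertia", ixx="0.01", iyy="0.01", izz="0.01",
                      ixy="0.0", ixz="0.0", iyz="0.0")
        return link

    # Hypothetical asset path exported from the CAD/scan pipeline.
    robot = ET.Element("robot", name="shelf_unit")
    robot.append(mesh_to_urdf_link("base_link", "package://warehouse_assets/meshes/shelf.stl"))
    print(ET.tostring(robot, encoding="unicode"))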

Autonomy

Navigation, behavior orchestration, and mission logic for safe and efficient fleet operations.

About

Engineering Snapshot

I'm an Autonomous Systems Engineer at Rovex, focused on building perception pipelines, simulation-first validation workflows, and ROS 2 navigation stacks for real-world deployments. I earned my B.S. in Computer Engineering from the University of Florida and bring experience in robotics research, ML for healthcare, and simulation-driven testing.

Based in Gainesville, FL.

Toolbox

  • ROS 2, Nav2, Behavior Trees
  • NVIDIA Isaac Sim, Omniverse, Gazebo
  • SLAM, EKF-based sensor fusion
  • OpenCV, PCL, TensorFlow, OpenVINO
  • Docker, CI pipelines, cloud simulation

Resume

Dany Rashwan

Autonomous Systems Engineer · Perception · Mapping · Simulation · Autonomy

Building reliable autonomy stacks with ROS 2, simulation-first validation, and multi-sensor perception. Focused on shipping safe, scalable robotics systems in real-world environments.

About

Autonomous Systems Engineer specializing in robotics, machine learning, and simulation-first development. Experienced in designing, validating, and deploying robust autonomy stacks for complex real-world operations.

Core Skills

  • NVIDIA Isaac Sim / Omniverse digital twins
  • CAD-to-URDF + USD asset pipelines
  • ROS 2 autonomy stacks (Nav2, behavior trees)
  • Perception + mapping with LiDAR/RGB-D and EKF fusion

Programming

  • C++ / C / C#
  • Python
  • Java
  • JavaScript
  • MATLAB
  • Assembly
  • VHDL
  • SQL
  • Bash / Shell Scripting
  • HTML / CSS

Robotics & Middleware

  • ROS / ROS 2 (Nav2, Behavior Trees)
  • URDF / Xacro
  • SLAM & State Estimation (EKF-based fusion)
  • OpenCV & Point Cloud Library (PCL)
  • Sensor fusion & calibration
  • LiDAR & camera integration

Simulation & Digital Twins

  • NVIDIA Isaac Sim / Omniverse
  • Gazebo
  • OpenUSD / USD pipelines
  • CAD-to-URDF asset pipelines
  • Real-to-Sim Pipelines
  • Synthetic Data Generation (SDG)
  • NVIDIA Cosmos
  • Metropolis VSS
  • COLMAP
  • 3D Gaussian Splatting
  • fVDB/NanoVDB

Machine Learning & Deployment

  • TensorFlow / PyTorch
  • OpenVINO / TensorRT

DevOps & Cloud

  • Docker
  • Git / GitHub
  • CI for headless simulation
  • AWS

Embedded & Hardware

  • Jetson AGX
  • Raspberry Pi / Arduino
  • Embedded C, microcontrollers, FPGA
  • PCB & hardware design
  • Field-Oriented Control (FOC)
  • CAN bus communication
  • SolidWorks / Altium / Quartus

Domains

  • Robotics & Autonomous Systems
  • Perception & Autonomy Stack
  • Digital Twins & Simulation

Languages

  • English (Fluent)
  • Arabic (Native)

Experience

  • Simulation & Validation: Engineered high-fidelity digital twins in NVIDIA Isaac Sim and CAD-to-URDF conversion workflows to validate autonomy stacks and perception pipelines before deployment.
  • Autonomy & Control: Architected and optimized a ROS 2-based autonomy stack with navigation, mission control, and behavior-tree orchestration.
  • Perception & Sensor Fusion: Integrated camera and LiDAR pipelines with OpenCV and point-cloud processing for robust obstacle detection (see the sketch below).

Selected technologies: ROS 2, NVIDIA Isaac Sim, OpenCV, Docker
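
A minimal sketch of the kind of depth-image obstacle gating behind the perception bullet above; the range threshold and morphological filtering are illustrative assumptions, not the deployed pipeline.

    import cv2
    import numpy as np

    def obstacle_mask(depth_m: np.ndarray, max_range: float = 1.5) -> np.ndarray:
        # depth_m: HxW float32 depth image in meters, 0 where the sensor has no return.
        near = (depth_m > 0) & (depth_m < max_range)
        mask = near.astype(np.uint8) * 255
        # Morphological opening removes isolated depth speckles before the mask
        # is projected into the costmap.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)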

  • ROS Development: Built custom ROS packages for robot control and navigation with Git-based workflows.
  • SLAM & Localization: Improved localization accuracy through EKF-based state estimation in Gazebo simulations.
  • Research Collaboration: Communicated findings via presentations and technical reports.

Selected technologies: ROS, Gazebo, Git/GitHub

  • Model Development: Engineered an MRI tumor detection model with ~95% accuracy and optimized inference using OpenVINO (see the sketch below).
  • Privacy & Performance: Evaluated federated learning strategies for privacy-preserving training.

Selected technologies: Python, TensorFlow, OpenVINO
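
For context on the OpenVINO step, here is a hedged sketch of loading a converted IR model and running one inference; the model file, input layout, and preprocessing are placeholders rather than the actual MRI pipeline.

    import numpy as np
    from openvino.runtime import Core

    core = Core()
    model = core.read_model("tumor_classifier.xml")        # hypothetical IR model file
    compiled = core.compile_model(model, device_name="CPU")

    mri_slice = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for a preprocessed slice
    result = compiled.create_infer_request().infer({0: mri_slice})
    probabilities = result[compiled.output(0)]
    print("predicted class:", int(probabilities[0].argmax()))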

  • Data Analysis: Applied K-means clustering to analyze healthcare data and identify engagement patterns (see the sketch below).
  • Model Validation: Validated ML models and presented results at the FAI Summit 2022.

Selected technologies: Python, scikit-learn
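
A minimal sketch of the clustering step described above; the feature matrix and cluster count are stand-ins, not the study's data.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    features = np.random.rand(500, 4)                    # stand-in for per-patient engagement features
    scaled = StandardScaler().fit_transform(features)    # K-means is sensitive to feature scale
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaled)
    print(np.bincount(kmeans.labels_))                   # cohort size per cluster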

  • Resolved 100+ technical support tickets per week and managed directory data for 200+ staff.
  • Provided academic support to 200+ students in algebra through calculus.

Projects

Autonomous Hospital Transport Robot Jan 2024 – Dec 2024

  • Autonomous mobile robot for hospital goods transport (UF senior design).
  • Tools: ROS 2, Python, Gazebo, SolidWorks, SLAM, RViz, LiDAR, Camera, Jetson Orin Nano.

Mood Music Sep 2023 – Nov 2023

  • Web app that recommends a Spotify playlist based on detected or selected mood.
  • Repository: github.com/dannirash/Mood-Music
  • Tools: OpenCV, Flask, Python, Spotify API, HTML/CSS, JavaScript.

Education

University of Florida, Gainesville, FL Dec 2024

  • Computer Engineering B.S.

Santa Fe College, Gainesville, FL May 2020

  • Computer Engineering A.A.

Leadership & Involvement

Council Member — International Center Council, University of Florida Jan 2023 – Dec 2024

  • Co‑founded the council with Dean Marta Wayne to advocate for 1,000+ international students.
  • Provided strategic input for programs supporting international students at UF.

Marketing Director — Village Mentors UF Chapter, University of Florida Aug 2020 – Jun 2021

  • Delivered mentoring and tutoring to under-resourced students and coordinated outreach.

Certifications & Training

Dassault Systèmes Associate Mechanical Design Aug 2020

NVIDIA GTC25 Physical AI Workshop Nov 2025

  • Hands-on workshop on building and deploying AI-powered autonomous systems using NVIDIA Metropolis and Isaac Sim.

CRLA Certified Math Tutor Aug 2019

  • Certified by the College Reading and Learning Association.

Photography

A curated collection of sports, landscapes, and light painting. Photography is where I recharge my visual curiosity outside of robotics work.

See more on Instagram.

Game Hub

Side projects built for fun, experimentation, and interaction design. These game prototypes sharpen my real-time systems intuition outside of robotics work.

Gator Buster Jan 2021 – Apr 2021

Description: Arcade game where players bust maskless gators for points.

Repository: Gator Buster

Controls: Use SPACE or TOUCH.

Ping Pong Apr 2025

Description: Classic ping pong game built with p5.js featuring power-ups, AI, and timed mode.

Repository: Ping Pong Game

Controls: Q/A (Left), P/L (Right), T (Timed mode), ESC (Pause)

Contact

Let's collaborate

Interested in autonomy, perception, or simulation-first validation? I'm open to discussing full-time roles, collaborations, and research opportunities.

Email Me Resume

Profiles

Connect or review recent work.