
Applications open until April 30th!

Perception & Navigation Engineer

1. Humanoid

Develop the robot's sensing pipeline and computer vision algorithms for planning and spatial navigation!

Responsibilities & Tasks

  • Develop and implement computer vision algorithms for object detection/avoidance, environmental interaction and mapping, and spatial awareness using RGB, depth, and stereo cameras, as well as LiDAR.

  • Integrate sensor data from various sources (e.g., cameras, LiDAR, IMUs) to build robust sensor fusion pipelines and algorithms for state estimation, real-time navigation, and perception.

  • Design and implement computer vision pipelines and planning algorithms for obstacle avoidance, path planning, and spatial navigation in complex environments.

  • Collaborate with the Simulation, Control, and Real-Time Systems teams to ensure sensor integration, smooth locomotion, and real-time sensing and navigation in various environments.

  • Conduct simulations and real-world testing to validate and refine the robot’s perception, planning, and navigation performance.

  • Document algorithm development, testing results, and performance improvements for knowledge sharing.

  • General Responsibilities: Attend weekly/bi-weekly departmental & organizational RoboTUM meetings, as well as occasional spontaneous meetings as needed. Answer emails and messages within 24 hours (be comfortable using Slack). Adhere to RoboTUM policies and be willing to help when the team needs you.


Position Benefits

  • Contribute to the development of advanced computer vision, sensing algorithms, and spatial navigation for the humanoid robot, enabling autonomous perception and decision-making.

  • Gain hands-on experience in robot perception, 3D spatial planning, machine learning, and navigation algorithms.

  • Work on a high-performance humanoid with cutting-edge software and hardware inspired by biomechanics.

  • Collaborate with power electronics, body mechatronics, arm mechatronics, and other software teams to develop a fully integrated robotic system.

  • Access to state-of-the-art simulation software such as Isaac Sim, training environments like Isaac Lab, and prototyping facilities at TUM.

  • Expand your professional network through research collaborations, competitions, and industry connections.

  • Potential for thesis projects or specialized research into robotic simulation and robot training with and without synthetic data.

  • Close-knit community of dedicated people.


Minimum Qualifications

  • Background in robot perception, computer vision, or spatial navigation algorithms.

  • Experience with an object detection framework such as YOLOv10, simulation tools (e.g., Isaac Sim, Gazebo, MuJoCo), and Python/C++.

  • Understanding of sensor fusion, 3D environment mapping, and robot navigation algorithms.

  • Understanding of machine learning techniques for computer vision (e.g., CNNs, deep learning).

  • Ability to collaborate across interdisciplinary teams, combining software with hardware and sensor systems, and iterate based on real-world testing.

  • Strong problem-solving skills, analytical thinking, and enthusiasm for tackling complex challenges arising from software and hardware.


Preferred Qualifications

  • Background in vision-based motion planning, robot perception, or spatial navigation for robotic systems.

  • Experience with robotic systems, particularly vision-based localization, motion planning, and autonomous navigation.

  • Experience with Isaac Sim, navigation algorithms, and machine learning frameworks for computer vision (e.g., TensorFlow, PyTorch) or YOLOvX.

  • Experience in developing 3D spatial maps and SLAM (Simultaneous Localization and Mapping).

  • Familiarity with reinforcement learning or other adaptive techniques for autonomous decision-making and navigation.

  • Good understanding of sensor fusion, machine learning algorithms, and real-time control systems, and of integrating control with the robot's perception and planning algorithms.


  • Project experience in interdisciplinary teams combining hardware and software.


Estimated Time Commitment

  • 10-12 hours/week at the beginning of term, less towards exam season
