Perception Engineer

Full-Time · £36,000 – £60,000 / year (est.) · Home office (partial)

At a Glance

  • Tasks: Design and deploy perception pipelines for robots to interact with the real world.
  • Company: Kinisi Robotics is pioneering next-gen robotic manipulation systems in Bristol.
  • Benefits: Enjoy competitive salary, equity, flexible hours, and comprehensive health coverage.
  • Why this job: Join a cutting-edge team blending research and product development in robotics.
  • Qualifications: PhD or outstanding Master's in Robotics/Computer Vision; 3+ years experience required.
  • Other info: Hybrid work options available for top talent; state-of-the-art lab space.

The predicted salary is between £36,000 and £60,000 per year.

We’re building the next generation of robot manipulation systems for unstructured, real-world environments. As a Perception Engineer, you will design, prototype, and deploy high-performance perception pipelines that enable our robots to see, understand, and dexterously interact with the world in real time. You’ll sit at the intersection of academia and product, translating state-of-the-art research into production-ready software running on ROS 2-based platforms.

What You’ll Do

  • Own the real-time perception stack for robotic manipulation tasks, from sensor acquisition through to fused 3-D scene understanding and grasp/placement proposals.
  • Research, prototype, and benchmark novel algorithms in 2-D/3-D vision, multi-modal fusion, and dense correspondence that push manipulation speed and reliability.
  • Implement, optimize, and profile deep-learning models in PyTorch and C++ (CUDA) to meet strict latency budgets on embedded GPUs/accelerators.
  • Integrate perception modules in ROS 2, ensuring clean interfaces, deterministic scheduling, and robust failure handling (a minimal node sketch follows this list).
  • Conduct rigorous real-world and simulated experiments, and communicate results through clear technical reports and publications (internal and external).
  • Collaborate cross-functionally with controls, planning, and hardware teams to close perception–action loops and ship production-quality releases.
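
To give a concrete flavour of the ROS 2 integration work above, here is a minimal node sketch: a point-cloud subscriber feeding a grasp-proposal publisher. The topic names, message types, and QoS settings are illustrative assumptions, not Kinisi's actual interfaces, and a production node would typically be a managed lifecycle node with a real pipeline in the callback.

    # A minimal sketch, assuming hypothetical topic names ('/camera/points',
    # '/grasp_proposal'); lifecycle management and the actual perception
    # pipeline are omitted for brevity.
    import rclpy
    from rclpy.node import Node
    from rclpy.qos import QoSProfile, ReliabilityPolicy
    from sensor_msgs.msg import PointCloud2
    from geometry_msgs.msg import PoseStamped

    class GraspProposer(Node):
        def __init__(self):
            super().__init__('grasp_proposer')
            # depth=1 + best-effort: drop stale frames rather than queue them,
            # keeping end-to-end latency bounded.
            qos = QoSProfile(depth=1, reliability=ReliabilityPolicy.BEST_EFFORT)
            self.sub = self.create_subscription(
                PointCloud2, '/camera/points', self.on_cloud, qos)
            self.pub = self.create_publisher(PoseStamped, '/grasp_proposal', 10)

        def on_cloud(self, cloud: PointCloud2) -> None:
            # Stand-in for the real pipeline: fuse sensors, build a 3-D scene
            # representation, score grasp candidates.
            pose = PoseStamped()
            pose.header = cloud.header  # preserve the sensor timestamp
            self.pub.publish(pose)

    def main():
        rclpy.init()
        rclpy.spin(GraspProposer())
        rclpy.shutdown()

    if __name__ == '__main__':
        main()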

Minimum Qualifications

  • PhD (or outstanding Master’s + equivalent publications) in Robotics, Computer Vision, Machine Learning, or a closely related field.
  • 3+ years hands-on experience building real-time perception systems for robot manipulation or autonomous platforms.
  • Advanced proficiency in Python and modern C++17/20; proven track record writing clean, testable, high-performance code.
  • Deep expertise with PyTorch (training & inference) and GPU programming (CUDA, TensorRT, or similar); see the profiling sketch after this list.
  • Production experience with ROS 2 (rclcpp/rclpy, lifecycle nodes, DDS tuning, real-time QoS).
  • Strong publication record in top-tier venues (e.g., RSS, ICRA, CoRL, CVPR, RAL).
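
As a rough illustration of what meeting strict latency budgets entails in practice, the sketch below times a stand-in PyTorch backbone with CUDA events. The model, input shape, and iteration counts are placeholders, not part of Kinisi's stack; real budgets would be measured on the embedded target itself.

    # A hedged sketch of GPU latency profiling with CUDA events.
    import torch

    model = torch.nn.Sequential(          # stand-in for a perception backbone
        torch.nn.Conv2d(3, 32, 3, padding=1), torch.nn.ReLU(),
        torch.nn.Conv2d(32, 64, 3, padding=1),
    ).cuda().eval().half()
    x = torch.randn(1, 3, 480, 640, device='cuda', dtype=torch.half)

    with torch.no_grad():
        for _ in range(10):               # warm-up: autotuning + allocation
            model(x)

    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    with torch.no_grad():
        start.record()
        for _ in range(100):
            model(x)
        end.record()
    torch.cuda.synchronize()              # kernel launches are asynchronous
    print(f'mean latency: {start.elapsed_time(end) / 100:.2f} ms')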

Preferred Qualifications

  • Track record shipping perception on manipulation platforms (e.g., mobile manipulators, bin-picking arms, industrial cobots).
  • Familiarity with multi-sensor calibration, tactile or force perception, depth cameras (D-ToF, active stereo), and point-cloud processing (PCL, Open3D); a short Open3D example follows this list.
  • Experience deploying on-device inference for NVIDIA Jetson/Orin, Intel Arc, or similar edge accelerators.
  • Contributions to open-source robotics or vision libraries.
  • Comfortable working in an agile, research-driven environment with fast iteration cycles.
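
For the point-cloud processing named above, a typical first step in bin-picking scenes is voxel downsampling followed by support-plane removal. The Open3D snippet below is a sketch under that assumption; the input file is a placeholder.

    # Illustrative Open3D preprocessing: downsample, then RANSAC plane removal,
    # a common first step before grasp candidate generation.
    # 'scene.pcd' is a placeholder input.
    import open3d as o3d

    cloud = o3d.io.read_point_cloud('scene.pcd')
    cloud = cloud.voxel_down_sample(voxel_size=0.005)   # 5 mm grid

    # The RANSAC inliers are usually the table or bin floor.
    plane_model, inliers = cloud.segment_plane(
        distance_threshold=0.01, ransac_n=3, num_iterations=1000)
    objects = cloud.select_by_index(inliers, invert=True)  # keep what's on top
    print(f'{len(objects.points)} points remain after plane removal')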

What We Offer

  • Competitive salary, equity, and performance bonus.
  • Comprehensive health, dental, and vision coverage.
  • Annual conference budget & dedicated research time.
  • Flexible hours and hybrid/remote-friendly culture.
  • State-of-the-art lab space with collaborative, cross-disciplinary teams.

Contact Details:

Kinisi Robotics Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Perception Engineer role

✨Tip Number 1

Familiarise yourself with the latest advancements in perception systems and robotics. Follow relevant research papers and publications, especially those from top-tier venues like RSS and CVPR, to stay updated on cutting-edge techniques that you can discuss during interviews.

✨Tip Number 2

Engage with the robotics community by attending conferences or meetups. Networking with professionals in the field can provide insights into the company culture at Kinisi Robotics and may even lead to referrals, which can significantly boost your chances of landing the job.

✨Tip Number 3

Showcase your hands-on experience with real-time perception systems by working on personal projects or contributing to open-source initiatives. This practical experience will not only enhance your skills but also serve as concrete examples to discuss during interviews.

✨Tip Number 4

Prepare to demonstrate your coding skills in Python and C++ during technical interviews. Brush up on writing clean, testable code and be ready to solve problems related to ROS 2 integration, as this will be crucial for the role of Perception Engineer.

We think you need these skills to ace your Perception Engineer application

Real-Time Perception Systems
2-D/3-D Vision Algorithms
Multi-Modal Fusion
Dense Correspondence Techniques
Deep Learning Model Implementation
PyTorch
C++ (CUDA)
Embedded GPU Programming
ROS 2 Integration
Sensor Acquisition
3-D Scene Understanding
Grasp and Placement Proposals
Technical Report Writing
Cross-Functional Collaboration
Agile Development Practices
Publication in Top-Tier Venues

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights relevant experience in robotics, computer vision, and machine learning. Emphasise your hands-on experience with real-time perception systems and any specific projects that align with the job description.

Craft a Strong Cover Letter: In your cover letter, express your passion for robotics and how your background makes you a perfect fit for the Perception Engineer role. Mention specific skills like proficiency in Python, C++, and experience with ROS 2 that are crucial for the position.

Showcase Your Publications: If you have a strong publication record, include a section in your application that lists your most relevant papers. Highlight those published in top-tier venues, as this will demonstrate your expertise and commitment to the field.

Prepare for Technical Questions: Anticipate technical questions related to perception algorithms, deep learning models, and ROS 2 integration. Be ready to discuss your previous projects and how they relate to the responsibilities of the role.

How to prepare for a job interview at Kinisi Robotics

✨Showcase Your Technical Skills

Be prepared to discuss your experience with real-time perception systems, particularly in robotics. Highlight specific projects where you've implemented algorithms in 2-D/3-D vision or worked with ROS 2, as this will demonstrate your hands-on expertise.

✨Prepare for Problem-Solving Questions

Expect questions that assess your ability to tackle complex problems. Think about challenges you've faced in previous roles and how you overcame them, especially in relation to deep learning models and performance optimisation.

✨Demonstrate Your Research Acumen

Since the role involves translating research into practical applications, be ready to discuss your publications and how they relate to the job. This shows your ability to bridge the gap between academia and industry.

✨Emphasise Collaboration Skills

The position requires working closely with cross-functional teams. Share examples of successful collaborations from your past experiences, focusing on how you contributed to closing perception-action loops or shipping production-quality releases.
