At a Glance
- Tasks: Design and maintain large-scale data pipelines for cutting-edge robotics AI.
- Company: Join a pioneering tech firm focused on Physical AI innovation.
- Benefits: Competitive salary, flexible work options, and opportunities for professional growth.
- Other info: Dynamic work environment with a focus on collaboration and innovation.
- Why this job: Be part of a team shaping the future of robotics with impactful data solutions.
- Qualifications: Strong software engineering skills and extensive experience in data pipeline development.
The predicted salary is between £48,000 and £72,000 per year.
What You’ll Do
- Design, build, and maintain large-scale data pipelines (batch and streaming) for robotics foundation model training and evaluation at petabyte scale.
- Own core data infrastructure: data model, storage systems, ingestion pipelines, transformation frameworks, and orchestration layers.
- Standardize data models and unify processing pipelines across real-world teleoperation and synthetic simulation datasets.
- Collaborate with a team of driven individuals committed to building general-purpose Physical AI.
What You’ll Bring
- Excellent software engineering skills (Python, Go, or similar)
- Extensive experience designing, building, and maintaining large-scale data pipelines (8+ years)
- Deep understanding of distributed systems (Spark, Kafka, or similar)
- Extensive experience with data storage technologies (data lakes, warehouses, object stores like S3)
- Experience running and maintaining production-grade infrastructure (Kubernetes, Terraform)
- Bonus: Experience supporting AI systems, particularly embodied AI such as self-driving vehicles
Role: Member of Technical Staff, Data (Paris, London)
Employer: Genesis AI
Contact: Genesis AI Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Member of Technical Staff, Data (Paris, London) role.
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at meetups. A personal connection can often get your foot in the door faster than any application.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipelines and projects. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge. Be ready to discuss your experience with distributed systems and data storage technologies, as these are key for the role.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive!
Some tips for your application 🫡
Show Off Your Skills: Make sure to highlight your software engineering skills, especially in Python or Go. We want to see how your experience aligns with designing and maintaining large-scale data pipelines, so don’t hold back!
Tailor Your Application: Take a moment to customise your application for this role. Mention specific projects where you've worked with distributed systems like Spark or Kafka, and how you’ve tackled challenges in data storage technologies.
Be Clear and Concise: When writing your application, keep it clear and to the point. We appreciate straightforward communication, so avoid jargon unless it’s necessary to showcase your expertise.
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can’t wait to see what you bring to the table!
How to prepare for a job interview at Genesis AI
✨Know Your Data Pipelines
Make sure you can talk confidently about your experience with large-scale data pipelines. Be ready to discuss specific projects where you've designed, built, or maintained these systems, especially in relation to batch and streaming processes.
✨Showcase Your Technical Skills
Brush up on your software engineering skills, particularly in Python or Go. Prepare to demonstrate your understanding of distributed systems like Spark or Kafka, and be ready to explain how you've used these technologies in past roles.
✨Understand the Infrastructure
Familiarise yourself with the core data infrastructure mentioned in the job description. Be prepared to discuss your experience with data storage technologies, orchestration layers, and tools like Kubernetes and Terraform.
✨Collaborate and Communicate
Since collaboration is key in this role, think of examples where you've worked effectively within a team. Highlight your ability to communicate complex technical concepts clearly, especially when working on AI systems or robotics projects.