At a Glance
- Tasks: Design and maintain secure cloud infrastructure while optimising data workflows.
- Company: Fast-growing biotech company revolutionising immune therapies.
- Benefits: Competitive salary, health benefits, and opportunities for professional growth.
- Other info: Collaborate with scientific teams in a dynamic, innovative environment.
- Why this job: Join a mission-driven team making a real impact in disease treatment.
- Qualifications: Experience in DevOps, AWS, and data pipeline design required.
The predicted salary is between £50,000 and £70,000 per year.
Help shape the future of immune therapies. A fast-growing biotech company is on a mission to tackle immune dysfunction — a root cause of many cancers, autoimmune conditions, and infectious diseases. By developing first-in-class antigen modulation technologies, the team is pioneering a new approach to controlling how the immune system recognises and targets cells.
As the company enters its next phase of growth, we’re looking for a Data & DevOps Engineer to join the IT function and play a key role in building scalable, secure, and reproducible data systems that support cutting-edge science.
This position sits at the intersection of IT, data, and scientific teams (including Bioinformatics). You’ll lead on designing and delivering modern data infrastructure and DevOps practices that enable high-quality, reproducible research and analytics.
- Design and maintain secure, scalable cloud infrastructure (AWS)
- Build and optimise CI/CD pipelines for scientific workflows
- Architect data pipelines and platforms for structured and unstructured data (e.g. Power BI, Spotfire, Microsoft Fabric)
- Enhance data governance, lineage, and auditability
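To make the responsibilities above concrete, here is a minimal, hedged sketch of the kind of auditable pipeline step the role describes: an extract/transform/load pass that records lineage metadata (run id, row count, checksum) alongside its output. All function and field names here are hypothetical illustrations, not the company's actual systems.

```python
import hashlib
import json
from datetime import datetime, timezone

def extract(records):
    """Extract step: a stub that would normally read from a source system."""
    return records

def transform(rows):
    """Transform step: normalise types and drop incomplete rows."""
    return [
        {"sample_id": r["sample_id"], "value": float(r["value"])}
        for r in rows
        if "sample_id" in r and "value" in r
    ]

def load_with_lineage(rows, run_id):
    """Load step: attach a lineage record so every output is auditable."""
    payload = json.dumps(rows, sort_keys=True)
    lineage = {
        "run_id": run_id,
        "row_count": len(rows),
        "checksum": hashlib.sha256(payload.encode()).hexdigest(),
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    }
    return rows, lineage

raw = [{"sample_id": "S1", "value": "3.2"}, {"value": "1.0"}]
rows, lineage = load_with_lineage(transform(extract(raw)), run_id="demo-001")
```

The checksum and row count give reviewers a cheap way to verify that a rerun reproduced the same output, which is the core of the reproducibility requirement mentioned throughout the posting.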
- Experience in DevOps, Platform Engineering, or similar within biotech, pharma, or life sciences
- Strong expertise in AWS, Docker, and CloudFormation
- Hands-on experience with Linux systems and CI/CD pipelines
- Experience designing data pipelines and ETL processes
- Understanding of data modelling and master data management
- Ability to collaborate effectively with scientific stakeholders
- Experience in regulated or research-driven environments (data governance, reproducibility)
- Exposure to Microsoft Fabric, Azure data services, or modern BI tools
- Experience with AI services
- DevOps/security certifications
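The qualifications above mention data modelling and master data management. As a hedged illustration of what that means in practice (all record fields are hypothetical), duplicate records describing the same entity can be merged into a single "golden" record, with the most recently updated fields winning:

```python
def golden_records(records, key="sample_id"):
    """Merge duplicate records sharing a business key into golden records."""
    merged = {}
    # Process oldest first so later updates overwrite earlier values.
    for rec in sorted(records, key=lambda r: r["updated"]):
        k = rec[key].strip().upper()  # normalise the business key
        merged.setdefault(k, {}).update(rec)
    return list(merged.values())

dupes = [
    {"sample_id": "s1", "assay": "ELISA", "updated": 1},
    {"sample_id": "S1", "batch": "B7", "updated": 2},
]
golden = golden_records(dupes)
```

Normalising the key before merging is the essential move: without it, "s1" and "S1" would survive as two separate records and downstream analytics would double-count the sample.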
Be part of a company driving a fundamental shift in disease treatment. Work at the cutting edge of science, data, and engineering.
Employer: Practicus
Contact Detail:
Practicus Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the DevOps Engineer with Python role
✨Tip Number 1
Network like a pro! Reach out to people in the biotech and DevOps space on LinkedIn. Join relevant groups, attend meetups, and don’t be shy about asking for informational interviews. You never know who might have the inside scoop on job openings!
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those related to AWS, Docker, and CI/CD pipelines. This gives potential employers a tangible look at what you can do and sets you apart from the crowd.
✨Tip Number 3
Tailor your approach! When reaching out to companies, mention specific projects or technologies they’re working on that excite you. This shows you’ve done your homework and are genuinely interested in contributing to their mission.
✨Tip Number 4
Don’t forget to apply through our website! We love seeing candidates who are proactive and engaged. Plus, it’s a great way to ensure your application gets into the right hands quickly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the DevOps Engineer role. Highlight your expertise in AWS, Docker, and CI/CD pipelines, as these are key for us in this position.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about biotech and how your background can contribute to our mission. Be specific about your experience with data systems and collaboration with scientific teams.
Showcase Relevant Projects: If you've worked on projects involving data pipelines or cloud infrastructure, make sure to include them. We love seeing real-world applications of your skills, especially in regulated environments.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Practicus
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially AWS, Docker, and CI/CD pipelines. Brush up on your knowledge of data governance and ETL processes, as these will likely come up during technical discussions.
✨Showcase Your Collaboration Skills
Since this role involves working closely with scientific teams, be prepared to discuss how you've successfully collaborated in the past. Share specific examples of how you’ve communicated complex technical concepts to non-technical stakeholders.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving skills in real-world scenarios. Think about challenges you’ve faced in previous roles, particularly in regulated environments, and how you overcame them. This will demonstrate your ability to handle the responsibilities of the position.
✨Ask Insightful Questions
At the end of the interview, have a few thoughtful questions ready about the company’s approach to data governance or their future projects. This shows your genuine interest in the role and helps you gauge if the company aligns with your career goals.