At a Glance
- Tasks: Lead the design and implementation of data engineering frameworks and build data pipelines.
- Company: Join a dynamic London-based pharmaceutical manufacturing organisation.
- Benefits: Competitive day rate, flexible work with 1 day onsite per week.
- Other info: Initial 12-month contract with opportunities for career growth in a collaborative environment.
- Why this job: Make a real impact by stabilising and improving data architecture in a key role.
- Qualifications: Strong experience as a Data Engineer with skills in Snowflake, SQL, and Python.
The predicted salary is between €60,000 and €80,000 per year.
Robert Half have partnered with a London-based pharmaceutical manufacturing organisation looking to engage a Senior Data Engineer to play a key role in scaling and maturing their data function. On an initial 12-month contract, you will stabilise and improve a currently fragmented data architecture while introducing best-practice engineering and delivery standards. The role requires 1 day per week onsite in London.
Responsibilities:
- Lead the design and implementation of a clear, structured data engineering framework
- Build, enhance, and maintain data pipelines within Snowflake
- Develop transformations and data models using DBT
- Orchestrate workflows using Apache Airflow
- Write clean, modular Python code for data engineering solutions
- Improve pipeline reliability, performance, and observability
- Introduce and embed CI/CD, testing, and deployment best practices (Azure DevOps)
- Define and implement clear data domains and schemas
- Address ongoing ETL failures and long-running jobs
- Work closely with technical and non-technical stakeholders in an Agile delivery environment
- Contribute to roadmap definition and future platform evolution (including potential Databricks adoption)
Skills:
- Strong hands-on experience as a Data Engineer at Senior or Lead level
- Extensive experience with Snowflake as a core data platform
- Advanced SQL with strong business logic and modelling capability
- Experience with DBT, Airflow, and Python
- Experience building modular, scalable solutions within Snowflake
- CI/CD pipeline experience, ideally using Azure DevOps
- Exposure to Docker and modern software engineering practices
- Strong understanding of software delivery lifecycles and engineering best practices
- Background in software engineering (rather than purely analytics-focused roles)
- Experience working in Agile / Scrum delivery environments
- Databricks experience or exposure desirable
Contract:
- Initial 12-month contract
- London-based, 1 day per week onsite
- Competitive day rate
Senior Data Engineer employer: Robert Half
Join a leading London-based pharmaceutical manufacturing organisation that values innovation and excellence in data engineering. With a strong commitment to employee growth, you will have the opportunity to work in a collaborative Agile environment, enhancing your skills while contributing to impactful projects. Enjoy a competitive day rate and the flexibility of working one day a week onsite, all within a culture that prioritises diversity, equity, and inclusion.
StudySmarter Expert Advice🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Snowflake or in similar roles. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best data engineering projects, especially those involving DBT and Apache Airflow. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge. Be ready to discuss your experience with CI/CD practices and how you've tackled ETL failures in the past. Confidence in your expertise can make all the difference!
✨Tip Number 4
Don't forget to apply through our website! We’ve got loads of opportunities waiting for talented folks like you. Plus, it’s a great way to ensure your application gets the attention it deserves.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Snowflake, DBT, and Python, and don’t forget to showcase your hands-on experience in data engineering. We want to see how you fit into our vision!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your skills align with our needs. We love seeing enthusiasm and a clear understanding of the role.
Showcase Your Projects: If you've worked on relevant projects, make sure to mention them! Whether it's building data pipelines or improving ETL processes, we want to know what you've done and how it relates to the job. Real-world examples can make a big difference.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you get all the updates directly from us. Plus, it shows you're keen on joining our team!
How to prepare for a job interview at Robert Half
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Snowflake, DBT, and Apache Airflow. Brush up on your Python skills too, as you'll need to demonstrate your ability to write clean, modular code.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific challenges you've faced in previous roles, particularly around ETL failures or data pipeline issues. Be ready to explain how you approached these problems and what solutions you implemented.
✨Understand Agile Methodologies
Since the role involves working in an Agile environment, be prepared to talk about your experience with Agile/Scrum practices. Highlight any relevant projects where you collaborated with both technical and non-technical stakeholders.
✨Ask Insightful Questions
At the end of the interview, don’t shy away from asking questions. Inquire about the current data architecture challenges they face or how they envision the evolution of their data platform. This shows your genuine interest in the role and the company.