At a Glance
- Tasks: Design and maintain data pipelines on Azure, ensuring high-quality data delivery.
- Company: Join a leading healthcare/insurance client focused on data innovation.
- Benefits: Competitive pay, flexible working, and opportunities for professional growth.
- Why this job: Make a real impact in healthcare by enhancing data analytics capabilities.
- Qualifications: Experience with Azure Data Factory and strong SQL skills required.
- Other info: Fast-paced environment with a chance to work on modern data platforms.
The predicted salary is between £36,000 and £60,000 per year.
We are seeking an Interim Data Engineer on behalf of a healthcare/insurance client to support a growing Data & Analytics function during a period of increased delivery demand. This role will provide hands-on data engineering support as the team scales, helping meet immediate modelling and pipeline requirements while contributing to the evolution of a modern data platform. You will work closely with the Data Engineering team, reporting to senior leadership, and partner with analytics, BI, and business stakeholders to deliver reliable, high-quality data products across the organisation.
Responsibilities
- Design, build, and maintain robust data pipelines within an Azure-based data platform.
- Deliver data modelling and transformation work to support analytics, reporting, and operational use cases.
- Support increased demand from the business for new datasets, models, and enhancements.
- Work with SQL-based data warehouses and contribute to orchestration using Azure Data Factory.
- Ensure data quality, reliability, security, and observability across pipelines.
- Collaborate with BI, data science, and governance teams to enable trusted, well-defined datasets.
- Proactively identify opportunities to improve performance, scalability, documentation, and ways of working.
- Support the team during a period of transition, including cover for senior capacity where required.
Qualifications
- Strong hands-on experience as a Data Engineer in a cloud-based environment.
- Proven experience with Azure Data Factory and SQL-based data platforms (hard requirements).
- Advanced SQL skills and experience building analytical data models.
- Experience supporting BI teams with trusted, well-defined datasets.
- Familiarity with data quality, monitoring, lineage, and security best practices.
- Experience working in fast-paced, delivery-focused environments.
- Exposure to healthcare, customer-centric, or insurance organisations is beneficial but not essential.
- Experience with or interest in modern data platforms (e.g., Snowflake) is a plus.
Data Engineer - Azure Data Factory - Outside IR35 - Healthcare/Insurance in Kingswood
Employer: Korn Ferry
Contact details: Korn Ferry Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Azure Data Factory - Outside IR35 - Healthcare/Insurance role in Kingswood
✨Tip Number 1
Network like a pro! Reach out to your connections in the healthcare and insurance sectors. We all know that sometimes it’s not just what you know, but who you know. Attend industry meetups or webinars to make those valuable connections.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data engineering projects, especially those involving Azure Data Factory. We can’t stress enough how important it is to demonstrate your hands-on experience with real-world examples.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and Azure knowledge. We recommend practising common data engineering scenarios and being ready to discuss how you’ve tackled challenges in past roles. Confidence is key!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we’re always looking for passionate individuals who want to make an impact in the data space.
We think you need these skills to ace the Data Engineer - Azure Data Factory - Outside IR35 - Healthcare/Insurance role in Kingswood
- Azure Data Factory
- Advanced SQL
- Data Modelling
- Data Pipeline Design and Maintenance
- Data Quality and Observability
- Data Security Best Practices
- BI and Analytics Support
- Snowflake (desirable)
- Stakeholder Collaboration
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your hands-on experience with Azure Data Factory and SQL-based platforms, as these are key for us. Use specific examples that showcase your skills in building data pipelines and supporting analytics.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Tell us why you're passionate about data engineering and how your experience aligns with our needs. Mention any relevant projects or achievements that demonstrate your ability to deliver high-quality data products.
Showcase Your Problem-Solving Skills: In your application, don’t forget to highlight your problem-solving abilities. We love candidates who can proactively identify opportunities for improvement, so share examples of how you've enhanced performance or scalability in past roles.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you’re considered for the role. Plus, it’s super easy to do!
How to prepare for a job interview at Korn Ferry
✨Know Your Tech Inside Out
Make sure you brush up on your Azure Data Factory and SQL skills. Be ready to discuss specific projects where you've designed and built data pipelines, as well as any challenges you faced and how you overcame them.
✨Showcase Your Collaboration Skills
This role involves working closely with various teams, so be prepared to share examples of how you've successfully collaborated with BI, data science, or governance teams in the past. Highlight your ability to communicate complex data concepts clearly.
✨Demonstrate Problem-Solving Abilities
Think of instances where you've proactively identified opportunities for improvement in data quality or pipeline performance. Discuss how you approached these challenges and the impact your solutions had on the team or project.
✨Understand the Business Context
Familiarise yourself with the healthcare and insurance sectors, even if it's not a hard requirement. Showing that you understand the business context can set you apart and demonstrate your commitment to delivering high-quality data products.