At a Glance
- Tasks: Design and build scalable data pipelines using Microsoft Fabric.
- Company: Growing business focused on innovative data solutions.
- Benefits: Competitive salary, mainly remote work, and high ownership in projects.
- Why this job: Shape a greenfield data platform and make a real impact.
- Qualifications: Strong experience in Microsoft Fabric, Python, Spark/PySpark, and SQL.
- Other info: Minimal office requirement with excellent career growth opportunities.
The predicted salary is between £60,000 and £70,000 per year.
We’re working with a growing business investing heavily in a greenfield Microsoft Fabric data platform and looking for a Senior Data Engineer to help shape and deliver it. This is a hands-on engineering role with real ownership: not just maintaining pipelines, but building a modern data platform properly from the ground up.
What you’ll be doing:
- Designing and building scalable data pipelines in Microsoft Fabric (Lakehouse, Pipelines, Spark)
- Working across ingestion → transformation → serving layers
- Writing clean, production-grade Python / PySpark and SQL
- Improving performance, reliability, and monitoring across the platform
- Translating business requirements into solid technical solutions
- Setting standards around code quality, CI/CD, and documentation
What we’re looking for:
- Strong Microsoft Fabric experience (essential)
- Solid background in Python, Spark/PySpark, and SQL
- Experience building end-to-end data pipelines in Azure environments
- Engineering mindset: testing, version control, maintainability
- Comfortable working with stakeholders and owning delivery
Why this role:
- Greenfield Fabric implementation (not fixing legacy)
- High level of ownership and influence on architecture
- Strong investment in data and engineering capability
- Mainly remote with minimal office requirement
Employer: Fairmont Recruitment Technology
Contact: Fairmont Recruitment Technology Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Leeds
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who have experience with Microsoft Fabric. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Python, Spark, and SQL. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Be ready to discuss how you've built scalable data pipelines and tackled challenges in previous roles. Confidence is key!
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals like you. Plus, it’s a great way to ensure your application gets the attention it deserves.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Microsoft Fabric, Python, and building data pipelines. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific projects where you've designed and built scalable data pipelines. We love seeing real examples of your work, especially if they involve Azure environments or modern data platforms.
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points for your achievements and make it easy for us to see your qualifications at a glance. We appreciate straightforward communication!
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can’t wait to see what you bring to the table!
How to prepare for a job interview at Fairmont Recruitment Technology
✨Know Your Microsoft Fabric Inside Out
Make sure you brush up on your Microsoft Fabric knowledge before the interview. Understand its components like Lakehouse, Pipelines, and Spark, and be ready to discuss how you've used them in past projects. This will show that you're not just familiar with the tech but can also apply it effectively.
✨Showcase Your Python and SQL Skills
Prepare to demonstrate your coding skills in Python and SQL during the interview. You might be asked to solve a problem or explain your thought process while writing clean, production-grade code. Practising common data engineering tasks can help you feel more confident.
✨Talk About Your Engineering Mindset
Be ready to discuss your approach to testing, version control, and maintainability. Share examples of how you've implemented CI/CD practices in your previous roles. This will highlight your commitment to quality and reliability in data engineering.
✨Engage with Stakeholders
Since this role involves working closely with stakeholders, prepare to talk about your experience in translating business requirements into technical solutions. Think of specific instances where you successfully communicated complex ideas to non-technical team members.