At a Glance
- Tasks: Design and build scalable data pipelines using Microsoft Fabric and Python.
- Company: Join a growing business focused on innovative data solutions.
- Benefits: Enjoy remote work flexibility and a high level of ownership.
- Other info: Minimal office requirement with excellent career growth opportunities.
- Why this job: Shape a greenfield data platform and make a real impact.
- Qualifications: Strong experience in Microsoft Fabric, Python, and Azure environments.
The predicted salary is between £60,000 and £80,000 per year.
We’re working with a growing business investing heavily in a greenfield Microsoft Fabric data platform, and looking for a Senior Data Engineer to help shape and deliver it. This is a hands-on engineering role with real ownership: not just maintaining pipelines, but building a modern data platform properly from the ground up.
What you’ll be doing:
- Designing and building scalable data pipelines in Microsoft Fabric (Lakehouse, Pipelines, Spark)
- Writing clean, production-grade Python / PySpark and SQL
- Improving performance, reliability, and monitoring across the platform
- Translating business requirements into solid technical solutions
- Setting standards around code quality, CI/CD, and documentation
What we’re looking for:
- Strong Microsoft Fabric experience (essential)
- Solid background in Python, Spark/PySpark, and SQL
- Experience building end-to-end data pipelines in Azure environments
- Engineering mindset — testing, version control, maintainability
- Comfortable working with stakeholders and owning delivery
Why this role:
- Greenfield Fabric implementation (not fixing legacy)
- High level of ownership and influence on architecture
- Strong investment in data and engineering capability
- Mainly remote with minimal office requirement
Employer: Fairmont Recruitment Technology
Contact: Fairmont Recruitment Technology Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer — Microsoft Fabric Platform (Remote) role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, especially those already working with Microsoft Fabric. A friendly chat can lead to insider info about job openings and even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Python, Spark, and data pipelines. This gives potential employers a taste of what you can do beyond just a CV.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge. Be ready to discuss your experience with Microsoft Fabric and how you've tackled challenges in previous roles. Confidence is key!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.
We think you need these skills to ace the Senior Data Engineer — Microsoft Fabric Platform (Remote) role
Some tips for your application 🫡
Show Off Your Skills: Make sure to highlight your experience with Microsoft Fabric, Python, and SQL in your application. We want to see how you've built data pipelines and tackled challenges in your previous roles.
Be Clear and Concise: When writing your application, keep it straightforward. Use clear language to explain your past projects and how they relate to the role. We appreciate a well-structured application that gets straight to the point!
Tailor Your Application: Don’t just send a generic application! Tailor it to reflect the specific requirements of the Senior Data Engineer role. Mention how your engineering mindset aligns with our goals for building a modern data platform.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity!
How to prepare for a job interview at Fairmont Recruitment Technology
✨Know Your Microsoft Fabric Inside Out
Make sure you brush up on your Microsoft Fabric knowledge. Understand its components like Lakehouse, Pipelines, and Spark. Be ready to discuss how you've used these tools in past projects and how they can be applied to build scalable data pipelines.
✨Showcase Your Coding Skills
Prepare to demonstrate your Python, PySpark, and SQL skills. You might be asked to solve a coding challenge or explain your thought process behind writing clean, production-grade code. Practise common coding problems and be ready to discuss your approach to testing and version control.
✨Translate Business Needs into Technical Solutions
Think about how you can bridge the gap between technical and non-technical stakeholders. Prepare examples of how you've successfully translated business requirements into solid technical solutions in previous roles. This will show your engineering mindset and ability to own delivery.
✨Emphasise Your Ownership and Influence
This role is all about ownership and influence on architecture. Be prepared to discuss instances where you've taken charge of a project or initiative. Highlight how your contributions have improved performance, reliability, or monitoring in past data platforms.