At a Glance
- Tasks: Support the transition to a modern data platform and build scalable data pipelines.
- Company: Join a forward-thinking organisation in the East Midlands with a focus on innovation.
- Benefits: Permanent role with competitive salary and flexible working arrangements.
- Why this job: Make a real impact by shaping the future of data management and analytics.
- Qualifications: Experience in Data Engineering and hands-on skills with Microsoft Fabric and SQL.
- Other info: Collaborative environment with opportunities for professional growth and development.
The predicted salary is between £36,000 and £60,000 per year.
James Adams is currently looking for an experienced Data Engineer. This is a permanent role based in the East Midlands, with 2 days per week onsite.
The Opportunity
An experienced Data Engineer is required to support the transition from SAP BW to a modern data platform built on Microsoft Fabric. The role will focus on delivering the migration, building scalable data pipelines, and maintaining a robust reporting and data framework that supports the organisation’s wider data strategy.
Key Responsibilities
- Support the implementation and development of a Microsoft Fabric data platform.
- Design and maintain data ingestion pipelines from multiple source systems.
- Build and manage ETL and ELT processes using tools such as Azure Data Factory and Fabric Data Pipelines.
- Work with stakeholders across the Finance, HR, Commercial and Technology teams to translate business requirements into technical solutions.
- Develop and maintain data models to support reporting and analytics.
- Maintain a trusted reporting layer used across the business.
- Ensure strong data governance, security and regulatory compliance.
- Document data architecture, pipelines and processes.
- Support colleagues in developing data engineering and coding capability.
Skills and Experience
- Proven experience in Data Engineering within enterprise environments.
- Hands-on experience with Microsoft Fabric, including OneLake, Lakehouse, Delta, Data Pipelines, Dataflows and semantic models.
- Strong SQL skills including optimisation and performance tuning.
- Experience using Python or PySpark for data transformations.
- Experience building ETL or ELT pipelines using Fabric Data Pipelines or Azure Data Factory.
- Strong understanding of data modelling and modern Lakehouse architectures.
- Experience with data governance and security, including RBAC and regulated data handling.
- Experience using Git and working in CI/CD enabled data environments.
- Strong communication skills with the ability to work with both technical and non-technical stakeholders.
Data Engineer - Fabric employer: James Adams
Contact Details: James Adams Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer - Fabric
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who have experience with Microsoft Fabric. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving ETL processes or data pipelines. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and Python skills. Be ready to discuss how you've tackled data challenges in the past, particularly in enterprise environments. Practice makes perfect!
✨Tip Number 4
Don’t forget to apply through our website! We’ve got loads of opportunities waiting for talented Data Engineers like you. Plus, it’s a great way to ensure your application gets noticed directly by our hiring team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Microsoft Fabric, ETL processes, and any relevant projects you've worked on. We want to see how your skills match what we're looking for!
Showcase Your Skills: Don’t just list your skills; demonstrate them! Use specific examples of how you've used SQL, Python, or Azure Data Factory in your previous roles. This helps us understand your hands-on experience and how you can contribute to our team.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Explain why you're excited about the opportunity at StudySmarter and how your background aligns with our needs. Keep it concise but engaging – we love a bit of personality!
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you get all the updates directly from us. Plus, it shows you're keen on joining the StudySmarter family!
How to prepare for a job interview at James Adams
✨Know Your Tech Inside Out
Make sure you brush up on your knowledge of Microsoft Fabric and its components, such as OneLake and the Lakehouse. Be ready to discuss how you've used these tools in past projects, especially when building ETL or ELT pipelines.
✨Showcase Your SQL Skills
Prepare to demonstrate your SQL prowess during the interview. Think of specific examples where you've optimised queries or improved performance, as this will show your hands-on experience and problem-solving abilities.
✨Communicate Clearly with Stakeholders
Since you'll be working with various teams, practice explaining technical concepts in simple terms. Prepare a few scenarios where you've successfully translated business requirements into technical solutions, highlighting your strong communication skills.
✨Emphasise Data Governance Knowledge
Be ready to discuss your understanding of data governance and security practices. Share any experience you have with regulatory compliance and how you've implemented RBAC in previous roles to protect data integrity.