At a Glance
- Tasks: Design and develop data pipelines using Microsoft Fabric and Python.
- Company: Leading independent engineering firm with a rich history and innovative projects.
- Benefits: Hybrid work, competitive salary, training, and clear career progression.
- Why this job: Join a dynamic team and work on groundbreaking projects with cutting-edge technology.
- Qualifications: Experience in Microsoft Fabric, Python, and data warehousing principles.
- Other info: Inclusive culture with a focus on personal development and support.
The predicted salary is between £36,000 and £60,000 per year.
We’re currently seeking a multi-skilled developer to join our ICT team here in Leeds. Reporting directly to our ICT Development Lead, you’ll focus initially on Microsoft Fabric Data Engineering.
In this role, the ideal candidate will have a strong background in software development and demonstrable experience with at least one of the Azure data analytics platforms, e.g. Fabric, Synapse or ADF.
Some of the key deliverables for the role will include:
- Designing, developing and maintaining data pipelines within Microsoft Fabric, including Lakehouse, Data Engineering and Data Warehouse components.
- Building scalable ETL/ELT workloads using Spark notebooks and Python, following data engineering best practices, including a metadata-driven framework.
- Applying data modelling and data warehousing principles to support business intelligence and analytics solutions.
- Working independently and collaboratively to build new business systems and/or modify existing systems to meet specific user/system interface requirements.
- Applying awareness of the software engineering life cycle and the concepts and practices required to implement effective information systems.
- Carrying out IT set-ups in liaison with colleagues and users as appropriate, including analysis and modelling of user requirements, and specifying information flows, processes and procedures.
- Converting specifications into detailed designs taking account of technical and non-technical features and limitations of the target implementation environment.
- Translating design requirements into physical database structures and implementing them.
- Constructing or modifying, testing and correcting program modules from detailed specifications.
- Interpreting and executing defined test plans.
- Installing fully tested software in target environments.
- Producing documentation of all work in accordance with agreed standards.
- Taking part in client/user meetings (both formal and informal) and assisting in presenting issues and solutions both orally and in writing.
- Participating in technical discussions with our third parties.
- Undertaking general business systems support on internal/third-party systems and adopting a flexible and consistent approach to offering on-site support to users.
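To illustrate the metadata-driven ETL/ELT approach the deliverables above mention, here is a minimal plain-Python sketch. All names and the metadata structure are illustrative assumptions, not the actual framework; in a Fabric environment the transforms would typically run as parameterised Spark notebooks reading the metadata from a Lakehouse table.

```python
# Sketch of a metadata-driven pipeline: each metadata entry names a source
# table, an ordered list of transform steps, and a destination table. The
# engine loops over the metadata instead of hard-coding each pipeline.
# All table and step names here are hypothetical.

TRANSFORMS = {
    "uppercase_name": lambda row: {**row, "name": row["name"].upper()},
    "drop_inactive":  lambda row: row if row.get("active") else None,
}

PIPELINE_METADATA = [
    {
        "source": "customers_raw",
        "steps": ["drop_inactive", "uppercase_name"],
        "destination": "customers_clean",
    },
]

def run_pipeline(meta, tables):
    """Apply each job's configured transform steps to its source table."""
    for job in meta:
        rows = tables[job["source"]]
        for step in job["steps"]:
            fn = TRANSFORMS[step]
            # Apply the transform; a step may drop a row by returning None.
            rows = [r for r in (fn(row) for row in rows) if r is not None]
        tables[job["destination"]] = rows
    return tables

tables = {"customers_raw": [
    {"name": "alice", "active": True},
    {"name": "bob", "active": False},
]}
result = run_pipeline(PIPELINE_METADATA, tables)
print(result["customers_clean"])  # [{'name': 'ALICE', 'active': True}]
```

The benefit of this pattern is that adding a new pipeline means adding a metadata row, not new code, which is why it appears as a best practice in the role description.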
What we’re looking for:
- Proven experience in Microsoft Fabric or similar technologies.
- Solid knowledge of Python and Spark notebooks, Azure Data Factory / Azure.
- Demonstrable experience in Data Warehousing / Data Marts.
- Previously used one or more of: SQL, Oracle SQL, T-SQL, PL/SQL.
- Previously used one or more of: Power BI, Crystal, SQL Server Reporting Services (SSRS), Synapse, Azure Integration Services, Azure API Management, Azure Logic Apps, Azure Functions, Azure Service Bus.
- Demonstrable experience with traditional programming languages and/or low-code development tools.
- Experience with Source Control.
What would be desirable:
- Microsoft Power Platform.
- Azure DevOps.
- HTML/CSS.
- Further knowledge or experience with ERP, WMS, .NET Technologies.
Next Steps: As a business, we’re on a journey to build on our culture where everyone is included, treated fairly and with respect. This starts with recruitment and how we bring people into the organisation. We’ll do our best to outline the recruitment process to you ahead of time with plenty of notice. If you require any accommodations to participate in the application or interview process, please let us know and we will work with you to ensure your needs are met.
About Us: We are one of the leading independent engineering and services businesses in the UK. Founded in 1921, with a turnover of £500m and 3000 employees, we are proud of our history of developing great people through our investment in training. Working across a variety of sectors within the building and infrastructure industry, our innovative, responsible and forward-thinking approach allows us to work on fantastic ground-breaking projects, providing solutions using the latest tools and technologies. Progression is something we value, and we will make sure that when you join us you have a clearly defined development path, supported by regular reviews, training and ongoing support to enable you to be the best you can be.
Employer: NG Bailey
Contact Detail:
NG Bailey Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Developer with Data Engineering focus role in Leeds
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, attend meetups, and engage with online communities. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Microsoft Fabric, Python, and Spark. This gives potential employers a tangible look at what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common technical questions related to data engineering and software development. Practice explaining your thought process clearly, as communication is key in this role.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining our team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Developer role. Highlight your experience with Microsoft Fabric, Python, and data engineering principles to catch our eye!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how your background aligns with our needs. Be genuine and let your personality shine through!
Showcase Relevant Projects: If you've worked on projects involving data pipelines or Azure technologies, mention them! We love seeing real-world applications of your skills, so don’t hold back on the details.
Apply Through Our Website: We encourage you to apply directly through our website for a smoother process. It helps us keep track of your application and ensures you’re considered for the role without any hiccups!
How to prepare for a job interview at NG Bailey
✨Know Your Tech Inside Out
Make sure you brush up on your knowledge of Microsoft Fabric, Python, and Spark Notebooks. Be ready to discuss how you've used these technologies in past projects, as well as any challenges you faced and how you overcame them.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've designed and implemented data pipelines or ETL/ELT workloads. Highlight your thought process and the best practices you followed, as this will demonstrate your ability to tackle real-world problems.
✨Understand the Business Context
Familiarise yourself with the company’s projects and how they use data engineering to drive business intelligence. This will help you connect your technical skills to their needs and show that you're genuinely interested in contributing to their success.
✨Practice Communication
Since you'll be working with both technical and non-technical stakeholders, practice explaining complex concepts in simple terms. Be prepared for client/user meetings and think about how you can present issues and solutions clearly and effectively.