At a Glance
- Tasks: Design and maintain data pipelines using Azure Data Factory and Databricks.
- Company: Join Albany Beck, a leader in data-driven transformation and innovation.
- Benefits: Full-time role with opportunities for growth and collaboration.
- Other info: Dynamic team environment focused on modern cloud-based solutions.
- Why this job: Make an impact by enabling smarter decision-making through data engineering.
- Qualifications: Experience with Azure Data Factory, Databricks, and strong communication skills.
The predicted salary is between £50,000 and £65,000 per year.
Albany Beck is a specialist consultancy that partners with leading organisations to deliver high-quality talent, technology solutions, and data-driven transformation. With a strong focus on innovation, capability development, and long-term collaboration, Albany Beck helps clients modernise their operations and build scalable digital futures.
As a Data Engineer within Albany Beck, you will play a key role in supporting one of our major clients by designing, building, and maintaining high-quality data pipelines and models. You will contribute to a modern cloud-based data platform that enables smarter decision-making, improved operational efficiency, and continuous innovation.
Key Responsibilities
- Design, build, and maintain ELT pipelines using Azure Data Factory, Databricks, and SQL Server.
- Implement Medallion-layer data models with clean, well-structured, and maintainable code.
- Ensure pipelines meet high standards for scalability, performance, reliability, and security.
- Embed data quality checks, lineage, and observability into all data workflows.
- Translate business requirements into robust, scalable data engineering solutions.
- Collaborate with architects, governance teams, and application engineers.
- Contribute to technical design discussions and help shape best practice standards.
- Contribute to Agile delivery processes including estimation, refinement, and peer reviews.
- Enhance CI/CD pipelines, testing automation, and engineering quality standards.
- Optimise existing pipelines and investigate production issues, ensuring long-term stability.
Key Requirements
- Hands-on experience with Azure Data Factory, Databricks (Python/SQL), and SQL Server.
- Strong understanding of ELT development, data modelling, and data governance.
- Experience working with CI/CD, automation, and testing frameworks.
- Strong communication skills with the ability to collaborate effectively across teams.
- Experience with Microsoft Purview or other metadata catalogue tools.
- Familiarity with Medallion architecture and modern cloud-based data platforms.
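To give a flavour of what the Medallion architecture mentioned above looks like in practice, here is a minimal sketch of the Bronze → Silver → Gold pattern with an embedded data-quality check. It uses plain Python dicts as a stand-in for Databricks DataFrames, and all table and field names (orders, amount, country) are hypothetical, not taken from the role description.

```python
# Illustrative Medallion sketch: Bronze (raw) -> Silver (cleansed) -> Gold (business aggregate).
# Plain Python stands in for Databricks/PySpark; names are invented for illustration.

# Bronze: raw records landed as-is, including a bad row.
bronze = [
    {"order_id": "1", "amount": "120.50", "country": "GB"},
    {"order_id": "2", "amount": "bad",    "country": "GB"},
    {"order_id": "3", "amount": "80.00",  "country": "IE"},
]

def to_silver(rows):
    """Silver: type-cast and cleanse; quarantine rows failing quality checks."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            # Data-quality check embedded in the pipeline: bad rows are
            # quarantined for investigation rather than silently dropped.
            rejected.append(row)
    return clean, rejected

def to_gold(rows):
    """Gold: business-level aggregate, e.g. revenue per country."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

silver, quarantine = to_silver(bronze)
gold = to_gold(silver)
```

In a real Databricks implementation each layer would typically be a Delta table written by a notebook or job, but the layering and quarantine ideas carry over directly.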
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Finance
Industries: Business Consulting and Services
Employer: Albany Beck
Contact Detail:
Albany Beck Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the lookout for opportunities. You never know who might have a lead or can refer you directly to hiring managers.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Azure Data Factory, Databricks, and SQL Server. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Be ready to discuss your experience with ELT pipelines and data modelling, and don't forget to highlight how you've collaborated across teams!
✨Tip Number 4
Apply through our website! We love seeing candidates who are genuinely interested in joining us at Albany Beck. Tailor your application to reflect how your skills align with our mission of delivering high-quality talent and technology solutions.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Azure Data Factory, Databricks, and SQL Server. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our innovative projects. Keep it concise but impactful!
Showcase Your Projects: If you've worked on any relevant projects, make sure to mention them! Whether it's building ELT pipelines or optimising data models, we love seeing real-world examples of your work.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at Albany Beck
✨Know Your Tech Stack
Make sure you brush up on your knowledge of Azure Data Factory, Databricks, and SQL Server. Be ready to discuss how you've used these tools in past projects, and think about specific examples where you designed or maintained ELT pipelines.
✨Showcase Your Problem-Solving Skills
Prepare to talk about how you've tackled production issues or optimised existing pipelines. Think of a couple of scenarios where you implemented data quality checks or improved performance, and be ready to explain your thought process.
✨Communicate Clearly
Since collaboration is key, practice explaining complex technical concepts in simple terms. You might be asked to translate business requirements into data engineering solutions, so being able to communicate effectively with non-technical stakeholders is crucial.
✨Familiarise Yourself with Agile Practices
Understand the Agile delivery processes, including estimation and peer reviews. Be prepared to discuss how you've contributed to these processes in previous roles, as this will show your ability to work well within a team and adapt to changing requirements.