At a Glance
- Tasks: Design and deliver scalable data platforms using cutting-edge technologies like Databricks and Azure.
- Company: Join a leading firm committed to innovation and social responsibility.
- Benefits: Enjoy 25 days holiday, flexible pension schemes, and well-being initiatives.
- Why this job: Make a real impact in data engineering while working in a supportive and inclusive environment.
- Qualifications: Experience with cloud data platforms, Databricks, and strong coding skills in Python.
- Other info: Opportunities for professional growth and a culture of continuous learning.
The predicted salary is between £36,000 and £60,000 per year.
We are seeking a highly skilled Lead Data Platform Engineer to join our Data Engineering and Machine Learning team. This role is pivotal in designing, architecting, and delivering robust, scalable, and secure data platforms that enable the firm to manage, analyse, and leverage data effectively while meeting regulatory and client confidentiality requirements. You will combine hands-on engineering with strong solution architecture skills, ensuring that data platform solutions are fit-for-purpose, well-governed, and aligned to business needs. A key focus will be on Databricks, Azure Data Factory, and the Lakehouse Medallion architecture, with DevOps and automation at the heart of everything you do.
Key Responsibilities
- Architect and design end-to-end data platform solutions, ensuring scalability, reliability, and compliance.
- Lead implementation using Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Develop APIs for data integration and automation.
- Write efficient, maintainable code in PySpark, Python, and SQL.
- Implement and manage CI/CD pipelines and automated deployments via Azure DevOps.
- Build infrastructure-as-code solutions (Terraform, ARM templates) for cloud resource provisioning.
- Monitor and optimise platform performance and manage cloud costs.
- Ensure data quality, security, governance, and lineage across all components.
- Collaborate with data engineers, architects, and business stakeholders to translate requirements into effective solutions.
- Maintain comprehensive documentation and stay current with emerging technologies.
- Provide coaching and mentoring to engineers, fostering a culture of continuous learning and technical excellence.
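To illustrate the Lakehouse Medallion pattern the responsibilities above centre on, here is a minimal plain-Python sketch of the bronze/silver/gold flow. In practice this would be PySpark on Databricks; the record and field names here are hypothetical, not taken from the role description.

```python
# Minimal sketch of the Medallion (bronze -> silver -> gold) flow.
# Plain Python stands in for PySpark; all field names are illustrative.

def to_silver(bronze_records):
    """Cleanse raw bronze records: drop incomplete rows, normalise types."""
    silver = []
    for rec in bronze_records:
        if rec.get("client_id") is None or rec.get("amount") is None:
            continue  # a real pipeline would quarantine these, not silently drop
        silver.append({
            "client_id": str(rec["client_id"]).strip(),
            "amount": float(rec["amount"]),
        })
    return silver

def to_gold(silver_records):
    """Aggregate cleansed silver records into a business-level gold view."""
    totals = {}
    for rec in silver_records:
        totals[rec["client_id"]] = totals.get(rec["client_id"], 0.0) + rec["amount"]
    return totals

bronze = [
    {"client_id": " A1 ", "amount": "100.5"},
    {"client_id": "A1", "amount": 50},
    {"client_id": None, "amount": 10},  # dropped at the silver layer
]
print(to_gold(to_silver(bronze)))  # {'A1': 150.5}
```

The same layering applies at scale: bronze holds raw ingested data, silver holds validated and conformed records, and gold holds aggregates ready for analytics.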
Essential Skills
- Proven experience designing and engineering data platforms in cloud environments (preferably Azure).
- Strong hands‑on experience with Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Proficiency in Python and RESTful API development.
- Advanced expertise in DevOps practices for CI/CD and automated deployments.
- Experience with infrastructure-as-code tools (Terraform, ARM templates).
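For a flavour of the Python and RESTful API work the skills list implies, here is a hedged sketch of the kind of payload validation an integration endpoint might perform before ingesting records. The endpoint schema and field names are illustrative assumptions, not part of the job description.

```python
# Hypothetical validation for an inbound data-integration payload:
# the sort of check a REST ingest endpoint might run before accepting records.
import json

REQUIRED_FIELDS = {"source_system", "records"}  # illustrative schema

def validate_payload(raw_body: str):
    """Return (ok, errors) for a JSON body posted to a hypothetical ingest endpoint."""
    try:
        payload = json.loads(raw_body)
    except json.JSONDecodeError as exc:
        return False, [f"invalid JSON: {exc.msg}"]
    if not isinstance(payload, dict):
        return False, ["payload must be a JSON object"]
    errors = []
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if not isinstance(payload.get("records", []), list):
        errors.append("'records' must be a list")
    return (not errors), errors

ok, errs = validate_payload('{"source_system": "crm", "records": []}')
print(ok, errs)  # True []
```

In a production API this validation would sit behind a framework route handler, with failures returned as structured 4xx responses rather than a tuple.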
Desirable Skills
- Experience with additional Azure services (Fabric, Functions, Logic Apps).
- Experience with Data Lakehouse architecture.
- Background in regulated or professional services environments.
Our Benefits - What We Can Offer You
- 25 days' holiday as standard, plus bank holidays. You can also ‘buy’ up to 35 hours of extra holiday.
- Generous and flexible pension schemes.
- Volunteering days – Two days of volunteering every year for a cause of your choice (fully paid).
- Westfield Health membership, offering refunds on medical services alongside our Aviva Digital GP services.
- We also offer a wide range of well‑being initiatives to encourage positive mental health both in and out of the workplace and to make sure you’re fully supported.
- This includes our Flexible by Choice programme, which gives our colleagues more choice over a hybrid way of working, subject to role, team and client requirements.
- We have been ranked in the Best Workplaces for Wellbeing for Large Organisations for 2024!
- Our responsible business programmes are fundamental to who we are and our purpose.
- We’re committed to being a diverse and inclusive workplace where our colleagues can flourish, and we have established a number of inclusion network groups across our business to support this aim.
- Our commitment to Social Responsibility, community investment activity and tackling climate change is a fundamental part of who we are. It’s made up of four strands: Our People, Our Community, Our Environment and Our Pro Bono.
Lead Data Platform Engineer in Birmingham - Employer: Irwin Mitchell
Contact Detail:
Irwin Mitchell Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Lead Data Platform Engineer role in Birmingham
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with potential colleagues on LinkedIn. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Databricks, PySpark, and Azure Data Factory. This gives you a chance to demonstrate your hands-on experience and problem-solving abilities.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Practice explaining your past projects and how they align with the role of a Lead Data Platform Engineer. Confidence is key!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people, and it shows you’re genuinely interested in the role.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Lead Data Platform Engineer role. Highlight your experience with Databricks, Azure Data Factory, and any relevant cloud environments. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data platforms and how your background makes you a perfect fit for our team. Don’t forget to mention your hands-on experience and any leadership roles you've had.
Showcase Your Projects: If you've worked on any cool projects involving data engineering or machine learning, make sure to include them in your application. We love seeing real-world examples of your work, especially if they demonstrate your problem-solving skills and creativity.
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status. Plus, we love seeing candidates who take the initiative to connect with us directly!
How to prepare for a job interview at Irwin Mitchell
✨Know Your Tech Stack
Make sure you’re well-versed in Databricks, PySpark, and Azure Data Factory. Brush up on your knowledge of the Lakehouse Medallion architecture and be ready to discuss how you've used these technologies in past projects.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've architected and designed data platforms. Think about challenges you've faced and how you overcame them, especially in terms of scalability and compliance.
✨Demonstrate DevOps Expertise
Be ready to talk about your experience with CI/CD pipelines and automated deployments. Highlight any projects where you implemented infrastructure-as-code solutions using Terraform or ARM templates.
✨Emphasise Collaboration and Mentorship
Since this role involves working closely with engineers and stakeholders, prepare to discuss how you’ve collaborated in the past. Share examples of how you’ve coached others and fostered a culture of continuous learning.