At a Glance
- Tasks: Design and deliver scalable data platforms using cutting-edge technologies like Databricks and Azure.
- Company: Join a national law firm that values inclusivity and teamwork.
- Benefits: Enjoy 25 days holiday, flexible pensions, and paid volunteering days.
- Why this job: Make a real impact while working in a supportive and innovative environment.
- Qualifications: Experience with data platforms, Databricks, and strong coding skills in Python and SQL.
- Other info: Be part of a diverse workplace committed to social responsibility and community investment.
The predicted salary is between £43,200 and £72,000 per year.
We are seeking a highly skilled Lead Data Platform Engineer to join our Data Engineering and Machine Learning team. This role is pivotal in designing, architecting, and delivering robust, scalable, and secure data platforms that enable the firm to manage, analyse, and leverage data effectively while meeting regulatory and client confidentiality requirements.
You will combine hands-on engineering with strong solution architecture skills, ensuring that data platform solutions are fit-for-purpose, well-governed, and aligned to business needs. A key focus will be on Databricks, Azure Data Factory, and the Lakehouse Medallion architecture, with DevOps and automation at the heart of everything you do.
Key Responsibilities
- Architect and design end-to-end data platform solutions, ensuring scalability, reliability, and compliance.
- Lead implementation using Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Develop APIs for data integration and automation.
- Write efficient, maintainable code in PySpark, Python, and SQL.
- Implement and manage CI/CD pipelines and automated deployments via Azure DevOps.
- Build infrastructure-as-code solutions (Terraform, ARM templates) for cloud resource provisioning.
- Monitor and optimise platform performance and manage cloud costs.
- Ensure data quality, security, governance, and lineage across all components.
- Collaborate with data engineers, architects, and business stakeholders to translate requirements into effective solutions.
- Maintain comprehensive documentation and stay current with emerging technologies.
- Provide coaching and mentoring to engineers, fostering a culture of continuous learning and technical excellence.
About You
- Proven experience designing and engineering data platforms in cloud environments (preferably Azure).
- Strong hands-on experience with Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Proficiency in Python and RESTful API development.
- Advanced expertise in DevOps practices for CI/CD and automated deployments.
- Experience with infrastructure-as-code tools (Terraform, ARM templates).
- Experience with additional Azure services (Fabric, Functions, Logic Apps).
- Familiarity with Data Lakehouse architecture.
- Background in regulated or professional services environments.
Our Benefits - What We Can Offer You
- 25 days' holiday as standard plus bank holidays - you can 'buy' up to 35 hours of extra holiday too.
- Generous and flexible pension schemes.
- Volunteering days - two days of volunteering every year for a cause of your choice (fully paid).
- Westfield Health membership, offering refunds on medical services alongside our Aviva Digital GP services.
- A wide range of well-being initiatives to encourage positive mental health both in and out of the workplace.
- Flexible by Choice programme, which gives our colleagues more choice over a hybrid way of working, subject to role, team and client requirements.
We have been ranked in the Best Workplaces for Wellbeing for Large Organisations 2024! Our responsible business programmes are fundamental to who we are and to our purpose. We are committed to being a diverse and inclusive workplace where our colleagues can flourish, and we have established a number of inclusion network groups across our business to support this aim. Our commitment to Social Responsibility, community investment and tackling climate change is equally fundamental to who we are.
As part of the Irwin Mitchell Group's onboarding process, all successful applicants are required to complete the group's employment screening process. This process helps to ensure that all new employees meet our standards in relation to honesty and integrity, thereby protecting the interests of the Group, colleagues, clients, partners and other stakeholders.
Irwin Mitchell LLP is an equal opportunity employer. We're proud of our values, and we're looking for people who share them.
Lead Data Platform Engineer (5873) in London - Employer: Irwin Mitchell LLP
Contact Details:
Irwin Mitchell LLP Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Lead Data Platform Engineer (5873) role in London
✨ Tip Number 1
Network like a pro! Reach out to current employees on LinkedIn or attend industry events. A friendly chat can give you insider info and maybe even a referral!
✨ Tip Number 2
Prepare for the interview by researching the firm's values and recent projects. Show us you're not just another candidate, but someone who genuinely cares about making a difference.
✨ Tip Number 3
Practice your technical skills! Brush up on Databricks, PySpark, and Azure Data Factory. We want to see that you can walk the walk, not just talk the talk.
✨ Tip Number 4
Don't forget to apply through our website! It's the best way to ensure your application gets seen by the right people. Plus, it shows you're serious about joining our team.
We think you need these skills to ace the Lead Data Platform Engineer (5873) role in London
Some tips for your application 🫡
Tailor Your Application: Make sure to customise your CV and cover letter to highlight your experience with Databricks, Azure Data Factory, and other relevant technologies. We want to see how your skills align with our needs!
Showcase Your Projects: Include specific examples of projects you've worked on that demonstrate your expertise in data platform engineering. We love seeing real-world applications of your skills, so don't hold back!
Be Clear and Concise: When writing your application, keep it straightforward and to the point. We appreciate clarity, so make sure your achievements and experiences shine through without unnecessary fluff.
Apply Through Our Website: We encourage you to submit your application directly through our website. It's the best way for us to receive your details and ensures you're considered for the role promptly!
How to prepare for a job interview at Irwin Mitchell LLP
✨ Know Your Tech Stack
Make sure you're well-versed in Databricks, PySpark, and Azure Data Factory. Brush up on your knowledge of the Lakehouse Medallion architecture and be ready to discuss how you've used these technologies in past projects.
✨ Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've architected and designed data platform solutions. Think about challenges you've faced and how you overcame them, especially in regulated environments.
✨ Emphasise Collaboration
This role involves working closely with various stakeholders. Be ready to talk about how you've collaborated with data engineers and business stakeholders to translate requirements into effective solutions.
✨ Demonstrate Continuous Learning
Stay current with emerging technologies and be prepared to discuss how you keep your skills sharp. Mention any recent courses or certifications related to DevOps practices or cloud services that you've completed.