At a Glance
- Tasks: Lead the design and delivery of innovative data solutions using Databricks on Azure.
- Company: Capco, a forward-thinking tech consultancy in financial services.
- Benefits: Competitive salary, generous leave, mental health support, and continuous learning opportunities.
- Other info: Join a collaborative culture that values individuality and offers excellent career growth.
- Why this job: Shape the future of digital transformation in finance while working with cutting-edge technology.
- Qualifications: Experience with Databricks, Python, and CI/CD pipelines; strong client engagement skills.
The predicted salary is between £80,000 and £100,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Lead the architecture of data platforms driving the future of financial services.
The Role
As a Lead Azure Data Engineer (Databricks) at Capco, you'll architect and lead the delivery of enterprise-grade data solutions on the Databricks platform within the Azure ecosystem. You'll drive the design and implementation of streaming and batch data pipelines, helping our clients modernise their data capabilities. Working with cross-functional teams, you’ll guide clients on best practice and technical strategy, ensuring quality and security standards are maintained across deployments.
What You’ll Do
- Lead the end-to-end delivery of Databricks-based solutions across Azure environments
- Architect secure and scalable pipelines using Delta Lake, Spark Structured Streaming, and Unity Catalog
- Develop and enforce engineering best practices across the data lifecycle
- Engage with clients to define solution strategies and data governance frameworks
- Mentor engineering teams and contribute to internal capability development initiatives
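To give a flavour of the pipeline work described above, here is a minimal sketch of a bronze-layer streaming ingest in the PySpark style used on Databricks. The paths, table name, and file format are invented placeholders, and the function takes a `spark` session as an argument rather than assuming a live cluster, so treat it as a structural sketch of the Structured Streaming and Delta Lake APIs rather than a production recipe.

```python
def build_bronze_stream(spark, source_path, target_table, checkpoint_path):
    """Sketch: stream raw landing files into a bronze Delta table.

    `spark` is assumed to be a Databricks SparkSession; the paths and
    three-part table name passed in are illustrative placeholders.
    """
    return (
        spark.readStream
        .format("cloudFiles")                        # Databricks Auto Loader
        .option("cloudFiles.format", "json")         # raw landing files are JSON
        .load(source_path)                           # e.g. an abfss:// path on ADLS
        .writeStream
        .format("delta")                             # append into a Delta table
        .option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)                  # process available data, then stop
        .toTable(target_table)                       # Unity Catalog catalog.schema.table
    )
```

The `availableNow` trigger runs the stream as an incremental batch, which is a common pattern for scheduled Databricks jobs; a continuously running stream would simply omit it.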
What We’re Looking For
- Proven experience with the Databricks platform, including Unity Catalog, Delta Lake, and workflow orchestration
- Expertise in Python, PySpark, and distributed data processing frameworks
- Extensive background in CI/CD pipeline development with tools like Azure DevOps, Jenkins, GitHub Actions
- In-depth understanding of data lakehouse principles, data modelling, and GDPR-compliant design
- Experience building robust, production-grade data pipelines from ingestion through to serving
Bonus Points For
- Strong client-facing and commercial skills to support pre-sales and RFP engagements
- Background in coaching and mentoring engineering teams
- Development experience with Scala or Java
- Exposure to PII and sensitive data handling, and regulatory frameworks like GDPR
- Active contribution to Capco thought leadership and internal initiatives
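On the PII point above: a common baseline when handling sensitive fields is deterministic pseudonymisation, replacing identifiers with a keyed hash before data leaves a restricted zone. The sketch below is illustrative only, not a description of Capco's approach; the field names and key are invented, and in practice the key would come from a secret manager such as Azure Key Vault.

```python
import hashlib
import hmac

def pseudonymise(record, pii_fields, key):
    """Replace PII fields with a keyed SHA-256 hash.

    Records remain joinable on the pseudonym, but the raw value is not
    exposed. `pii_fields` and `key` here are illustrative placeholders.
    """
    out = dict(record)  # leave the caller's record untouched
    for field in pii_fields:
        if field in out and out[field] is not None:
            digest = hmac.new(key, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()
    return out
```

Because the hash is deterministic for a given key, the same input always yields the same pseudonym, so joins and aggregations still work downstream; rotating the key re-pseudonymises the whole dataset.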
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions
- Work in a collaborative, flat, and entrepreneurial consulting culture
- Access continuous learning, training, and industry certifications
- Be part of a team shaping the future of digital transformation across financial services and energy
We offer a competitive, people-first benefits package designed to support every aspect of your life:
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: Your growth, your way: a minimum of 40 hours of training annually, with your pick of workshops, certifications, and e-learning. A Business Coach is also assigned from day one to give you one-on-one guidance and fast-track your development.
- Extra Perks: Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to making our recruitment process accessible and straightforward for everyone. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We value each person’s unique perspective and contribution. At Capco, we believe that being yourself is your greatest strength. Our #BeYourselfAtWork culture encourages individuality and collaboration – a mindset that shapes how we work with clients and each other every day.
Lead Azure Data Engineer (Databricks) in London employer: Capco
Contact Detail:
Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Lead Azure Data Engineer (Databricks) role in London
✨Tip Number 1
Network like a pro! Get out there and connect with people in the industry. Attend meetups, webinars, or even just grab a coffee with someone who’s already in the role you want. You never know who might have the inside scoop on job openings!
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Databricks and Azure. This is your chance to demonstrate your expertise in building data pipelines and using tools like Delta Lake and Spark. Make it easy for potential employers to see what you can do!
✨Tip Number 3
Prepare for interviews by brushing up on common questions related to data engineering and Azure. Practice explaining your past projects and how you’ve tackled challenges. Remember, confidence is key, so rehearse until you feel ready to impress!
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals like you. Plus, applying directly can sometimes give you an edge over other candidates. So, get your application in and let’s shape the future of digital financial services together!
We think you need these skills to ace the Lead Azure Data Engineer (Databricks) role in London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Lead Azure Data Engineer role. Highlight your experience with Databricks, Delta Lake, and any relevant projects you've worked on. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our team. Be sure to mention your client-facing experience and any mentoring you've done, as these are key for us.
Showcase Your Technical Skills: Don’t forget to showcase your technical skills in Python, PySpark, and CI/CD tools like Azure DevOps. We love seeing specific examples of how you've used these technologies in past roles, so be detailed!
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It helps us keep track of your application and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at Capco
✨Know Your Databricks Inside Out
Make sure you’re well-versed in the Databricks platform, especially Unity Catalog and Delta Lake. Brush up on your knowledge of how to architect secure and scalable pipelines, as this will likely be a key focus during your interview.
✨Showcase Your Python and PySpark Skills
Prepare to discuss your experience with Python and PySpark in detail. Be ready to share specific examples of projects where you’ve implemented distributed data processing frameworks, as this will demonstrate your hands-on expertise.
✨Understand CI/CD Pipelines
Familiarise yourself with CI/CD pipeline development using tools like Azure DevOps and Jenkins. Be prepared to explain how you’ve used these tools in past projects to ensure smooth deployments and maintain quality standards.
✨Engage with Client Scenarios
Think about how you would engage with clients to define solution strategies and data governance frameworks. Prepare some scenarios or case studies from your previous work that highlight your client-facing skills and ability to mentor engineering teams.