At a Glance
- Tasks: Lead the design and delivery of innovative data solutions using Databricks on Azure.
- Company: Capco, a forward-thinking tech consultancy in financial services.
- Benefits: Competitive salary, flexible holidays, mental health support, and continuous learning opportunities.
- Other info: Join a collaborative culture that values individuality and offers excellent career growth.
- Why this job: Shape the future of digital transformation in finance while working with cutting-edge technology.
- Qualifications: Experience with Databricks, Python, and CI/CD pipeline development.
Predicted salary: £80,000–£100,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Lead the architecture of data platforms driving the future of financial services.
The Role
As a Lead Azure Data Engineer (Databricks) at Capco, you'll architect and lead the delivery of enterprise-grade data solutions using the Databricks platform within the Azure ecosystem. You'll drive the design and implementation of streaming and batch data pipelines, helping our clients modernise their data capabilities. Working with cross-functional teams, you'll guide clients on best practices and technical strategy, ensuring quality and security standards are maintained across deployments.
What You’ll Do
- Lead the end-to-end delivery of Databricks-based solutions across Azure environments
- Architect secure and scalable pipelines using Delta Lake, Spark Structured Streaming, and Unity Catalog
- Develop and enforce engineering best practices across the data lifecycle
- Engage with clients to define solution strategies and data governance frameworks
- Mentor engineering teams and contribute to internal capability development initiatives
What We’re Looking For
- Proven experience with the Databricks platform, including Unity Catalog, Delta Lake, and workflow orchestration
- Expertise in Python, PySpark, and distributed data processing frameworks
- Extensive background in CI/CD pipeline development using tools such as Azure DevOps, Jenkins, or GitHub Actions
- In-depth understanding of data lakehouse principles, data modelling, and GDPR-compliant design
- Experience building robust, production-grade data pipelines from ingestion through to serving
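To make the pipeline skills above concrete, here is a minimal sketch of a Structured Streaming job writing to a Delta table. This is illustrative only, not Capco's actual stack: the table names (`bronze.payments`, `silver.payments`), the checkpoint path, and the `normalise_amount` helper are hypothetical, and running it assumes a Spark environment (Databricks or local) with Delta Lake available.

```python
# Illustrative sketch only: table names, paths, and the helper below are
# hypothetical; assumes a Spark environment with Delta Lake available.
from typing import Optional


def normalise_amount(raw: Optional[str]) -> Optional[float]:
    """Parse a currency string like '£1,234.56' into a float; None if invalid."""
    try:
        return float(raw.replace("£", "").replace(",", "").strip())
    except (AttributeError, ValueError):
        return None


def main() -> None:
    # Imported here so the pure-Python helper above is usable without Spark.
    from pyspark.sql import SparkSession, functions as F, types as T

    spark = SparkSession.builder.appName("payments-stream").getOrCreate()
    amount_udf = F.udf(normalise_amount, T.DoubleType())

    (spark.readStream
        .table("bronze.payments")                      # hypothetical source table
        .withColumn("amount", amount_udf(F.col("raw_amount")))
        .filter(F.col("amount").isNotNull())           # drop rows that failed parsing
        .writeStream
        .format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/payments")  # hypothetical
        .toTable("silver.payments"))


if __name__ == "__main__":
    main()
```

In a Databricks context, a job of this shape would typically run as a scheduled workflow, with Unity Catalog governing access to the bronze and silver tables.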
Bonus Points For
- Strong client-facing and commercial skills to support pre-sales and RFP engagements
- Background in coaching and mentoring engineering teams
- Development experience with Scala or Java
- Exposure to PII and sensitive data handling, and regulatory frameworks like GDPR
- Active contribution to Capco thought leadership and internal initiatives
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions
- Work in a collaborative, flat, and entrepreneurial consulting culture
- Access continuous learning, training, and industry certifications
- Be part of a team shaping the future of digital transformation across financial services and energy
We offer a competitive, people-first benefits package designed to support every aspect of your life:
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: Your growth, your way: a minimum of 40 hours of training annually across workshops, certifications, and e-learning. You'll also be assigned a Business Coach from day one for one-on-one guidance to fast-track your goals and accelerate your development.
- Extra Perks: Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to making our recruitment process accessible and straightforward for everyone. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We value each person’s unique perspective and contribution. At Capco, we believe that being yourself is your greatest strength. Our #BeYourselfAtWork culture encourages individuality and collaboration – a mindset that shapes how we work with clients and each other every day.
Lead Azure Data Engineer (Databricks) employer: Capco
Contact Detail:
Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Lead Azure Data Engineer (Databricks) role
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who works at Capco. Building relationships can open doors that a CV just can't.
✨Show Off Your Skills
When you get the chance to chat with potential employers, don’t hold back! Share your experiences with Databricks, Delta Lake, and any cool projects you've worked on. Let them see how you can add value to their team.
✨Prepare for Technical Challenges
Brush up on your Python and PySpark skills because you might face some technical questions or challenges during interviews. Practising coding problems can help you feel more confident and ready to impress.
✨Apply Through Our Website
Don’t forget to apply directly through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining the Capco team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Lead Azure Data Engineer role. Highlight your expertise with Databricks, Delta Lake, and any relevant projects you've worked on. We want to see how you can bring value to our team!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to tell us why you're passionate about data engineering and how your background aligns with our mission at Capco. Be sure to mention specific examples of your work with Azure and client engagement.
Showcase Your Technical Skills: Don’t hold back on showcasing your technical prowess! Include details about your experience with Python, PySpark, and CI/CD tools like Azure DevOps. We love seeing how you’ve tackled challenges in past projects, so share those stories!
Apply Through Our Website: We encourage you to apply directly through our website for a smoother application process. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Capco
✨Know Your Databricks Inside Out
Make sure you’re well-versed in the Databricks platform, especially Unity Catalog and Delta Lake. Brush up on your knowledge of how to architect secure and scalable data pipelines, as this will likely be a key focus during your interview.
✨Showcase Your Python and PySpark Skills
Prepare to discuss your experience with Python and PySpark in detail. Be ready to share specific examples of projects where you’ve implemented distributed data processing frameworks, as this will demonstrate your hands-on expertise.
✨Understand CI/CD Pipelines
Familiarise yourself with CI/CD pipeline development using tools like Azure DevOps and Jenkins. Be prepared to explain how you’ve used these tools in past projects to ensure smooth deployments and maintain quality standards.
✨Engage with Client Scenarios
Think about how you would engage with clients to define solution strategies and data governance frameworks. Prepare some scenarios or case studies from your previous work that highlight your client-facing skills and ability to mentor engineering teams.