At a Glance
- Tasks: Lead the design and delivery of innovative data solutions using Databricks on Azure.
- Company: Capco, a forward-thinking tech consultancy in financial services.
- Benefits: Competitive salary, flexible holidays, mental health support, and continuous learning opportunities.
- Why this job: Shape the future of digital transformation in finance while working with cutting-edge technology.
- Qualifications: Experience with Databricks, Python, and CI/CD pipelines; strong client engagement skills.
- Other info: Join a diverse team that values individuality and offers excellent career growth.
The predicted salary is between £72,000 and £108,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Lead the architecture of data platforms driving the future of financial services.
The Role
As a Principal Azure Data Engineer (Databricks) at Capco, you will architect and lead the delivery of enterprise-grade data solutions using the Databricks platform within the Azure ecosystem. You will drive the design and implementation of streaming and batch data pipelines, helping our clients to modernise their data capabilities. Working with cross-functional teams, you will guide clients on best practices and technical strategy, ensuring quality and security standards are maintained across deployments.
What You’ll Do
- Lead the end-to-end delivery of Databricks-based solutions across Azure environments
- Architect secure and scalable pipelines using Delta Lake, Spark Structured Streaming, and Unity Catalog
- Develop and enforce engineering best practices across the data lifecycle
- Engage with clients to define solution strategies and data governance frameworks
- Mentor engineering teams and contribute to internal capability development initiatives
What We’re Looking For
- Proven experience with the Databricks platform, including Unity Catalog, Delta Lake, and orchestration
- Expertise in Python, PySpark, and distributed data processing frameworks
- Extensive background in CI/CD pipeline development with tools such as Azure DevOps, Jenkins, and GitHub Actions
- In-depth understanding of data lakehouse principles, data modelling, and GDPR-compliant design
- Experience building robust, production-grade data pipelines from ingestion through to serving
Bonus Points For
- Strong client-facing and commercial skills to support pre-sales and RFP engagements
- Background in coaching and mentoring engineering teams
- Development experience with Scala or Java
- Exposure to PII and sensitive data handling, and regulatory frameworks like GDPR
- Active contribution to Capco thought leadership and internal initiatives
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions
- Work in a collaborative, flat, and entrepreneurial consulting culture
- Access continuous learning, training, and industry certifications
- Be part of a team shaping the future of digital transformation across financial services and energy
We offer a competitive, people-first benefits package designed to support every aspect of your life:
Core Benefits:
- Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
Mental Health:
- Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
Family-Friendly:
- Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
Family Care:
- 8 complimentary backup care sessions for emergency childcare or elder care.
Holiday Flexibility:
- 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
Continuous Learning:
- Your growth, your way: a minimum of 40 hours of training annually, with your pick of workshops, certifications, and e-learning. A Business Coach is assigned from day one to give you one-on-one guidance to fast-track your goals and accelerate your development.
Extra Perks:
- Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to making our recruitment process accessible and straightforward for everyone. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We value each person’s unique perspective and contribution. At Capco, we believe that being yourself is your greatest strength. Our #BeYourselfAtWork culture encourages individuality and collaboration – a mindset that shapes how we work with clients and each other every day.
Principal Azure Data Engineer (Databricks) | Employer: Capco
Contact Detail:
Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Principal Azure Data Engineer (Databricks) role
✨Tip Number 1
Network like a pro! Reach out to your connections on LinkedIn or attend industry meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Prepare for interviews by practising common questions and showcasing your expertise with Databricks and Azure. We recommend doing mock interviews with friends or using online platforms to boost your confidence.
✨Tip Number 3
Don’t just apply; engage! When you find a role that excites you, reach out to the hiring manager or team members. Show genuine interest in the company and ask insightful questions about their projects.
✨Tip Number 4
Keep learning and stay updated on the latest trends in data engineering. Join webinars, read articles, and even consider certifications. This not only boosts your skills but also makes you stand out to potential employers.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Principal Azure Data Engineer role. Highlight your experience with Databricks, Delta Lake, and any relevant projects that showcase your skills in Python and PySpark. We want to see how you can bring value to our team!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our needs. Don’t forget to mention your client-facing experience and any mentoring roles you've had – we love that stuff!
Showcase Your Projects: If you've worked on any cool data projects, make sure to include them in your application. Whether it's building data pipelines or implementing CI/CD practices, we want to see your hands-on experience. It helps us understand your practical skills better!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, you’ll find all the details about the role and our culture there, which can help you tailor your application even more!
How to prepare for a job interview at Capco
✨Know Your Databricks Inside Out
Make sure you’re well-versed in the Databricks platform, especially Unity Catalog and Delta Lake. Brush up on your knowledge of Spark Structured Streaming and how to architect secure and scalable data pipelines. Being able to discuss specific projects where you've implemented these technologies will really impress.
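To make that concrete, here is a minimal sketch of a Structured Streaming ingest into a Delta table, the kind of pattern worth being able to talk through end to end. It assumes a Databricks runtime where `spark` is preconfigured and Delta Lake is available; the source path, schema, checkpoint location, and Unity Catalog table name are illustrative, not taken from the role description.
```python
# Minimal sketch: streaming ingest from JSON files into a Delta table.
# Assumes a Databricks runtime where `spark` is preconfigured and Delta Lake
# is available; paths, schema, and table names below are illustrative only.
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("amount", DoubleType()),
])

raw_stream = (
    spark.readStream
    .schema(event_schema)
    .json("/mnt/landing/payments/")           # hypothetical landing zone
)

cleaned = (
    raw_stream
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["event_id"])             # basic idempotency guard
)

(
    cleaned.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/payments/")  # needed for fault-tolerant recovery
    .outputMode("append")
    .toTable("main.finance.payments_bronze")  # illustrative Unity Catalog three-part name
)
```
Being able to explain why the checkpoint location matters for recovery and exactly-once delivery, or how Unity Catalog's catalog.schema.table naming ties into governance, is the depth interviewers tend to probe.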
✨Showcase Your Python and PySpark Skills
Prepare to demonstrate your expertise in Python and PySpark during the interview. Have examples ready that highlight your experience with distributed data processing frameworks. You might even be asked to solve a coding challenge, so practice common algorithms and data manipulation tasks.
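For instance, here is a small, self-contained practice exercise in the spirit of a typical data-manipulation challenge: finding each customer's largest transaction with a window function. The data and column names are invented purely for practice and are not drawn from the job description.
```python
# Practice exercise: top transaction per customer by amount.
# All data and column names are invented for illustration.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("practice").getOrCreate()

transactions = spark.createDataFrame(
    [("c1", "t1", 120.0), ("c1", "t2", 80.0), ("c2", "t3", 300.0)],
    ["customer_id", "txn_id", "amount"],
)

# Rank each customer's transactions by amount and keep the largest one.
w = Window.partitionBy("customer_id").orderBy(F.col("amount").desc())
top_txn = (
    transactions
    .withColumn("rank", F.row_number().over(w))
    .filter(F.col("rank") == 1)
    .drop("rank")
)

top_txn.show()
```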
✨Understand CI/CD Pipelines
Since the role involves CI/CD pipeline development, make sure you can talk about your experience with tools like Azure DevOps, Jenkins, and GitHub Actions. Be ready to explain how you’ve used these tools to streamline data delivery processes and ensure quality control.
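One way to ground that discussion is with the kind of automated check a CI job (whether in Azure DevOps, Jenkins, or GitHub Actions) might run on every commit: a pytest-style unit test for a small PySpark transformation. The `add_ingestion_metadata` function and test names below are hypothetical examples, not part of the role description.
```python
# Illustrative pytest-style unit test for a pipeline transformation, of the
# kind a CI job could run on every commit. `add_ingestion_metadata` is a
# made-up example function, not from the job description.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_ingestion_metadata(df, source_name):
    """Tag each row with its source system and an ingestion timestamp."""
    return (
        df.withColumn("source_system", F.lit(source_name))
          .withColumn("ingested_at", F.current_timestamp())
    )


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("ci-tests").getOrCreate()


def test_add_ingestion_metadata_adds_expected_columns(spark):
    df = spark.createDataFrame([(1, "a")], ["id", "value"])
    result = add_ingestion_metadata(df, "core_banking")

    assert "source_system" in result.columns
    assert "ingested_at" in result.columns
    assert result.first()["source_system"] == "core_banking"
```
Pairing tests like this with linting and an automated deployment step is the sort of quality-control story worth having ready to tell.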
✨Engage with Client Scenarios
As this position requires strong client-facing skills, think about past experiences where you’ve engaged with clients to define solution strategies. Prepare to discuss how you’ve navigated challenges and provided value, as well as your approach to mentoring engineering teams and contributing to internal initiatives.