At a Glance
- Tasks: Design and build scalable data pipelines using GCP for top financial institutions.
- Company: Join Capco, a leader in digital transformation for financial services.
- Benefits: Enjoy competitive pay, flexible holidays, and extensive training opportunities.
- Why this job: Shape the future of finance with innovative cloud solutions and collaborative teams.
- Qualifications: Strong GCP experience, Python, SQL skills, and a passion for data engineering.
- Other info: Inclusive culture that values diversity and offers excellent career growth.
The predicted salary is between £48,000 and £84,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Shape the future of financial services through cloud-first data innovation.
The Role
We are looking for a Lead GCP Data Engineer to join our growing Technology & Engineering team in London. You will work with Tier 1 financial institutions on cutting-edge transformation programmes, using your deep expertise in Google Cloud Platform to build robust, scalable data solutions. This role is ideal for engineers who thrive on innovation, take ownership of challenges, and enjoy working in collaborative, agile teams.
What You Will Do
- Design and build resilient, scalable data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc.
- Collaborate closely with clients to define business requirements and architect fit-for-purpose data solutions.
- Apply engineering best practices across the SDLC, including CI/CD, testing, and version control using Git.
- Lead the development of robust, tested, and fault-tolerant data engineering solutions.
- Support and mentor junior engineers, contributing to knowledge sharing across the team.
What We Are Looking For
- Strong hands-on experience with GCP including BigQuery, Pub/Sub, Cloud Composer, and IAM.
- Proven ability in Python, SQL, and PySpark; experience with data lakes, warehousing, and data modelling.
- Demonstrated knowledge of DevOps principles, containerisation, and CI/CD tools such as Jenkins or GitHub Actions.
- A track record of delivering data engineering solutions in agile environments.
- Confident communication skills, able to convey technical concepts to non-technical stakeholders.
Bonus Points For
- Experience with real-time streaming tools like Kafka or Spark Streaming.
- Exposure to MLOps practices and maintaining ML models in production.
- Familiarity with data visualisation tools such as Power BI, Qlik, or Tableau.
- Experience developing applications on Vertex AI.
- Knowledge of both SQL and NoSQL solutions, and the ability to assess trade-offs between ETL and ELT.
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions.
- Work in a collaborative, flat, and entrepreneurial consulting culture.
- Access continuous learning, training, and industry certifications.
- Be part of a team shaping the future of digital transformation across financial services and energy.
We offer a competitive, people-first benefits package designed to support every aspect of your life:
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: A minimum of 40 hours of training annually, with your pick of workshops, certifications, and e-learning, plus a business coach assigned from day one to give you one-on-one guidance and accelerate your development.
- Extra Perks: Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We are committed to a barrier-free, inclusive recruitment process. If you need any adjustments at any stage, just let us know – we will be happy to help. We welcome applicants from all backgrounds. At Capco, we value the difference you make, and the differences that make you. Our #BeYourselfAtWork culture champions diversity, equity and inclusivity, and we bring a collaborative mindset to our partnerships with clients and colleagues. #BeYourselfAtWork is the cornerstone of our success and a value that our employees live and breathe every day.
Lead Data Engineer - GCP in London | Employer: Capco
Contact: Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Lead Data Engineer - GCP in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those who work at companies you're interested in. A friendly chat can lead to referrals and insider info that could give you the edge.
✨Tip Number 2
Prepare for interviews by practising common questions and showcasing your GCP expertise. Use real-life examples from your past projects to demonstrate your skills and how you tackle challenges.
✨Tip Number 3
Don’t underestimate the power of follow-ups! After an interview, send a quick thank-you email to express your appreciation and reiterate your interest in the role. It keeps you fresh in their minds.
✨Tip Number 4
Apply through our website! We love seeing applications directly from candidates who are genuinely interested in joining us. Plus, it shows initiative and enthusiasm for the role.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Lead Data Engineer role. Highlight your experience with GCP, Python, and data engineering solutions. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our team. Be sure to mention any relevant projects or experiences that showcase your expertise.
Showcase Your Projects: If you've worked on any cool data projects, don't hesitate to include them in your application. We love seeing real-world examples of your work, especially if they involve GCP services like BigQuery or Dataflow!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen to join our team!
How to prepare for a job interview at Capco
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially BigQuery, Dataflow, and Pub/Sub. Be ready to discuss how you've used these tools in past projects and how they can be applied to the role you're interviewing for.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of challenges you've faced in data engineering and how you overcame them. This is your chance to demonstrate your ownership of projects and your innovative approach to problem-solving.
✨Communicate Clearly with Non-Technical Stakeholders
Practice explaining complex technical concepts in simple terms. You’ll likely need to convey your ideas to clients who may not have a technical background, so being able to communicate effectively is key.
✨Emphasise Collaboration and Mentorship
Highlight your experience working in agile teams and mentoring junior engineers. Discuss how you’ve contributed to knowledge sharing and team success, as this aligns well with the collaborative culture they value.