At a Glance
- Tasks: Design and build scalable data pipelines using GCP for top financial institutions.
- Company: Join Capco, a leader in digital transformation for financial services.
- Benefits: Enjoy a competitive salary, flexible holidays, and extensive training opportunities.
- Why this job: Shape the future of finance with innovative cloud-first data solutions.
- Qualifications: Strong GCP experience and proficiency in Python, SQL, and data engineering.
- Other info: Collaborative culture with a focus on diversity and continuous learning.
The predicted salary is between £48,000 and £72,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Shape the future of financial services through cloud-first data innovation.
The Role
We’re looking for a Principal GCP Data Engineer to join our growing Technology & Engineering team in London. You’ll work with Tier 1 financial institutions on cutting-edge transformation programmes, using your deep expertise in Google Cloud Platform to build robust, scalable data solutions. This role is ideal for engineers who thrive on innovation, take ownership of challenges, and enjoy working in collaborative, agile teams.
What You’ll Do
- Design and build resilient, scalable data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc (a sketch of this kind of pipeline follows this list).
- Collaborate closely with clients to define business requirements and architect fit-for-purpose data solutions.
- Apply engineering best practices across the SDLC, including CI/CD, testing, and version control using Git.
- Lead the development of robust, tested, and fault-tolerant data engineering solutions.
- Support and mentor junior engineers, contributing to knowledge sharing across the team.
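To make the first responsibility concrete, here is a minimal sketch of the kind of pipeline involved: an Apache Beam job (runnable locally, or on Dataflow with the appropriate runner options) that parses CSV trade records from Cloud Storage and appends them to BigQuery. The bucket, project, dataset, and field names are hypothetical, not taken from any Capco engagement.

```python
# Minimal Beam/Dataflow sketch: CSV in Cloud Storage -> BigQuery.
# All resource names below are invented for illustration.
import csv
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_trade(line: str) -> dict:
    """Turn one CSV line into a BigQuery-ready row."""
    trade_id, symbol, amount = next(csv.reader([line]))
    return {"trade_id": trade_id, "symbol": symbol, "amount": float(amount)}

options = PipelineOptions()  # pass --runner=DataflowRunner, --project, etc. for GCP

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/trades/*.csv")
        | "Parse" >> beam.Map(parse_trade)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:finance.trades",
            schema="trade_id:STRING,symbol:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same pipeline code runs unchanged on the DirectRunner for local testing and on Dataflow in production, which is one reason Beam is a common choice for this stack.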
What We’re Looking For
- Strong hands-on experience with GCP including BigQuery, Pub/Sub, Cloud Composer, and IAM.
- Proven ability in Python, SQL, and PySpark (see the PySpark sketch after this list); experience with data lakes, warehousing, and data modelling.
- Demonstrated knowledge of DevOps principles, containerisation, and CI/CD tools such as Jenkins or GitHub Actions.
- A track record of delivering data engineering solutions in agile environments.
- Confident communication skills, able to convey technical concepts to non-technical stakeholders.
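As an illustration of the Python and PySpark skills listed above, here is a small, hypothetical snippet in the warehousing vein: deduplicating raw account records and aggregating balances per customer per day. The paths and column names are invented for the example.

```python
# Hypothetical PySpark job: raw account records -> curated daily balances.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("account-balances").getOrCreate()

accounts = spark.read.parquet("gs://example-lake/raw/accounts/")

daily_balances = (
    accounts
    .dropDuplicates(["account_id", "as_of_date"])   # one record per account per day
    .groupBy("customer_id", "as_of_date")
    .agg(F.sum("balance").alias("total_balance"))
)

# Partitioning by date keeps downstream warehouse scans cheap.
daily_balances.write.mode("overwrite").partitionBy("as_of_date").parquet(
    "gs://example-lake/curated/daily_balances/"
)
```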
Bonus Points For
- Experience with real-time streaming tools like Kafka or Spark Streaming.
- Exposure to MLOps practices and maintaining ML models in production.
- Familiarity with data visualisation tools such as Power BI, Qlik, or Tableau.
- Experience developing applications on Vertex AI.
- Knowledge of both SQL and NoSQL solutions, and the ability to assess trade-offs between ETL and ELT (the ELT pattern is sketched after this list).
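On the ETL-versus-ELT point, the trade-off is roughly this: ETL transforms data in flight before it reaches the warehouse, while ELT lands raw data first and transforms it inside the warehouse engine. Below is a hedged sketch of the ELT pattern using the google-cloud-bigquery client; the project, bucket, and table names are made up.

```python
# ELT sketch: load raw data into BigQuery first (L), then transform it
# in-warehouse with SQL (T). All resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# L: land the raw file as-is in a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/trades.csv",
    "example-project.staging.trades_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # wait for the load to finish

# T: transform inside BigQuery, where the engine does the heavy lifting.
client.query(
    """
    CREATE OR REPLACE TABLE finance.trades_clean AS
    SELECT trade_id, UPPER(symbol) AS symbol, CAST(amount AS FLOAT64) AS amount
    FROM staging.trades_raw
    WHERE amount IS NOT NULL
    """
).result()
```

ELT tends to win when the warehouse can scale the transformation more cheaply than an external cluster; ETL still makes sense when data must be cleansed or masked before it is allowed to land.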
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions.
- Work in a collaborative, flat, and entrepreneurial consulting culture.
- Access continuous learning, training, and industry certifications.
- Be part of a team shaping the future of digital transformation across financial services and energy.
We offer a competitive, people-first benefits package designed to support every aspect of your life:
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: A minimum of 40 hours of training annually, your way: workshops, certifications, or e-learning. You'll also be assigned a business coach from day one for one-on-one guidance to fast-track your goals and accelerate your development.
- Extra Perks: Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to a barrier-free, inclusive recruitment process. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We welcome applicants from all backgrounds. At Capco, we value the difference you make, and the differences that make you. Our #BeYourselfAtWork culture champions diversity, equity and inclusivity, and we bring a collaborative mindset to our partnerships with clients and colleagues. #BeYourselfAtWork is the cornerstone of our success and a value that our employees live and breathe every day.
Employer: Capco
Contact Detail:
Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Principal Data Engineer - GCP role
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those who work at Capco or similar firms. A friendly chat can lead to insider info about job openings and even referrals.
✨Tip Number 2
Prepare for interviews by brushing up on your GCP knowledge and data engineering skills. Practice explaining complex concepts in simple terms, as you'll need to communicate effectively with both technical and non-technical folks.
✨Tip Number 3
Showcase your projects! Whether it's a GitHub repo or a portfolio, having tangible examples of your work can set you apart. Make sure to highlight any experience with tools like BigQuery or Dataflow.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, it shows you're genuinely interested in joining the team at Capco.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Principal Data Engineer role. Highlight your experience with GCP, Python, and data engineering solutions. We want to see how your skills align with what we’re looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how you can contribute to our team. Don’t forget to mention any relevant projects or experiences that showcase your expertise.
Showcase Your Technical Skills: When filling out your application, be sure to highlight your hands-on experience with GCP services like BigQuery and Dataflow. We love seeing specific examples of how you've applied your technical skills in real-world scenarios.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, it shows you’re serious about joining the team at Capco!
How to prepare for a job interview at Capco
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially services like BigQuery, Dataflow, and Pub/Sub. Be ready to discuss how you've used these tools in past projects and how they can be applied to the role you're interviewing for.
✨Showcase Your Coding Skills
Prepare to demonstrate your proficiency in Python, SQL, and PySpark. You might be asked to solve a coding challenge or explain your thought process behind a data engineering solution, so practice articulating your approach clearly.
✨Understand Agile Methodologies
Since the role involves working in agile environments, be prepared to discuss your experience with agile practices. Share examples of how you've contributed to team sprints, collaborated with stakeholders, and adapted to changing requirements.
✨Communicate Effectively
You’ll need to convey complex technical concepts to non-technical stakeholders. Practice explaining your past projects in simple terms, focusing on the impact and benefits rather than just the technical details. This will show your ability to bridge the gap between tech and business.