Senior Data Engineer - GCP

City of London | Full-Time | £48,000 - £72,000 per year (est.) | Home office (partial)

At a Glance

  • Tasks: Design and build scalable data pipelines using GCP for top financial institutions.
  • Company: Join Capco, a leader in digital transformation for financial services.
  • Benefits: Enjoy competitive pay, health insurance, and flexible holiday options.
  • Why this job: Shape the future of finance with innovative cloud solutions and collaborative teams.
  • Qualifications: Strong GCP experience and proficiency in Python, SQL, and data engineering.
  • Other info: Access continuous learning opportunities and a supportive, inclusive culture.

The predicted salary is between £48,000 and £72,000 per year.

Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent

Overview

Shape the future of financial services through cloud-first data innovation

The Role

We’re looking for a GCP Senior Data Engineer to join our growing Technology & Engineering team in London. You’ll work with Tier 1 financial institutions on cutting-edge transformation programmes, using your deep expertise in Google Cloud Platform to build robust, scalable data solutions. This role is ideal for engineers who thrive on innovation, take ownership of challenges, and enjoy working in collaborative, agile teams.

What You’ll Do

  • Design and build resilient, scalable data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc (see the illustrative sketch after this list).
  • Collaborate closely with clients to define business requirements and architect fit-for-purpose data solutions.
  • Apply engineering best practices across the SDLC, including CI/CD, testing, and version control using Git.
  • Support the migration of legacy on-premise systems to cloud-native architectures.
  • Champion data governance, security, and compliance, especially in regulated environments.
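
To make the first bullet concrete, here is a minimal, hypothetical sketch of a Dataflow-style pipeline written with the Apache Beam Python SDK: it reads CSV files from Cloud Storage, parses each record, and appends the rows to a BigQuery table. Every name in it (project, bucket, dataset, table, schema) is an illustrative placeholder rather than a real Capco or client system.

```python
# Illustrative only: placeholder project, bucket and table names throughout.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_trade(line: str) -> dict:
    """Turn a CSV line like 'FX,2024-01-31,1250000.50' into a BigQuery row."""
    desk, trade_date, notional = line.split(",")
    return {"desk": desk, "trade_date": trade_date, "notional": float(notional)}


options = PipelineOptions(
    runner="DirectRunner",        # swap for "DataflowRunner" to run on GCP
    project="example-project",    # placeholder GCP project ID
    region="europe-west2",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadCsv" >> beam.io.ReadFromText("gs://example-bucket/trades/*.csv")
        | "ParseRows" >> beam.Map(parse_trade)
        | "WriteRows" >> beam.io.WriteToBigQuery(
            "example-project:risk_analytics.trades",  # placeholder table
            schema="desk:STRING,trade_date:DATE,notional:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

In a real engagement a job like this would run on the Dataflow runner behind CI/CD, automated tests, and IAM-scoped service accounts, which is what the rest of the list above describes.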

What We’re Looking For

  • Strong hands-on experience with GCP including BigQuery, Pub/Sub, Cloud Composer, and IAM.
  • Proven ability in Python, SQL, and PySpark; experience with data lakes, warehousing, and data modelling.
  • Demonstrated knowledge of DevOps principles, containerisation, and CI/CD tools such as Jenkins or GitHub Actions.
  • A track record of delivering data engineering solutions in agile environments.
  • Confident communication skills, with the ability to convey technical concepts to non-technical stakeholders.

Bonus Points For

  • Experience with real-time streaming tools like Kafka or Spark Streaming.
  • Exposure to MLOps practices and maintaining ML models in production.
  • Familiarity with data visualisation tools such as Power BI, Qlik, or Tableau.
  • Experience developing applications on Vertex AI.
  • Knowledge of both SQL and NoSQL solutions, and the ability to assess trade-offs between ETL and ELT.

Why Join Capco

  • Deliver high-impact technology solutions for Tier 1 financial institutions
  • Work in a collaborative, flat, and entrepreneurial consulting culture
  • Access continuous learning, training, and industry certifications
  • Be part of a team shaping the future of digital transformation across financial services and energy

We offer a competitive, people-first benefits package designed to support every aspect of your life:

Core Benefits

  • Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.

Mental Health

  • Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.

Family-Friendly

  • Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.

Family Care

  • 8 complimentary backup care sessions for emergency childcare or elder care.

Holiday Flexibility

  • 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.

Continuous Learning

  • Minimum 40 hours of training annually: take your pick of workshops, certifications, and e-learning. Your growth, your way.
  • Business coach assigned from day one: get one-on-one guidance to fast-track your goals and accelerate your development.

Extra Perks

  • Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.

Inclusion at Capco

We’re committed to a barrier-free, inclusive recruitment process. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We welcome applicants from all backgrounds. At Capco, we value the difference you make, and the differences that make you. Our #BeYourselfAtWork culture champions diversity, equity and inclusivity, and we bring a collaborative mindset to our partnerships with clients and colleagues. #BeYourselfAtWork is the cornerstone of our success and a value that our employees live and breathe every day.


Senior Data Engineer - GCP employer: Capco

Capco is an exceptional employer that fosters a collaborative and entrepreneurial culture, perfect for Senior Data Engineers looking to make a significant impact in the financial services sector. With a strong focus on continuous learning and professional development, employees benefit from a comprehensive training programme, generous leave policies, and a supportive work environment that prioritises mental health and family-friendly initiatives. Located in London, this hybrid role offers the unique opportunity to work with Tier 1 financial institutions on innovative cloud-first data solutions, making it an ideal place for those passionate about technology and transformation.

Contact Detail:

Capco Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer - GCP role

✨Network Like a Pro

Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who works at your dream company. Building relationships can open doors that a CV just can't.

✨Show Off Your Skills

Don’t just talk about your experience; showcase it! Create a portfolio of projects you've worked on, especially those using GCP tools like BigQuery or Dataflow. This gives potential employers a tangible look at what you can do.

✨Ace the Interview

Prepare for interviews by practicing common questions and scenarios related to data engineering. Be ready to discuss your experience with Python, SQL, and cloud migrations. Confidence is key, so practice makes perfect!

✨Apply Through Our Website

When you find a role that excites you, apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search.

We think you need these skills to ace the Senior Data Engineer - GCP role

Google Cloud Platform (GCP)
BigQuery
Dataflow
Dataproc
Python
SQL
PySpark
Data Lakes
Data Warehousing
Data Modelling
DevOps Principles
CI/CD
Git
Containerisation
Communication Skills

Some tips for your application 🫡

Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Senior Data Engineer role. Highlight your GCP expertise and any relevant projects you've worked on, especially those involving data pipelines and cloud solutions.

Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our team. Share specific examples of your work with GCP and how you've tackled challenges in previous roles.

Showcase Your Technical Skills: Don’t shy away from listing your technical proficiencies! Mention your experience with Python, SQL, and any tools like BigQuery or Dataflow. We want to see how you’ve applied these skills in real-world scenarios.

Apply Through Our Website: We encourage you to apply directly through our website for a smoother application process. It helps us keep track of your application and ensures you don’t miss out on any important updates!

How to prepare for a job interview at Capco

✨Know Your GCP Inside Out

Make sure you brush up on your Google Cloud Platform knowledge, especially BigQuery, Dataflow, and Pub/Sub. Be ready to discuss how you've used these tools in past projects and how they can be applied to the role you're interviewing for.

✨Showcase Your Coding Skills

Prepare to demonstrate your proficiency in Python, SQL, and PySpark. You might be asked to solve a coding challenge or explain your thought process behind a data pipeline you've built. Practice common algorithms and data structures to feel confident.

✨Communicate Clearly with Non-Techies

Since you'll need to convey technical concepts to non-technical stakeholders, practice explaining complex ideas in simple terms. Think of examples from your experience where you successfully communicated with clients or team members who weren't as tech-savvy.

✨Be Ready for Agile Discussions

Familiarise yourself with agile methodologies and be prepared to discuss how you've applied them in your previous roles. Share specific examples of how you've collaborated in teams, adapted to changes, and delivered results in fast-paced environments.


Application deadline: 2027-10-15
