At a Glance
- Tasks: Design and build scalable data pipelines using GCP for top financial institutions.
- Company: Join Capco, a leader in digital transformation for financial services.
- Benefits: Enjoy competitive pay, flexible holidays, and extensive training opportunities.
- Why this job: Shape the future of finance with innovative cloud solutions and collaborative teams.
- Qualifications: Strong GCP experience and proficiency in Python, SQL, and data engineering.
- Other info: Inclusive culture that values diversity and supports your personal growth.
The predicted salary is between £43,200 and £72,000 per year.
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
Shape the future of financial services through cloud-first data innovation.
The Role
We’re looking for a GCP Senior Data Engineer to join our growing Technology & Engineering team in London. You’ll work with Tier 1 financial institutions on cutting-edge transformation programmes, using your deep expertise in Google Cloud Platform to build robust, scalable data solutions. This role is ideal for engineers who thrive on innovation, take ownership of challenges, and enjoy working in collaborative, agile teams.
What You’ll Do
- Design and build resilient, scalable data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc (see the illustrative sketch after this list).
- Collaborate closely with clients to define business requirements and architect fit-for-purpose data solutions.
- Apply engineering best practices across the SDLC, including CI/CD, testing, and version control using Git.
- Support the migration of legacy on-premise systems to cloud-native architectures.
- Champion data governance, security, and compliance, especially in regulated environments.
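To give a concrete flavour of the pipeline work described above, here is a minimal streaming sketch using Apache Beam, the SDK behind Dataflow. It is illustrative only: the Pub/Sub topic, BigQuery table, and message format are hypothetical placeholders, not a description of any specific engagement.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub message into a BigQuery row dict."""
    return json.loads(message.decode("utf-8"))


# All resource names below are hypothetical placeholders.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/transactions")
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteRows" >> beam.io.WriteToBigQuery(
            "my-project:raw.transactions",
            # Assumes the destination table already exists with a matching schema.
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

A pipeline like this can run locally on the DirectRunner during development, then on Dataflow by passing `--runner=DataflowRunner` in the pipeline options.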
What We’re Looking For
- Strong hands-on experience with GCP including BigQuery, Pub/Sub, Cloud Composer, and IAM.
- Proven ability in Python, SQL, and PySpark; experience with data lakes, warehousing, and data modelling.
- Demonstrated knowledge of DevOps principles, containerisation, and CI/CD tools such as Jenkins or GitHub Actions (a testing sketch follows this list).
- A track record of delivering data engineering solutions in agile environments.
- Confident communication skills, able to convey technical concepts to non-technical stakeholders.
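On the testing side of the CI/CD point above, here is a sketch of the kind of unit test a pipeline transform might carry, using pytest. The `parse_event` function mirrors the hypothetical transform in the pipeline sketch earlier; in practice such tests would run on every commit via Jenkins or GitHub Actions.

```python
import json

import pytest


def parse_event(message: bytes) -> dict:
    """Decode a JSON message into a row dict (same shape as the pipeline sketch above)."""
    return json.loads(message.decode("utf-8"))


def test_parse_event_decodes_valid_message():
    message = json.dumps({"account_id": "A1", "amount": 12.5}).encode("utf-8")
    assert parse_event(message) == {"account_id": "A1", "amount": 12.5}


def test_parse_event_rejects_invalid_json():
    with pytest.raises(json.JSONDecodeError):
        parse_event(b"not json")
```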
Bonus Points For
- Experience with real-time streaming tools like Kafka or Spark Streaming.
- Exposure to MLOps practices and maintaining ML models in production.
- Familiarity with data visualisation tools such as Power BI, Qlik, or Tableau.
- Experience developing applications on Vertex AI.
- Knowledge of both SQL and NoSQL solutions, and the ability to assess trade-offs between ETL and ELT.
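On the last point, the ELT pattern pairs naturally with BigQuery: land raw data first, then transform it in-warehouse with SQL. Here is a minimal sketch using the google-cloud-bigquery Python client, with purely hypothetical dataset and table names:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

# ELT: raw data is already loaded; the transform runs inside the warehouse.
# All dataset and table names here are hypothetical.
transform_sql = """
CREATE OR REPLACE TABLE analytics.daily_balances AS
SELECT
  account_id,
  DATE(event_ts) AS balance_date,
  SUM(amount) AS net_movement
FROM raw.transactions
GROUP BY account_id, balance_date
"""

job = client.query(transform_sql)  # starts the query job
job.result()  # waits for completion
print(f"Transform finished: {job.job_id}")
```

ETL, by contrast, transforms the data (for example in Dataflow or Spark) before it reaches the warehouse; the trade-off usually comes down to where compute is cheapest and how raw the retained data needs to be.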
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions.
- Work in a collaborative, flat, and entrepreneurial consulting culture.
- Access continuous learning, training, and industry certifications.
- Help shape the future of digital transformation across financial services and energy.
We offer a competitive, people-first benefits package designed to support every aspect of your life:
- Core Benefits: Discretionary bonus, competitive pension, health insurance, life insurance and critical illness cover.
- Mental Health: Easy access to CareFirst, Unmind, Aviva consultations, and in-house first aiders.
- Family-Friendly: Maternity, adoption, shared parental leave, plus paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.
- Family Care: 8 complimentary backup care sessions for emergency childcare or elder care.
- Holiday Flexibility: 5 weeks of annual leave with the option to buy or sell holiday days based on your needs.
- Continuous Learning: a minimum of 40 hours of training annually, your way: workshops, certifications, or e-learning. You'll also be assigned a business coach from day one for one-on-one guidance to fast-track your goals and accelerate your development.
- Extra Perks: Gympass (Wellhub), travel insurance, Tastecard, season ticket loans, Cycle to Work, and dental insurance.
Inclusion at Capco
We’re committed to a barrier-free, inclusive recruitment process. If you need any adjustments at any stage, just let us know – we’ll be happy to help. We welcome applicants from all backgrounds. At Capco, we value the difference you make, and the differences that make you. Our #BeYourselfAtWork culture champions diversity, equity and inclusivity, and we bring a collaborative mindset to our partnerships with clients and colleagues. #BeYourselfAtWork is the cornerstone of our success and a value that our employees live and breathe every day.
Employer: Capco
Contact: Capco Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer - GCP role
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those who work with GCP or in financial services. A friendly chat can lead to insider info about job openings and even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your GCP projects, data pipelines, and any cool stuff you've built. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Practice explaining complex concepts in simple terms, as you'll need to communicate effectively with both techies and non-techies.
✨Tip Number 4
Don't forget to apply through our website! It's the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive and engaged with our company.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with GCP, Python, and SQL, and don’t forget to mention any relevant projects that showcase your skills in building data pipelines.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your background aligns with Capco’s mission. Be sure to mention your collaborative spirit and innovative mindset.
Showcase Your Projects: If you've worked on any cool data projects, make sure to include them in your application. Whether it's a personal project or something from a previous job, we love seeing real-world applications of your skills!
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status directly!
How to prepare for a job interview at Capco
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially BigQuery, Dataflow, and Pub/Sub. Be ready to discuss how you've used these tools in past projects and be prepared to explain the architecture of a data pipeline you've built.
✨Showcase Your Coding Skills
Since Python, SQL, and PySpark are key for this role, practice coding challenges that involve data manipulation and pipeline creation. You might even want to prepare a small project or example that demonstrates your skills in these languages.
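For instance, even a small, self-contained PySpark aggregation like the following (the CSV file and its columns are hypothetical) can anchor a discussion about data manipulation, schemas, and performance:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: aggregate daily trade volumes from a CSV extract.
spark = SparkSession.builder.appName("trade-volumes").getOrCreate()

# Assumed columns: trade_date, ticker, quantity, price
trades = spark.read.csv("trades.csv", header=True, inferSchema=True)

daily_volumes = (
    trades
    .withColumn("notional", F.col("quantity") * F.col("price"))
    .groupBy("trade_date", "ticker")
    .agg(
        F.sum("quantity").alias("total_quantity"),
        F.sum("notional").alias("total_notional"),
    )
    .orderBy("trade_date", "ticker")
)

daily_volumes.show()
```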
✨Communicate Clearly
You’ll need to convey complex technical concepts to non-technical stakeholders, so practice explaining your past projects in simple terms. Think about how you can break down your work into relatable examples that anyone can understand.
✨Be Agile and Adaptable
Familiarise yourself with agile methodologies and be ready to discuss how you've worked in agile teams before. Highlight any experience you have with CI/CD practices and how you’ve contributed to improving team workflows.