At a Glance
- Tasks: Design and build scalable data pipelines for a modern cloud-based data platform.
- Company: Fast-growing fintech company leading in innovative data solutions.
- Benefits: Competitive salary, hybrid working, and opportunities for professional growth.
- Why this job: Take ownership of data infrastructure and make a real impact in the fintech space.
- Qualifications: 5+ years as a Data Engineer with strong GCP and programming skills.
- Other info: Dynamic environment with a focus on collaboration and continuous improvement.
The predicted salary is between £43,200 and £72,000 per year.
I'm working with a fast-growing fintech business seeking a Senior Data Engineer to help design, build, and evolve a modern cloud-based data platform. You'll play a key role in architecting and maintaining scalable, reliable data pipelines that support analytics, data science, and business decision-making across the organisation. This is a hands-on role with real ownership over data infrastructure, quality, and delivery. The environment is Google Cloud-led, but the team values engineers who bring broader cloud experience and strong engineering fundamentals.
Key responsibilities
- Design, build, and maintain scalable data pipelines for ingestion, transformation, and distribution of large datasets
- Develop and evolve a robust cloud-based data architecture that supports analytical and operational use cases
- Partner closely with data scientists, analysts, and product teams to ensure data is reliable, accessible, and analytics-ready
- Implement best practices around data quality, testing, reliability, and performance
- Optimise data workflows for efficiency, scalability, and cost
- Ensure data security, governance, and compliance with relevant data protection standards
- Contribute to technical standards, architecture decisions, and continuous improvement of the data platform
Required experience
- Strong commercial experience as a Data Engineer, operating at a senior level (5+ years)
- Deep hands-on experience with Google Cloud Platform (GCP) and its data services (e.g. BigQuery, Dataflow, Cloud Storage, Composer / Orchestration tools)
- Solid programming skills in Python and SQL
- Experience designing and maintaining data warehouses, data models, and ETL / ELT pipelines
- Exposure to orchestration, batch and/or streaming data processing patterns
- Experience working in cloud environments beyond GCP (e.g. AWS or Azure)
- Strong understanding of data quality, reliability, and scalable system design
- Comfortable working with multiple stakeholders in a fast-moving product or fintech environment
- Experience in regulated or data-heavy industries (e.g. fintech, payments, financial services)
- Exposure to data governance frameworks and compliance requirements
- Cloud or data engineering certifications (particularly GCP)
Location & working pattern
- London-based
- Hybrid working - typically 2-3 days per week in the office
Data Engineer (GCP) in City of London
Employer: Xcede
Contact: Xcede Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer (GCP) in City of London
✨Tip Number 1
Network like a pro! Reach out to your connections in the fintech space and let them know you're on the hunt for a Senior Data Engineer role. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving GCP and data pipelines. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and scenarios. Practice explaining your past projects and how you've tackled challenges in data quality and pipeline optimisation.
✨Tip Number 4
Don't forget to apply through our website! We love seeing candidates who are genuinely interested in joining our team. Plus, it makes it easier for us to keep track of your application and get back to you quickly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with GCP and any relevant projects you've worked on. We want to see how your skills align with what we're looking for!
Showcase Your Projects: Include specific examples of data pipelines or architectures you've designed. We love seeing hands-on experience, so don’t hold back on sharing your achievements in data engineering!
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points where possible to make it easy for us to read through your experience and skills quickly.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, we can’t wait to hear from you!
How to prepare for a job interview at Xcede
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially the data services like BigQuery and Dataflow. Be ready to discuss how you've used these tools in past projects and how they can be applied to build scalable data pipelines.
✨Showcase Your Programming Skills
Prepare to demonstrate your Python and SQL skills during the interview. You might be asked to solve a problem or explain your approach to building ETL/ELT pipelines, so have some examples ready that highlight your technical prowess.
✨Understand the Business Context
Familiarise yourself with the fintech industry and the specific challenges it faces regarding data management. Being able to discuss how your work as a Data Engineer can directly impact business decision-making will show that you understand the bigger picture.
✨Emphasise Collaboration
Since you'll be partnering with data scientists and product teams, be prepared to talk about your experience working in cross-functional teams. Highlight any instances where your collaboration led to improved data quality or project outcomes.