GCP Data Architect in London

London · Part-Time · £60,000 - £84,000 / year (est.) · Home office (partial)

At a Glance

  • Tasks: Design and implement cutting-edge data lake solutions for major banking clients.
  • Company: Join Neurons Lab, a leader in innovative tech solutions for financial institutions.
  • Benefits: Flexible part-time role with opportunities for professional growth and skill development.
  • Why this job: Make a real impact in the banking sector while working with advanced GCP technologies.
  • Qualifications: Experience in data architecture and governance, especially in banking domains.
  • Other info: Collaborative environment with a focus on innovation and client success.

The predicted salary is between £60,000 and £84,000 per year.

Join Neurons Lab as a Senior GCP Data Architect working on banking data lake and reporting systems for large financial institutions. This end-to-end role starts with presales and architecture (gathering requirements, designing solutions, establishing governance frameworks) and progresses to implementing your designs through to MVP delivery.

About The Project

We architect data lake solutions for critical use cases such as AML reporting, KYC data management, and regulatory compliance, ensuring robust data governance, metadata management, and data quality frameworks.

Duration & Reporting

Part-time, long-term engagement with project-based allocations, reporting directly to the Head of Cloud.

Objective

  • Architecture Excellence: Design data lake architectures, create technical specifications, lead requirements gathering and solution workshops.
  • MVP Implementation: Build your designs: implement data pipelines, deploy governance frameworks, and deliver working MVPs with built-in data quality.
  • Data Governance: Establish and implement comprehensive governance frameworks including metadata management, data cataloging, data lineage, and data quality standards.
  • Client Success: Own the full lifecycle from requirements to MVP delivery, ensuring secure, compliant, scalable solutions aligned with banking regulations and GCP best practices.
  • Knowledge Transfer: Create reusable architectural patterns, data governance blueprints, implementation code, and comprehensive documentation.

KPIs

  • Design data architectures with comprehensive documentation and a governance framework.
  • Deliver MVP from architecture to working implementation.
  • Establish data governance implementations including metadata catalogs, lineage tracking, and quality monitoring.
  • Achieve 80%+ client acceptance rate on proposed data architectures and technical specifications.
  • Implement data pipelines with data quality checks and comprehensive monitoring.
  • Create reusable architectural patterns and IaC modules for banking data lakes and regulatory reporting systems.
  • Document solutions aligned with banking regulations (Basel III, MAS TRM, AML/KYC requirements).
  • Deliver cost models and ROI calculations for data lake implementations.

Areas of Responsibility

  • Phase 1: Data Architecture & Presales
    • Elicit and document requirements for data lakes, reporting systems, and analytics platforms.
    • Design end-to-end data architectures: ingestion patterns, storage strategies, processing pipelines, consumption layers.
    • Create architecture diagrams, data models (dimensional, data vault), technical specifications, and implementation roadmaps.
    • Data governance design: metadata management frameworks, data cataloging strategies, data lineage implementations, data quality monitoring.
    • Evaluate technology options and recommend the optimal GCP and on-premises data services for specific banking use cases.
    • Calculate ROI, TCO, and cost-benefit analyses for data lake implementations.
    • Banking domain: design solutions for AML reporting, KYC data management, regulatory compliance, and risk reporting.
    • Hybrid cloud architecture: design integration patterns between GCP and on-premise platforms (e.g. Oracle, SQL Server).
    • Security & compliance architecture: IAM, VPC Service Controls, encryption, data residency, audit logging.
    • Participate in presales activities: technical presentations, client workshops, demos, proposal support.
    • Create detailed implementation roadmaps and technical specifications for development teams.
  • Phase 2: MVP Implementation & Delivery
    • Build production data pipelines based on approved architectures.
    • Implement data warehouses: schema creation, partitioning, clustering, optimization, security setup.
    • Deploy data governance frameworks: Data Catalog configuration, metadata tagging, lineage tracking, quality monitoring.
    • Develop data ingestion patterns from on‐premise systems.
    • Implement production-grade data transformations, validation, and business logic.
    • Develop Python applications for data processing automation, quality checks, and orchestration.
    • Build data quality frameworks with validation rules, anomaly detection, and alerting.
    • Create sample dashboards and reports for business stakeholders.
    • Implement CI/CD pipelines for data pipeline deployment using Terraform.
    • Deploy monitoring, logging, and alerting for data pipelines and workloads.
    • Performance tuning and cost optimization for production data workloads.
    • Document implementation details, operational runbooks, and knowledge transfer materials.
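
To give a flavour of the data quality work listed above (validation rules, anomaly detection, alerting), here is a minimal Python sketch using pandas. The column names, thresholds, and the `run_quality_checks` function are hypothetical illustrations, not part of any named framework or the client's actual stack:

```python
import pandas as pd

# Hypothetical validation thresholds for an ingested banking dataset.
MAX_NULL_RATE = 0.01   # critical columns may have at most 1% missing values
Z_THRESHOLD = 3.0      # flag daily volumes more than 3 std devs from the mean

def run_quality_checks(df: pd.DataFrame, critical_cols: list) -> list:
    """Return a list of human-readable data quality alerts (empty = all clear)."""
    alerts = []

    # Rule 1: completeness - critical columns must be (almost) fully populated.
    for col in critical_cols:
        null_rate = df[col].isna().mean()
        if null_rate > MAX_NULL_RATE:
            alerts.append(
                f"{col}: null rate {null_rate:.2%} exceeds {MAX_NULL_RATE:.0%}"
            )

    # Rule 2: simple anomaly detection - z-score on record counts per day.
    daily = df.groupby("ingest_date").size()
    if len(daily) > 1 and daily.std() > 0:
        z = (daily - daily.mean()) / daily.std()
        for day, score in z[z.abs() > Z_THRESHOLD].items():
            alerts.append(f"{day}: daily volume z-score {score:.1f} is anomalous")

    return alerts
```

In a production setting, checks like these would typically be wired into the pipeline orchestrator (e.g. Composer) and feed the alerting and monitoring stack rather than returning a plain list.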

    Skills & Knowledge

    • Certifications & Core Platform:
      • GCP Professional Cloud Architect (strong plus, not mandatory) or GCP Professional Data Engineer (alternative certification).
      • Core GCP data services: BigQuery, Dataflow, Pub/Sub, Data Catalog, Dataplex, Dataform, Composer, Cloud Storage, Data Fusion.
    • Must-Have Technical Skills:
      • Data architecture (expert level): data lakes, lakehouses, data warehouses, modern data architectures.
      • Data governance (expert level): metadata management, data cataloging, data lineage, data quality frameworks, hands-on implementation.
      • SQL (advanced-expert level): production-grade queries, complex transformations, window functions, CTEs, query optimization, performance tuning.
      • Data modeling (expert level): dimensional modeling, data vault, entity-relationship, schema design patterns for banking systems.
      • ETL/ELT implementation (advanced level): production data pipelines using Dataflow (Apache Beam), Dataform, Composer, orchestration.
      • Python (advanced level): production data applications, pandas/numpy for data processing, automation, scripting, testing.
      • Data quality (advanced level): validation frameworks, monitoring strategies, anomaly detection, automated testing.
    • BFSI Domain Knowledge (MANDATORY):
      • Banking data domains: AML, KYC, regulatory reporting, risk management.
      • Financial regulations: Basel III, MAS TRM, PCI-DSS, GDPR.
      • Understanding of banking data flows, reporting requirements, and compliance frameworks.
      • Experience with banking data models and financial services data architecture.
    • Strong Plus:
      • On-premise data platforms: Oracle, SQL Server, Teradata.
      • Data quality tools: Great Expectations, Soda, dbt tests, custom validation frameworks.
      • Visualization tools: Looker, Looker Studio, Tableau, Power BI.
      • Infrastructure as Code: Terraform for GCP data services.
      • Streaming data processing: Pub/Sub, Dataflow streaming, Kafka integration.
      • Vector databases and search: Vertex AI Vector Search, Elasticsearch (for GenAI use cases).
    • Communication:
      • Advanced English (written and verbal).
      • Client-facing presentations, workshops, and requirement gathering sessions.
      • Technical documentation and architecture artifacts (diagrams, specifications, data models).
      • Stakeholder management and cross-functional collaboration.

    Experience

    • 7+ years in data architecture, data engineering, or solution architecture roles.
    • 4+ years hands‐on with GCP data services (BigQuery, Dataflow, Data Catalog, Dataplex) – production implementations.
    • 3+ years in data governance (MANDATORY) – metadata management, data lineage, data quality frameworks, data cataloging.
    • 3+ years in BFSI/Banking domain (MANDATORY) – AML, KYC, regulatory reporting, compliance requirements.
    • 5+ years with SQL and relational databases – complex query writing, optimization, performance tuning.
    • 3+ years in data modeling – dimensional modeling, data vault, or other data warehouse methodologies.
    • 2+ years in presales/architecture roles – requirements gathering, solution design, client presentations.
    • Experience with on‐premise data platforms (MANDATORY) – Teradata, Oracle, SQL Server integration with cloud.

    Seniority Level: Mid‐Senior level

    Employment Type: Part‐time

    Job Function: Engineering and Information Technology

    Industries: IT Services and IT Consulting

    GCP Data Architect in London employer: Neurons Lab

    At Neurons Lab, we pride ourselves on being an exceptional employer, offering a dynamic work culture that fosters innovation and collaboration. As a Senior GCP Data Architect, you will have the opportunity to work on impactful projects within the banking sector, with a focus on data governance and compliance, while enjoying flexible part-time engagement. Our commitment to employee growth is evident through continuous learning opportunities and a supportive environment that encourages knowledge sharing and professional development.

    Contact Detail:

    Neurons Lab Recruiting Team

    StudySmarter Expert Advice 🤫

    We think this is how you could land the GCP Data Architect role in London

    ✨Tip Number 1

    Network like a pro! Get out there and connect with folks in the banking and data architecture space. Attend meetups, webinars, or even just grab a coffee with someone who’s already in the field. You never know where a casual chat might lead!

    ✨Tip Number 2

    Show off your skills! Create a portfolio showcasing your past projects, especially those related to GCP and data governance. Having tangible examples of your work can really set you apart when you're chatting with potential employers.

    ✨Tip Number 3

    Don’t be shy about reaching out directly! If you see a company you’re keen on, drop them a message through our website. Express your interest in the GCP Data Architect role and share how your experience aligns with their needs.

    ✨Tip Number 4

    Prepare for interviews by brushing up on your technical knowledge and soft skills. Be ready to discuss your approach to data governance and architecture design. Practising common interview questions can help you feel more confident when it’s time to shine!

    We think you need these skills to ace the GCP Data Architect role in London

    Data Architecture
    Data Governance
    SQL
    Data Modeling
    ETL/ELT Implementation
    Python
    Data Quality
    Banking Domain Knowledge
    Financial Regulations
    GCP Data Services
    Infrastructure as Code
    Communication Skills
    Stakeholder Management
    Technical Documentation

    Some tips for your application 🫡

    Tailor Your Application: Make sure to customise your CV and cover letter for the GCP Data Architect role. Highlight your experience with data lakes, governance frameworks, and any relevant banking domain knowledge. We want to see how your skills align with what we're looking for!

    Showcase Your Technical Skills: Don’t hold back on showcasing your technical expertise! Mention your experience with GCP services like BigQuery and Dataflow, as well as your proficiency in SQL and Python. We love seeing candidates who can demonstrate their hands-on experience.

    Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences. We appreciate a well-structured application that gets straight to the good stuff!

    Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows us you’re keen on joining our team at StudySmarter!

    How to prepare for a job interview at Neurons Lab

    ✨Know Your GCP Inside Out

    Make sure you’re well-versed in GCP data services like BigQuery, Dataflow, and Data Catalog. Brush up on your knowledge of how these tools can be applied to banking data lakes and regulatory compliance. Being able to discuss specific use cases will show your expertise and understanding of the role.

    ✨Showcase Your Data Governance Skills

    Prepare to discuss your experience with data governance frameworks, metadata management, and data quality standards. Have examples ready that demonstrate how you've implemented these in past projects, especially in the BFSI domain. This will highlight your ability to meet the specific needs of financial institutions.

    ✨Be Ready for Technical Challenges

    Expect technical questions or case studies during the interview. Practice explaining your thought process when designing data architectures or implementing data pipelines. Use real-world scenarios to illustrate your problem-solving skills and how you approach complex data challenges.

    ✨Communicate Clearly and Confidently

    Since this role involves client-facing presentations and workshops, practice articulating your ideas clearly. Be prepared to explain technical concepts in a way that non-technical stakeholders can understand. Good communication can set you apart from other candidates.
