At a Glance
- Tasks: Design and optimise automated data pipelines for Google Cloud Platform.
- Company: Leading tech firm in cloud solutions, fostering innovation and collaboration.
- Benefits: Competitive pay, flexible hybrid work, and opportunities for professional growth.
- Why this job: Join a dynamic team and shape the future of data architecture in the cloud.
- Qualifications: Experience with GCP, BigQuery, and strong data engineering skills required.
- Other info: Exciting projects with potential for career advancement in a fast-paced environment.
The predicted salary is between £60,000 and £80,000 per year.
Location: Leeds/Halifax, UK (Hybrid)
Employment type: Contract
Role Summary
We are seeking an experienced Cloud Data Loading Architect to design, build, and optimise automated pipelines that ingest structured, semi-structured, and unstructured datasets into Google Cloud Platform (GCP), specifically BigQuery. This role will lead end-to-end data ingestion design—from source discovery and schema mapping, through transformation and data quality, to scalable, secure loads into cloud-native analytical warehouses. The ideal candidate combines strong cloud engineering skills with hands-on data integration experience and a deep understanding of BigQuery performance optimisation.
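To give a flavour of the BigQuery performance optimisation the role involves, here is a minimal sketch of generating a partitioned, clustered table definition. The helper function, table name, and columns are illustrative assumptions, not part of the job specification.

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with date partitioning and clustering."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n  {col_defs}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table: partition on the event timestamp, cluster on user_id
ddl = partitioned_table_ddl(
    "analytics.events",
    [("event_ts", "TIMESTAMP"), ("user_id", "STRING"), ("payload", "JSON")],
    partition_col="event_ts",
    cluster_cols=["user_id"],
)
```

Partitioning on ingestion or event time plus clustering on a common filter column is a standard way to prune scanned bytes and control BigQuery query cost.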
Key Responsibilities
- Design and implement high-throughput, fault-tolerant ingestion pipelines for batch and streaming data landing in BigQuery.
- Lead data ingestion architecture patterns using Cloud Storage (GCS), Dataflow, Dataproc, Composer (Airflow), Pub/Sub, BigQuery Storage Write API, and related services.
- Define data loading frameworks, mapping rules, schema evolution strategy, and metadata management.
- Create reusable ingestion blueprints that ensure governance, lineage, and auditability.
- Establish data quality checks, validation rules, reconciliation logic, and SLAs.
- Optimise BigQuery cost, storage, partitioning, clustering, and access patterns.
- Collaborate with security & platform teams to ensure IAM, service accounts, VPC-SC, and encryption policies are fully applied.
- Automate CI/CD deployments for ingestion pipelines using Cloud Build, GitHub Actions, GitLab CI, or Jenkins.
- Produce detailed technical documentation and coach engineering squads in cloud ingestion standards.
- Troubleshoot ingestion failures, performance bottlenecks, and cross-platform data integration issues.
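The data quality and reconciliation responsibilities above could be sketched as a simple post-load check: loaded plus rejected rows should account for every source row within an agreed tolerance. The class and field names here are hypothetical, for illustration only.

```python
from dataclasses import dataclass


@dataclass
class LoadReport:
    """Row counts captured for one ingestion run (illustrative structure)."""
    source_rows: int
    loaded_rows: int
    rejected_rows: int

    def reconciles(self, tolerance: float = 0.0) -> bool:
        """True if loaded + rejected rows match the source within `tolerance`."""
        accounted = self.loaded_rows + self.rejected_rows
        if self.source_rows == 0:
            return accounted == 0
        drift = abs(self.source_rows - accounted) / self.source_rows
        return drift <= tolerance


# A run that accounts for every source row reconciles cleanly
report = LoadReport(source_rows=1_000_000, loaded_rows=999_950, rejected_rows=50)
```

In practice a check like this would run as a final pipeline step, failing the load (and alerting against the SLA) when reconciliation drifts beyond tolerance.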
Top 10 Skills & Qualities (Ideal Candidate)
- Deep Expertise in Google Cloud Data Services: BigQuery, GCS, Dataflow (Apache Beam), Pub/Sub, Dataproc, Cloud Composer, Storage Write API.
- Data Ingestion Engineering Mastery: Hands-on experience designing frameworks to load data from APIs, files, databases, event streams, and mainframe/legacy systems into cloud stores.
- Strong SQL & BigQuery Optimisation Skills: Partitioning, clustering, materialised views, cost-efficient query design, columnar storage understanding.
- ETL/ELT Architecture Knowledge: Experience building transformation pipelines using Airflow, Dataflow, dbt, or equivalent orchestration tools.
- File & Format Proficiency: Ability to work with Parquet, Avro, ORC, JSON, CSV, nested/repeated structures, and schema evolution.
- Strong Python and/or Java Skills: For building Dataflow pipelines, ingestion utilities, and automation scripts.
- Cloud Security & Governance Awareness: IAM roles, least-privilege models, VPC-SC, service accounts, artifact signing, audit logging.
- DevOps & CI/CD Familiarity: Cloud Build, GitHub Actions, Terraform, Cloud Deployment Manager or Pulumi.
- Data Quality & Observability Mindset: Experience implementing validation frameworks, anomaly detection, reconciliation rules, logging/monitoring (e.g., Cloud Logging, Cloud Monitoring).
- Excellent Architectural Communication Skills: Ability to document, diagram, and communicate ingestion patterns to stakeholders at technical and non-technical levels.
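The nested/repeated structures and schema evolution mentioned above can be illustrated with a small schema-inference sketch: mapping one JSON record to BigQuery-style field descriptors. This is a simplified assumption-laden toy, not the actual BigQuery client API.

```python
def infer_bq_schema(record):
    """Infer simplified BigQuery-style field descriptors from one JSON record."""
    fields = []
    for name, value in record.items():
        if isinstance(value, list):
            mode, sample = "REPEATED", (value[0] if value else None)
        else:
            mode, sample = "NULLABLE", value
        if isinstance(sample, dict):
            # Nested objects become RECORD fields with their own sub-schema
            fields.append({"name": name, "type": "RECORD", "mode": mode,
                           "fields": infer_bq_schema(sample)})
        elif isinstance(sample, bool):  # check bool before int (bool subclasses int)
            fields.append({"name": name, "type": "BOOL", "mode": mode})
        elif isinstance(sample, int):
            fields.append({"name": name, "type": "INT64", "mode": mode})
        elif isinstance(sample, float):
            fields.append({"name": name, "type": "FLOAT64", "mode": mode})
        else:
            fields.append({"name": name, "type": "STRING", "mode": mode})
    return fields


schema = infer_bq_schema({"id": 1, "tags": ["a"], "meta": {"ok": True}})
```

A real ingestion framework would sample many records, widen conflicting types, and track schema versions over time; this sketch only shows the core mapping idea.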
Employer: Insight International (UK) Ltd
Contact Details:
Insight International (UK) Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Architect role in Leeds
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at local meetups. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving GCP and BigQuery. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common questions related to data ingestion and cloud architecture. Practise explaining your past projects and how they relate to the role you're applying for.
✨Tip Number 4
Don't forget to apply through our website! We love seeing candidates who are genuinely interested in joining us at StudySmarter. Plus, it makes tracking your application easier for both of us!
We think you need these skills to ace the GCP Data Architect role in Leeds
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the GCP Data Architect role. Highlight your experience with Google Cloud services, especially BigQuery, and any relevant projects you've worked on. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific examples of data ingestion frameworks or pipelines you've designed. Talk about the challenges you faced and how you overcame them. This will help us understand your hands-on experience and problem-solving skills.
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points for key achievements and avoid jargon unless it's relevant. We appreciate straightforward communication that gets to the heart of your experience.
Apply Through Our Website: Don't forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it makes the process smoother for everyone involved.
How to prepare for a job interview at Insight International (UK) Ltd
✨Know Your GCP Inside Out
Make sure you brush up on your knowledge of Google Cloud Platform, especially BigQuery and its related services. Be ready to discuss how you've used these tools in past projects, focusing on specific examples of data ingestion and optimisation.
✨Showcase Your Data Ingestion Mastery
Prepare to talk about your hands-on experience with designing and implementing data ingestion frameworks. Highlight any challenges you've faced and how you overcame them, particularly with batch and streaming data.
✨Demonstrate Your SQL Skills
Be ready to dive deep into SQL and BigQuery optimisation techniques. You might be asked to solve a problem on the spot, so practise explaining your thought process while optimising queries or discussing partitioning and clustering strategies.
✨Communicate Clearly and Confidently
Since this role involves collaborating with various teams, practise explaining complex technical concepts in simple terms. Prepare to share how you've documented and communicated architectural patterns to both technical and non-technical stakeholders.