At a Glance
- Tasks: Design and build scalable data pipelines on Google Cloud Platform.
- Company: Globally established organisation with a focus on cloud data solutions.
- Benefits: Up to £85,000 salary, bonus, hybrid work, and comprehensive benefits.
- Other info: Opportunity for career growth in a dynamic, innovative environment.
- Why this job: Join a team transforming data architecture and make a real impact.
- Qualifications: Experience in GCP, BigQuery, Dataflow, and strong programming skills.
The predicted salary is £85,000 per year.
A globally established organisation is seeking an experienced GCP Data Engineer to help build modern, scalable cloud data platforms within a complex enterprise environment. This GCP Data Engineer role will focus on designing data architectures on Google Cloud Platform, building high-performance pipelines, and enabling reliable, secure and governed data solutions that support business growth and decision-making.
Responsibilities for the GCP Data Engineer:
- Design and build scalable data pipelines, ETL/ELT workflows and cloud data architectures on GCP
- Develop solutions using services such as BigQuery, Dataflow, Spanner and Cloud Storage
- Build and maintain code using Python, Java or Scala for data transformation and processing
- Optimise data pipelines, queries and workloads for performance and scalability
- Implement data quality, validation and governance controls
- Collaborate with cross-functional teams to deliver end-to-end data solutions
- Ensure alignment with security, privacy and compliance standards
- Troubleshoot issues across pipelines and processing workflows
- Maintain documentation across data flows, platforms and architectures
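To give a flavour of the responsibilities above, here is a minimal, illustrative sketch of a transform step with a basic data-quality gate of the kind ETL/ELT work involves. The field names, schema, and validation rules are hypothetical examples, not details taken from the role itself.

```python
from datetime import datetime

# Illustrative only: field names and rules are hypothetical,
# not taken from the job description.

def transform(record: dict) -> dict:
    """Normalise a raw event record into a target schema."""
    return {
        "user_id": str(record["user_id"]).strip(),
        "amount_gbp": round(float(record["amount"]), 2),
        "event_date": datetime.strptime(record["date"], "%Y-%m-%d").date(),
    }

def validate(record: dict) -> bool:
    """Basic data-quality gate: required field present, amount non-negative."""
    return bool(record["user_id"]) and record["amount_gbp"] >= 0

def run_pipeline(raw_records):
    """Transform all records, routing failures to a rejects list."""
    clean, rejected = [], []
    for raw in raw_records:
        row = transform(raw)
        (clean if validate(row) else rejected).append(row)
    return clean, rejected

raw = [
    {"user_id": " u1 ", "amount": "12.5", "date": "2024-05-01"},
    {"user_id": "", "amount": "3.0", "date": "2024-05-02"},  # fails validation
]
clean, rejected = run_pipeline(raw)
```

In a real GCP pipeline this logic would typically live inside an Apache Beam transform running on Dataflow, with rejects written to a dead-letter table rather than an in-memory list.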
Essential Skills for the GCP Data Engineer:
- Strong experience as a Data Engineer within GCP environments
- Hands-on experience with BigQuery, Dataflow and Spanner
- Strong programming capability in Python, Java or Scala
- Experience with Apache Beam / Dataflow and distributed processing frameworks
- Experience designing and managing ETL / ELT pipelines
- Solid understanding of data modelling and database design
- Experience with workflow orchestration tools such as Apache Airflow
- Strong understanding of data governance, security and scalability
Desirable Skills for the GCP Data Engineer:
- Experience with Pub/Sub, Cloud Composer or Cloud Data Fusion
- Exposure to CI/CD, DevOps and Infrastructure as Code
- Experience with real-time streaming architectures
- Knowledge of modern data governance frameworks
If you are an experienced GCP Data Engineer looking to build enterprise-scale data platforms using modern Google Cloud technologies, this role offers strong exposure to complex delivery and cloud transformation.
GCP Data Engineer - up to £85,000 + Bonus - Hybrid/London
Employer: Involved Solutions
Contact: Involved Solutions Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer - up to £85,000 + Bonus - Hybrid/London role
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who’s already in the GCP space. You never know who might have a lead on your dream job!
✨Show Off Your Skills
Create a portfolio showcasing your projects, especially those involving BigQuery or Dataflow. Having tangible examples of your work can really set you apart when chatting with potential employers.
✨Ace the Interview
Prepare for technical interviews by brushing up on your Python, Java, or Scala skills. Practice common data engineering problems and be ready to discuss your experience with ETL/ELT workflows and cloud architectures.
✨Apply Through Us!
Don’t forget to check out our website for the latest GCP Data Engineer roles. Applying directly through us not only gives you access to exclusive opportunities but also helps us support you throughout the process!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with GCP, BigQuery, and Dataflow. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how you can contribute to our team. Keep it concise but impactful – we love a good story!
Showcase Your Technical Skills: When filling out your application, make sure to mention your programming skills in Python, Java, or Scala. We’re keen on seeing how you’ve used these languages in real-world scenarios, especially in building ETL/ELT pipelines.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s super easy, and you’ll be one step closer to joining our awesome team at StudySmarter!
How to prepare for a job interview at Involved Solutions
✨Know Your GCP Inside Out
Make sure you brush up on your knowledge of Google Cloud Platform, especially BigQuery and Dataflow. Be ready to discuss how you've used these tools in past projects, as well as any challenges you faced and how you overcame them.
✨Showcase Your Coding Skills
Since programming is a big part of this role, be prepared to demonstrate your coding abilities in Python, Java, or Scala. You might be asked to solve a problem on the spot, so practice coding challenges beforehand to boost your confidence.
✨Understand Data Governance
Familiarise yourself with data governance principles and be ready to discuss how you've implemented data quality and validation controls in previous roles. This will show that you understand the importance of security and compliance in data engineering.
✨Collaborate and Communicate
This role involves working with cross-functional teams, so be prepared to talk about your experience collaborating with others. Share examples of how you’ve effectively communicated technical concepts to non-technical stakeholders to ensure everyone is on the same page.