At a Glance
- Tasks: Design and build scalable data pipelines on Google Cloud Platform.
- Company: Join a forward-thinking tech company focused on data innovation.
- Benefits: Competitive pay, remote work, and opportunities for professional growth.
- Other info: Dynamic role with a focus on modern data architectures and automation.
- Why this job: Make an impact by optimising data workflows and collaborating with talented teams.
- Qualifications: Experience with ETL/ELT pipelines and Google Cloud Dataflow is essential.
The predicted salary is between £46,800 and £65,000 per year.
6-Month Contract, Inside IR35
Up to £650 per day
Remote
We’re hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for someone with a self-starter mindset who enjoys working with modern data architectures and open-source frameworks.
What you’ll be doing:
- Designing and building ETL/ELT pipelines
- Developing scalable data workflows on GCP, with a strong focus on Google Cloud Dataflow
- Implementing robust data ingestion frameworks using batch and streaming approaches
- Working with structured and semi-structured data
- Collaborating with Data Modelling & Analytics teams
- Driving data reliability, monitoring, and observability
- Automating deployments and workflows
- Contributing to tooling and framework decisions
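To give a flavour of the kind of work described above, here is a minimal sketch of a batch ETL pipeline written with Apache Beam, the open-source framework that Google Cloud Dataflow runs. It is an illustration only, not part of the role specification, and the bucket, project, dataset and table names are placeholders.

```python
# Minimal batch ETL sketch: read JSON lines from Cloud Storage, clean them,
# and append them to a BigQuery table. Resource names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_record(line):
    """Parse one JSON line into a dict."""
    return json.loads(line)


def run():
    # Locally this uses the default DirectRunner; on GCP you would pass
    # --runner=DataflowRunner plus project/region/temp_location options.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/*.json")
            | "Parse" >> beam.Map(parse_record)
            | "KeepValid" >> beam.Filter(lambda record: record.get("id") is not None)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

A streaming variant of the same pipeline would typically swap the text source for a Pub/Sub subscription and add windowing, which is the batch-and-streaming ingestion pattern mentioned above.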
What we’re looking for:
- Strong ETL/ELT pipeline experience
- Proven GCP data services expertise, including hands-on experience with Google Cloud Dataflow (essential)
- Strong SQL and data transformation skills
- Experience with orchestration and pipeline automation
- Background in modern data architectures (lakehouse/warehouse)
- Proactive, ownership-driven mindset
Nice to have:
- Data Vault 2.0 exposure
- BigQuery optimisation experience
- Open-source data framework experience
- CI/CD for data pipelines
GCP Data Engineer - Dataflow in Maidstone. Employer: Norton Blake
Contact Detail:
Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer - Dataflow role in Maidstone
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at meetups. We all know that sometimes it’s not just what you know, but who you know that can land you that GCP Data Engineer gig.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your ETL/ELT pipelines and any projects you've done with Google Cloud Dataflow. We love seeing real-world applications of your expertise, so make sure to highlight those!
✨Tip Number 3
Prepare for the interview by brushing up on your SQL and data transformation skills. We recommend practising common data engineering scenarios and being ready to discuss how you’ve tackled challenges in past projects.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we’re always on the lookout for proactive, ownership-driven candidates like you!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with ETL/ELT pipelines and Google Cloud Dataflow. We want to see how your skills match the job description, so don’t be shy about showcasing relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for this role. Mention your hands-on experience with GCP and how you’ve tackled data challenges in the past.
Showcase Your Projects: If you’ve worked on any cool data projects, make sure to mention them! We love seeing real-world applications of your skills, especially if they involve modern data architectures or open-source frameworks.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Norton Blake
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially around Dataflow. Be ready to discuss how you've designed and built ETL/ELT pipelines in the past, and have specific examples at hand to showcase your expertise.
✨Showcase Your SQL Skills
Since strong SQL skills are a must for this role, prepare to demonstrate your data transformation abilities. You might be asked to solve a problem on the spot, so practice common SQL queries and be ready to explain your thought process.
✨Understand Modern Data Architectures
Familiarise yourself with modern data architectures like lakehouses and warehouses. Be prepared to discuss how these concepts apply to the role and how you've worked with structured and semi-structured data in previous projects.
✨Emphasise Your Proactive Mindset
This role requires a self-starter attitude, so be ready to share examples of how you've taken ownership of projects. Highlight instances where you've driven data reliability or contributed to tooling decisions, as this will show you're the right fit for their team.