At a Glance
- Tasks: Design and build scalable data pipelines on Google Cloud Platform.
- Company: Join a forward-thinking tech company focused on data innovation.
- Benefits: Competitive daily rate, remote work, and a dynamic contract role.
- Other info: Exciting opportunity for career growth in a fast-paced environment.
- Why this job: Make an impact by optimising data workflows and collaborating with talented teams.
- Qualifications: Experience with ETL/ELT pipelines and Google Cloud Dataflow is essential.
The predicted salary is between £46,800 and £65,000 per year.
6 Month Contract Inside IR35
Up to £650 per day
Remote
We’re hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for someone with a self-starter mindset who enjoys working with modern data architectures and open-source frameworks.
What you’ll be doing:
- Designing and building ETL/ELT pipelines
- Developing scalable data workflows on GCP, with a strong focus on Google Cloud Dataflow
- Implementing robust data ingestion frameworks using batch and streaming approaches
- Working with structured and semi-structured data
- Collaborating with Data Modelling & Analytics teams
- Driving data reliability, monitoring, and observability
- Automating deployments and workflows
- Contributing to tooling and framework decisions
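As a rough illustration of the batch side of these responsibilities, here is a minimal, framework-free sketch of an extract-transform-load step in Python. The field names (`user_id`, `amount`) and the in-memory sink are hypothetical stand-ins; a production pipeline on GCP would typically use Apache Beam running on Dataflow, with BigQuery or Cloud Storage as the sink:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dictionaries (the 'E' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Clean and reshape rows: drop records missing a user_id and
    normalise amounts to integer pence (the 'T' step)."""
    out = []
    for row in rows:
        if not row.get("user_id"):
            continue  # basic data-quality gate
        out.append({
            "user_id": row["user_id"],
            "amount_pence": int(round(float(row["amount"]) * 100)),
        })
    return out

def load(rows: list[dict], sink: list) -> None:
    """Append transformed rows to a sink (a stand-in here for a
    warehouse table such as BigQuery)."""
    sink.extend(rows)

raw = "user_id,amount\nu1,12.50\n,3.00\nu2,0.99\n"
sink: list = []
load(transform(extract(raw)), sink)
print(sink)
# The row with a missing user_id is dropped; amounts become pence.
```

The same extract/transform/load shape maps directly onto a Beam pipeline's read, `ParDo`, and write stages when the work moves to Dataflow.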
What we’re looking for:
- Strong ETL/ELT pipeline experience
- Proven GCP data services expertise, including hands-on experience with Google Cloud Dataflow (essential)
- Strong SQL and data transformation skills
- Experience with orchestration and pipeline automation
- Background in modern data architectures (lakehouse/warehouse)
- Proactive, ownership-driven mindset
Nice to have:
- Data Vault 2.0 exposure
- BigQuery optimisation experience
- Open-source data framework experience
- CI/CD for data pipelines
Locations
GCP Data Engineer - Dataflow in Warrington, Cheshire (employer: Norton Blake)
Contact Detail:
Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer - Dataflow role in Warrington, Cheshire
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with GCP. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your ETL/ELT pipelines and any projects you've done on Google Cloud Dataflow. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and data transformation skills. Be ready to discuss your experience with modern data architectures and how you've tackled challenges in previous roles.
✨Tip Number 4
Don't forget to apply through our website! We love seeing candidates who are proactive and take the initiative. Plus, it makes it easier for us to keep track of your application and get back to you quickly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with GCP and Dataflow. We want to see how your skills match the job description, so don’t be shy about showcasing your ETL/ELT pipeline expertise!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for this role. Share specific examples of your work with data pipelines and how you've driven data reliability in past projects.
Show Off Your Projects: If you’ve worked on any relevant projects, make sure to mention them! We love seeing real-world applications of your skills, especially if they involve modern data architectures or open-source frameworks.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Norton Blake
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially around Dataflow. Be ready to discuss how you've designed and built ETL/ELT pipelines in the past, and have specific examples at hand to showcase your expertise.
✨Showcase Your SQL Skills
Since strong SQL and data transformation skills are crucial for this role, prepare to demonstrate your proficiency. You might be asked to solve a problem or optimise a query during the interview, so practice common SQL challenges beforehand.
✨Highlight Your Proactive Mindset
This role requires a self-starter attitude, so be prepared to share instances where you've taken ownership of projects. Discuss how you've driven data reliability and observability in previous roles, as this will resonate well with the interviewers.
✨Familiarise Yourself with Modern Data Architectures
Understanding modern data architectures like lakehouses and warehouses is key. Be ready to talk about your experience with these frameworks and how they relate to the role. If you have any exposure to Data Vault 2.0 or BigQuery optimisation, make sure to mention that too!