At a Glance
- Tasks: Design and build scalable data pipelines on Google Cloud Platform.
- Company: Join a forward-thinking tech company focused on data innovation.
- Benefits: Competitive pay, remote work, and opportunities for professional growth.
- Other info: Dynamic role with a focus on modern data architectures and automation.
- Why this job: Make an impact by optimising data workflows and collaborating with talented teams.
- Qualifications: Experience with ETL/ELT pipelines and Google Cloud Dataflow is essential.
The predicted salary is between £46,800 and £65,000 per year.
6 Month Contract Inside IR35
Up to £650 per day
Remote
We’re hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for someone with a self-starter mindset who enjoys working with modern data architectures and open-source frameworks.
What you’ll be doing:
- Designing and building ETL/ELT pipelines
- Developing scalable data workflows on GCP, with a strong focus on Google Cloud Dataflow (a minimal pipeline sketch follows this list)
- Implementing robust data ingestion frameworks using batch and streaming approaches
- Working with structured and semi-structured data
- Collaborating with Data Modelling & Analytics teams
- Driving data reliability, monitoring, and observability
- Automating deployments and workflows
- Contributing to tooling and framework decisions
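To give a flavour of the kind of work described above, here is a minimal sketch of a batch pipeline written with Apache Beam's Python SDK, the framework Dataflow pipelines are typically built on. The bucket paths and the simple word-count transform are illustrative assumptions only, not details taken from this brief.

```python
# Minimal Apache Beam pipeline sketch (illustrative only).
# Paths and transforms are hypothetical examples, not part of the job spec.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Runs on the local DirectRunner by default; the same code can run on
    # Google Cloud Dataflow by passing --runner=DataflowRunner plus project,
    # region, and staging options on the command line.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")   # hypothetical path
            | "Split" >> beam.FlatMap(lambda line: line.split())                  # one element per word
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)                                  # aggregate per key
            | "Format" >> beam.MapTuple(lambda word, count: f"{word},{count}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts") # hypothetical path
        )


if __name__ == "__main__":
    run()
```

Run locally as `python pipeline.py`, or on Dataflow with `--runner=DataflowRunner --project=<project> --region=<region> --temp_location=gs://<bucket>/tmp`.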
What we’re looking for:
- Strong ETL/ELT pipeline experience
- Proven GCP data services expertise, including hands-on experience with Google Cloud Dataflow (essential)
- Strong SQL and data transformation skills
- Experience with orchestration and pipeline automation
- Background in modern data architectures (lakehouse/warehouse)
- Proactive, ownership-driven mindset
Nice to have:
- Data Vault 2.0 exposure
- BigQuery optimisation experience
- Open-source data framework experience
- CI/CD for data pipelines
GCP Data Engineer - Dataflow in Bolton - Employer: Norton Blake
Contact Details:
Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer - Dataflow role in Bolton
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with GCP. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your ETL/ELT pipelines and any projects you've done on Google Cloud Dataflow. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and data transformation skills. Be ready to discuss your experience with modern data architectures and how you've tackled challenges in previous roles.
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals like you, and applying directly can sometimes give you an edge over other candidates.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with GCP and Dataflow. We want to see how your skills match the job description, so don’t be shy about showcasing your ETL/ELT pipeline expertise!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for this role. Share specific examples of your work with data pipelines and how you've driven data reliability in past projects.
Show Off Your Projects: If you’ve worked on any relevant projects, make sure to mention them! We love seeing real-world applications of your skills, especially if they involve modern data architectures or open-source frameworks.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Norton Blake
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge, especially around Dataflow. Be ready to discuss how you've designed and built ETL/ELT pipelines in the past, and have specific examples at hand to showcase your expertise.
✨Showcase Your SQL Skills
Since strong SQL skills are a must-have for this role, prepare to demonstrate your data transformation abilities. You might be asked to solve a problem on the spot, so practice common SQL queries and be ready to explain your thought process.
✨Discuss Modern Data Architectures
Familiarise yourself with modern data architectures like lakehouses and warehouses. Be prepared to talk about your experience with these approaches and how they can be applied to improve data workflows and reliability.
✨Emphasise Your Proactive Mindset
This role requires a self-starter attitude, so highlight instances where you've taken ownership of projects or driven improvements. Share examples of how you've automated deployments or contributed to tooling decisions in previous roles.