At a Glance
- Tasks: Design and build scalable data pipelines on Google Cloud Platform.
- Company: Join a forward-thinking tech company focused on modern data solutions.
- Benefits: Competitive daily rate, remote work, and opportunities for professional growth.
- Why this job: Be at the forefront of data engineering and make a real impact.
- Qualifications: Experience with ETL/ELT pipelines and GCP data services required.
- Other info: Dynamic role with a focus on innovation and collaboration.
The predicted salary is between £45,000 and £60,000 per year.
We’re hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for someone with a self-starter mindset who enjoys working with modern data architectures and open-source frameworks.
What you’ll be doing:
- Designing and building ETL/ELT pipelines
- Developing scalable data workflows on GCP
- Implementing robust data ingestion frameworks
- Working with structured and semi-structured data
- Collaborating with Data Modelling & Analytics teams
- Driving data reliability, monitoring, and observability
- Automating deployments and workflows
- Contributing to tooling and framework decisions
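To give a flavour of the day-to-day work described above, here is a minimal ETL sketch in Python. The records, field names, and transformation rules are hypothetical, and in a real GCP pipeline the load step would write to a managed service such as BigQuery rather than an in-memory list.

```python
# Minimal ETL sketch: extract semi-structured records, normalise them,
# then "load" the result. All field names and rules are hypothetical;
# a production pipeline on GCP would load into e.g. BigQuery instead.

def extract():
    # Stand-in for reading from an ingestion source (files, Pub/Sub, an API).
    return [
        {"id": 1, "user": {"name": "Ada", "country": "UK"}, "amount": "12.50"},
        {"id": 2, "user": {"name": "Grace"}, "amount": "7.00"},
    ]

def transform(records):
    # Flatten nested fields, coerce types, and default missing values --
    # typical handling for semi-structured input.
    return [
        {
            "id": r["id"],
            "name": r["user"]["name"],
            "country": r["user"].get("country", "unknown"),
            "amount": float(r["amount"]),
        }
        for r in records
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns the row count,
    # which a real pipeline might surface as a monitoring metric.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping extract, transform, and load as separate steps like this is what makes a pipeline easy to test, monitor, and orchestrate.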
What we’re looking for:
- Strong ETL/ELT pipeline experience
- Proven GCP data services expertise
- Strong SQL and data transformation skills
- Experience with orchestration and pipeline automation
- Background in modern data architectures (lakehouse/warehouse)
- Proactive, ownership-driven mindset
Nice to have:
- Data Vault 2.0 exposure
- BigQuery optimisation experience
- Open-source data framework experience
- CI/CD for data pipelines
If you’re interested in building scalable, modern data platforms, feel free to reach out or apply.
Employer for GCP Data Engineer in Lincoln: Norton Blake
Contact details: Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer role in Lincoln
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with GCP. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your ETL/ELT projects and any GCP work you've done. This gives potential employers a taste of what you can bring to the table.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and data transformation skills. Be ready to discuss your experience with modern data architectures and how you've tackled challenges in past projects.
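As a warm-up for that kind of SQL discussion, here is a small, self-contained practice example using Python's built-in sqlite3 module as a stand-in for BigQuery. The table and query are invented interview-style practice: ranking each user's orders by amount with a window function, a common transformation pattern.

```python
# Hypothetical SQL practice: rank each user's orders by amount using a
# window function. sqlite3 is used here only as a convenient local
# stand-in for a warehouse such as BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (user_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10.0), (1, 25.0), (2, 5.0);
""")

rows = conn.execute("""
    SELECT user_id, amount,
           RANK() OVER (PARTITION BY user_id ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY user_id, rnk
""").fetchall()
# rows now holds each user's orders, largest first within each user.
```

Being able to explain why a window function beats a self-join here is exactly the kind of reasoning interviewers tend to probe.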
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search!
We think you need these skills to ace the GCP Data Engineer role in Lincoln
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with ETL/ELT pipelines and GCP data services. We want to see how your skills align with the role, so don’t be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Tell us why you’re passionate about building modern data platforms and how your proactive mindset makes you a great fit for our team.
Showcase Your Technical Skills: Don’t forget to mention your SQL prowess and any experience with orchestration tools. We love seeing candidates who can demonstrate their technical expertise in data transformation and automation.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity!
How to prepare for a job interview at Norton Blake
✨Know Your GCP Inside Out
Make sure you brush up on your Google Cloud Platform knowledge. Familiarise yourself with the specific data services and tools mentioned in the job description, like BigQuery and ETL/ELT processes. Being able to discuss these confidently will show that you're not just a candidate, but a potential asset.
✨Showcase Your Pipeline Experience
Prepare to discuss your past experiences with designing and building data pipelines. Have specific examples ready that highlight your skills in automation and orchestration. This is your chance to demonstrate how you've tackled challenges and optimised workflows in previous roles.
✨Emphasise Collaboration Skills
Since the role involves working closely with Data Modelling & Analytics teams, be ready to talk about your collaborative experiences. Share examples of how you've worked with cross-functional teams to drive data reliability and observability. This will illustrate your ability to fit into their team dynamic.
✨Ask Insightful Questions
Prepare some thoughtful questions about the company's data architecture and future projects. This shows your genuine interest in the role and helps you gauge if the company aligns with your career goals. Plus, it gives you a chance to engage in a meaningful conversation during the interview.