Senior Data Engineer

Full-Time · £60,000–£80,000 / year (est.) · No home office possible
Gigaclear

At a Glance

  • Tasks: Build and optimise scalable data pipelines on Snowflake for impactful analytics.
  • Company: Join Gigaclear, a pioneering Fibre Broadband company transforming connectivity in the UK.
  • Benefits: Competitive salary, flexible working, and opportunities for professional growth.
  • Other info: Be part of a dynamic team focused on innovation and continuous improvement.
  • Why this job: Make a real difference by enabling data-driven insights and supporting advanced analytics.
  • Qualifications: Experience in cloud-based data engineering, SQL, Python, and Snowflake is essential.

The predicted salary is between £60,000 and £80,000 per year.

As a Senior Data Engineer within the Data Engineering team, you will play a key role in building, enhancing, and maintaining our enterprise data platform on Snowflake. You will develop and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You will translate the data platform strategy into high-quality technical solutions, ensuring our Snowflake environment is reliable, well-structured, and performant. You will champion engineering best practices and contribute to standards that improve the quality, consistency, and usability of data assets. Your work will ensure the business has access to trusted, timely, and well-modelled data to support decision-making, operational reporting, and the foundations for advanced analytics and future AI/ML capabilities.

Key Accountabilities & Responsibilities

  • Snowflake Data Engineering Delivery: Design, build, and maintain high-quality data pipelines and models in Snowflake to support business analytics, BI, and operational reporting needs.
  • Data Architecture Implementation: Translate the defined data architecture and standards into implemented solutions, including ingestion, transformation, storage, and performance optimisation.
  • Pipeline Development & Orchestration: Develop robust ELT/ETL pipelines using dbt and workflow/orchestration tools (e.g., Argo Workflows), ensuring reliability, maintainability, and adherence to engineering best practices.
  • Performance & Cost Optimisation: Implement Snowflake warehouse configurations and query optimisation techniques to ensure efficient usage and predictable cost.
  • Data Quality & Governance Execution: Apply data quality checks, lineage tracking, and security standards across the data estate. Ensure compliance with data policies, InfoSec controls, and regulatory requirements as required.
  • Tooling & Feature Adoption: Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation, reduce manual effort, and enhance data accessibility across the business.
  • Collaboration & Support: Work closely with analysts, data consumers, and business stakeholders to support data product delivery, troubleshoot data issues, and enable effective usage of Snowflake datasets.
  • Enablement for Analytics & Data Science: Implement dimensional models that provide clean, well-structured, reusable datasets for reporting, scenario modelling, and emerging ML/AI use cases.
  • Monitoring, Reliability & Operations: Implement and maintain monitoring, alerting, logging, and cost-management processes for Snowflake and data pipelines to ensure a stable and well-maintained platform.
  • Continuous Improvement of Engineering Practices: Contribute to shared engineering standards to simplify development and accelerate delivery across the team.
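In practice the pipeline and quality work described above would live in dbt models running under an orchestrator such as Argo Workflows. Purely as an illustration of the transform-and-validate pattern the list describes, here is a minimal Python sketch (all names, including RAW_ORDERS, are hypothetical):

```python
# Hypothetical staging rows, standing in for records replicated from a
# core business system into a Snowflake staging schema by an ingestion tool.
RAW_ORDERS = [
    {"order_id": 1, "customer": "Acme", "amount_gbp": "120.50"},
    {"order_id": 2, "customer": "Beta", "amount_gbp": "75.00"},
    {"order_id": 3, "customer": None, "amount_gbp": "9.99"},
]

def transform(rows):
    """Cast amounts to numeric and split out rows failing a basic quality rule."""
    clean, rejected = [], []
    for row in rows:
        if row["customer"] is None:      # quality rule: customer must be present
            rejected.append(row)
            continue
        clean.append({**row, "amount_gbp": float(row["amount_gbp"])})
    return clean, rejected

clean, rejected = transform(RAW_ORDERS)
# Reconciliation check: no rows silently lost between input and output.
assert len(clean) + len(rejected) == len(RAW_ORDERS)
```

A real implementation would express the transformation as SQL in a dbt model and the quality rule as a dbt test, with rejected rows routed to a quarantine table rather than a Python list.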

Knowledge & Skills

  • Proven experience in delivering cloud-based data engineering solutions, ideally with Snowflake.
  • Strong hands-on proficiency with SQL, Python, and dbt for data transformations, modelling, and pipeline automation.
  • Practical experience administering Snowflake, including RBAC (role-based access control) management.
  • Experience with data ingestion and replication tools such as Airbyte, Fivetran, Hevo, or similar.
  • Working knowledge of cloud services (AWS preferred).
  • Strong understanding of data modelling and data governance principles.
  • Experience supporting BI/reporting tools (Power BI) and enabling them through well-designed Snowflake data models.
  • Solid knowledge of CI/CD and version-controlled development practices in git.
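Dimensional modelling of the kind listed above typically relies on deterministic surrogate keys derived from natural-key columns. As a rough sketch of the idea (the function name and inputs are hypothetical; in dbt you would normally use `dbt_utils.generate_surrogate_key` instead):

```python
import hashlib

def surrogate_key(*parts):
    """Deterministic surrogate key from natural-key columns.

    None is treated as an empty string so that missing values still hash
    consistently; parts are joined with a delimiter to avoid collisions
    between e.g. ("ab", "c") and ("a", "bc").
    """
    joined = "||".join("" if p is None else str(p) for p in parts)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

# Hypothetical usage: key a customer dimension on source system plus source id.
customer_sk = surrogate_key("salesforce", 42)
```

Because the key is a pure function of the inputs, re-running the pipeline reproduces the same keys, which keeps fact-to-dimension joins stable across loads.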

Desirable

  • Enterprise System Familiarity: Exposure to CRM (Salesforce), BSS/OSS (Netadmin), Call Centre, Telephony, or similar enterprise data sources.
  • Data Migration Experience: Participation in migrating data platforms (e.g., PostgreSQL or other cloud RDBMS) into a data warehouse like Snowflake with minimal disruption and strong data validation controls.
  • Change & Adoption Support: Experience supporting business teams during platform transitions (e.g., training, documentation, user onboarding, issue resolution).
  • Best Practice Contribution: Experience contributing to naming conventions, schema standards, environment management, testing frameworks, and security patterns for data platforms.
  • Continuous Learning & Innovation: Interest in staying up to date with the latest technologies, modern data stack tooling, and best practices to contribute to ongoing platform evolution.
  • Infrastructure as Code: Exposure to Terraform would be advantageous.
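The data-migration point above hinges on strong validation controls. One common pattern is reconciling row counts and column aggregates between the source (e.g. PostgreSQL) and the target warehouse. A minimal, illustrative Python sketch, with all names hypothetical:

```python
def validate_migration(source_rows, target_rows, numeric_cols):
    """Minimal reconciliation: row counts and per-column sums must match.

    Returns a dict of named checks mapped to True/False so a migration
    runbook can report exactly which control failed.
    """
    checks = {"row_count": len(source_rows) == len(target_rows)}
    for col in numeric_cols:
        checks[f"sum_{col}"] = (
            sum(r[col] for r in source_rows) == sum(r[col] for r in target_rows)
        )
    return checks
```

In a real migration these aggregates would be computed by SQL on each side (source database and Snowflake) and only the summary values compared, rather than pulling full tables into Python.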

Gigaclear is a growing Fibre Broadband (FTTP / FTTH) company, developing our fibre-to-the-premises broadband infrastructure to some of the most difficult to reach areas of the UK, empowering those communities with broadband to rival any city.

Senior Data Engineer employer: Gigaclear

At Gigaclear, we pride ourselves on being an exceptional employer, offering a dynamic work culture that fosters innovation and collaboration. As a Senior Data Engineer, you will have the opportunity to work with cutting-edge technologies in a supportive environment that prioritises employee growth and development. Our commitment to empowering communities through advanced broadband solutions not only makes your work meaningful but also positions you at the forefront of transforming connectivity across the UK.

Contact Detail:

Gigaclear Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Network like a pro! Get out there and connect with folks in the industry. Attend meetups, webinars, or even just grab a coffee with someone who works in data engineering. You never know who might have a lead on your dream job!

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your projects, especially those involving Snowflake and data pipelines. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on common data engineering questions and scenarios. Practice explaining your thought process when building data models or optimising pipelines. Confidence is key, so let your expertise shine!

✨Tip Number 4

Don’t forget to apply through our website! We love seeing candidates who are genuinely interested in joining us at StudySmarter. Tailor your application to highlight how your experience aligns with our needs, and let’s get the conversation started!

We think you need these skills to ace the Senior Data Engineer role

Snowflake
Data Pipeline Development
ELT/ETL Processes
SQL
Python
dbt
Data Modelling
Data Governance
Cloud Services (AWS)
Data Ingestion Tools (Airbyte, Fivetran, Hevo)
CI/CD Practices
Version Control (git)
BI Tools (Power BI)
Monitoring and Alerting
Infrastructure as Code (Terraform)

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Snowflake, SQL, and Python, and showcase any relevant projects that demonstrate your skills in building data pipelines and optimising performance.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our mission at StudySmarter. Don’t forget to mention specific experiences that relate to the job description.

Showcase Your Projects: If you've worked on any cool data projects, make sure to include them! Whether it's a personal project or something from a previous job, showing off your hands-on experience with tools like dbt or Airbyte can really set you apart.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status directly. Plus, we love seeing candidates who take that extra step!

How to prepare for a job interview at Gigaclear

✨Know Your Snowflake Inside Out

Make sure you brush up on your Snowflake knowledge before the interview. Be ready to discuss how you've designed and maintained data pipelines in Snowflake, and be prepared to share specific examples of how you've optimised performance and cost.

✨Show Off Your SQL and Python Skills

Since strong hands-on proficiency with SQL and Python is crucial for this role, practice coding challenges or projects that showcase your skills. Be ready to explain your thought process and the decisions you made while developing data transformations and models.

✨Understand Data Governance Principles

Familiarise yourself with data governance principles and be prepared to discuss how you've applied data quality checks and lineage tracking in your previous roles. This will demonstrate your commitment to maintaining high-quality data assets.

✨Collaborate and Communicate

This role involves working closely with analysts and business stakeholders, so be ready to talk about your collaboration experiences. Share examples of how you've supported data product delivery and resolved data issues, highlighting your communication skills and teamwork.
