Senior Data Engineer in Abingdon

Abingdon | Full-Time | £60,000 - £80,000 / year (est.) | No remote working
Gigaclear

At a Glance

  • Tasks: Build and optimise data pipelines on Snowflake for impactful analytics and reporting.
  • Company: Join Gigaclear, a pioneering Fibre Broadband company transforming connectivity in the UK.
  • Benefits: Competitive salary, flexible working, and opportunities for professional growth.
  • Other info: Be part of a dynamic team focused on innovation and community empowerment.
  • Why this job: Make a real difference by enabling data-driven insights in a growing tech environment.
  • Qualifications: Experience with Snowflake, SQL, Python, and data engineering best practices.

The predicted salary is between £60,000 and £80,000 per year.

As a Senior Data Engineer within the Data Engineering team, you will play a key role in building, enhancing, and maintaining our enterprise data platform on Snowflake. You will develop and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You will translate the data platform strategy into high-quality technical solutions, ensuring our Snowflake environment is reliable, well-structured, and performant. You will champion engineering best practices and contribute to standards that improve the quality, consistency, and usability of data assets. Your work will ensure the business has access to trusted, timely, and well-modelled data to support decision‑making, operational reporting, and the foundations for advanced analytics and future AI/ML capabilities.

Key Accountabilities & Responsibilities

  • Design, build, and maintain high-quality data pipelines and models in Snowflake to support business analytics, BI, and operational reporting needs.
  • Translate the defined data architecture and standards into implemented solutions, including ingestion, transformation, storage, and performance optimisation.
  • Develop robust ELT/ETL pipelines using dbt and workflow/orchestration tools (e.g., Argo Workflows), ensuring reliability, maintainability, and adherence to engineering best practices.
  • Implement Snowflake warehouse configurations and query optimisation techniques to ensure efficient usage and predictable cost.
  • Apply data quality checks, lineage tracking, and security standards across the data estate.
  • Ensure compliance with data policies, InfoSec controls, and regulatory requirements as required.
  • Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation, reduce manual effort, and enhance data accessibility across the business.
  • Work closely with analysts, data consumers, and business stakeholders to support data product delivery, troubleshoot data issues, and enable effective usage of Snowflake datasets.
  • Implement dimensional models that provide clean, well‑structured, reusable datasets for reporting, scenario modelling, and emerging ML/AI use cases.
  • Implement and maintain monitoring, alerting, logging, and cost‑management processes for Snowflake and data pipelines to ensure a stable and well‑maintained platform.
  • Contribute to shared engineering standards to simplify development and accelerate delivery across the team.
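The data quality checks mentioned above are typically expressed as dbt tests in a Snowflake stack, but the underlying idea can be sketched in plain Python. The table, column names, and thresholds below are hypothetical illustrations, not taken from the posting:

```python
# Minimal sketch of post-ingestion data quality checks (hypothetical example,
# not Gigaclear's actual pipeline code).
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def check_not_null(rows: list[dict], column: str) -> CheckResult:
    """Fail if any row has a None/NULL value in the given column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(
        name=f"not_null:{column}",
        passed=nulls == 0,
        detail=f"{nulls} null value(s) found",
    )


def check_unique(rows: list[dict], column: str) -> CheckResult:
    """Fail if the given column contains duplicate values."""
    values = [r.get(column) for r in rows]
    dupes = len(values) - len(set(values))
    return CheckResult(
        name=f"unique:{column}",
        passed=dupes == 0,
        detail=f"{dupes} duplicate value(s) found",
    )


# Hypothetical sample of ingested rows.
rows = [
    {"customer_id": 1, "postcode": "OX14 1AA"},
    {"customer_id": 2, "postcode": None},
    {"customer_id": 2, "postcode": "OX14 2BB"},
]

results = [check_not_null(rows, "postcode"), check_unique(rows, "customer_id")]
for res in results:
    print(f"{res.name}: {'PASS' if res.passed else 'FAIL'} ({res.detail})")
```

In practice the same assertions map directly onto dbt's built-in `not_null` and `unique` schema tests, which is one reason the posting pairs dbt with Snowflake.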

Knowledge & Skills

  • Proven experience in delivering cloud‑based data engineering solutions, ideally with Snowflake.
  • Strong hands‑on proficiency with SQL, Python, and dbt for data transformations, modelling, and pipeline automation.
  • Practical experience administering Snowflake, including role-based access control (RBAC) management.
  • Experience with data ingestion and replication tools such as Airbyte, Fivetran, Hevo, or similar.
  • Working knowledge of cloud services (AWS preferred).
  • Strong understanding of data modelling and data governance principles.
  • Experience supporting BI/reporting tools (Power BI) and enabling them through well‑designed Snowflake data models.
  • Solid knowledge of CI/CD and version‑controlled development practices in git.

Enterprise System Familiarity

  • Exposure to CRM (Salesforce), BSS/OSS (Netadmin), Call Centre, Telephony, or similar enterprise data sources.

Data Migration Experience

  • Participation in migrating data platforms (e.g., PostgreSQL or other cloud RDBMS) into a data warehouse like Snowflake with minimal disruption and strong data validation controls.
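The "strong data validation controls" described above often come down to reconciling row counts and content fingerprints between the source database and the migrated warehouse tables before cutover. A hypothetical sketch of that idea (table contents and function names are illustrative assumptions):

```python
# Hypothetical sketch: reconcile a source table against its migrated copy,
# flagging any mismatch before cutover. Not actual migration tooling.
import hashlib


def table_fingerprint(rows: list[tuple]) -> str:
    """Order-insensitive fingerprint of a table's rows (sketch only)."""
    digest = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):
        digest.update(row.encode())
    return digest.hexdigest()


def validate_migration(source: list[tuple], target: list[tuple]) -> dict:
    """Compare row counts and content between source and target tables."""
    return {
        "row_count_match": len(source) == len(target),
        "content_match": table_fingerprint(source) == table_fingerprint(target),
    }


# Same rows, different physical order: both checks should pass.
source_rows = [(1, "active"), (2, "churned")]
target_rows = [(2, "churned"), (1, "active")]
print(validate_migration(source_rows, target_rows))
```

At warehouse scale the fingerprinting would be pushed down into SQL (e.g. aggregate hashes) rather than pulled into Python, but the reconciliation logic is the same.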

Change & Adoption Support

  • Experience supporting business teams during platform transitions (e.g., training, documentation, user onboarding, issue resolution).

Best Practice Contribution

  • Experience contributing to naming conventions, schema standards, environment management, testing frameworks, and security patterns for data platforms.
  • Interest in staying up to date with the latest technologies, modern data stack tooling, and best practices to contribute to ongoing platform evolution.

Infrastructure as Code

  • Exposure to Terraform would be advantageous.

Gigaclear is a growing Fibre Broadband (FTTP / FTTH) company, developing our fibre‑to‑the‑premises broadband infrastructure to some of the most difficult to reach areas of the UK, empowering those communities with broadband to rival any city.

Senior Data Engineer in Abingdon employer: Gigaclear

Gigaclear is an exceptional employer, offering a dynamic work environment where innovation meets community impact. As a Senior Data Engineer, you will not only enhance your technical skills by working with cutting-edge technologies like Snowflake but also contribute to bridging the digital divide in rural areas of the UK. With a strong focus on employee growth, collaborative culture, and commitment to engineering best practices, Gigaclear provides a rewarding opportunity for those looking to make a meaningful difference while advancing their careers.

Contact Detail:

Gigaclear Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Senior Data Engineer in Abingdon

✨Tip Number 1

Network like a pro! Reach out to current employees at Gigaclear on LinkedIn or other platforms. Ask them about their experiences and any tips they might have for landing the Senior Data Engineer role. Personal connections can give you an edge!

✨Tip Number 2

Show off your skills in action! If you’ve got a portfolio of projects, especially those involving Snowflake, SQL, or Python, make sure to highlight them during interviews. Real-world examples can really impress hiring managers.

✨Tip Number 3

Prepare for technical interviews by brushing up on your data engineering concepts. Be ready to discuss your experience with ELT/ETL pipelines and how you’ve optimised performance in past roles. Confidence in your knowledge will shine through!

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the team at Gigaclear.

We think you need these skills to ace Senior Data Engineer in Abingdon

Snowflake
Data Pipeline Development
ELT/ETL Pipelines
dbt
SQL
Python
Data Governance
Data Modelling
Cloud Services (AWS)
CI/CD
Version Control (git)
Data Quality Checks
Monitoring and Alerting
Data Migration
Infrastructure as Code (Terraform)

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Snowflake, SQL, and Python, and showcase any relevant projects that demonstrate your skills in building data pipelines and optimising performance.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with Gigaclear's mission. Don’t forget to mention specific tools and techniques you’ve used that relate to the job description.

Showcase Your Problem-Solving Skills: In your application, include examples of how you've tackled challenges in data engineering. Whether it's optimising a pipeline or ensuring data quality, we want to see how you approach problems and find solutions that benefit the business.

Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people!

How to prepare for a job interview at Gigaclear

✨Know Your Snowflake Inside Out

Make sure you brush up on your Snowflake knowledge before the interview. Be ready to discuss how you've built and optimised data pipelines in Snowflake, and be prepared to share specific examples of how you've implemented features like Tasks and Streams.

✨Show Off Your SQL and Python Skills

Since strong hands-on proficiency with SQL and Python is crucial for this role, practice coding challenges or projects that showcase your skills. Be ready to explain your thought process and how you approach data transformations and pipeline automation.

✨Understand Data Governance and Quality

Familiarise yourself with data governance principles and quality checks. Be prepared to discuss how you've ensured compliance with data policies and InfoSec controls in your previous roles, as this will demonstrate your commitment to maintaining high standards.

✨Be Ready to Discuss CI/CD Practices

As the role involves version-controlled development practices, make sure you can talk about your experience with CI/CD processes and tools like git. Share examples of how you've contributed to shared engineering standards and improved team delivery.

