At a Glance
- Tasks: Own end-to-end data engineering projects and build scalable data pipelines.
- Company: Join a next-gen sports betting platform with bold branding and fast execution.
- Benefits: Up to £100,000 salary, private health insurance, flexible working, and enhanced holidays.
- Other info: Dynamic environment with opportunities for mentorship and career growth.
- Why this job: Shape a modern data platform and make a real impact in the gaming industry.
- Qualifications: Strong skills in Python, SQL, and modern data orchestration tools.
The predicted salary is £100,000 per year.
This is a great opportunity to join a high‑growth organisation where you can take ownership of end‑to‑end data engineering projects and play a key role in shaping a modern, scalable data platform.
The Company
This is a next‑generation sports betting and gaming platform built for a new wave of players, combining sharp product thinking, bold branding and fast execution.
The Role
You will take ownership of the full data engineering lifecycle. Key responsibilities include:
- Owning end‑to‑end data engineering projects across the platform
- Designing, building and optimising scalable data pipelines using Python, SQL and modern orchestration tools
- Developing robust data models aligned with industry best practices
- Ensuring high standards of data quality through testing, monitoring and alerting
- Driving engineering best practices, contributing to code reviews and mentoring other engineers
Your skills and experience
You will bring strong capability in:
- Python and advanced SQL
- Building and maintaining production‑grade data pipelines
- Modern data orchestration tools (e.g. Dagster, Airflow, Prefect)
- Data modelling methodologies (Kimball, Data Vault, etc.)
- Engineering best practices including testing, version control and clean code
- AWS
The Benefits
You will receive a salary of up to £100,000 depending on experience, along with a comprehensive benefits package including private health insurance, income protection, flexible working, enhanced holiday entitlement and a fully supported home‑office setup.
Senior Data Engineer (AWS, Airflow, DBT) employer: Harnham
Contact Details:
Harnham Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior Data Engineer (AWS, Airflow, DBT)
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with AWS or Airflow. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Python and SQL. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on engineering best practices. Be ready to discuss your experience with data pipelines and orchestration tools like Airflow. We want to see how you tackle real-world problems!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Python, SQL, and any orchestration tools like Airflow or DBT. We want to see how your skills align with our needs!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how you can contribute to our modern, scalable data platform. Let us know why you're excited about this opportunity.
Showcase Your Projects: If you've worked on relevant projects, don’t hold back! Include links or descriptions of your end-to-end data engineering projects. We love seeing real-world applications of your skills and how you’ve tackled challenges.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at Harnham
✨Know Your Tech Stack
Make sure you’re well-versed in Python, SQL, and the orchestration tools mentioned in the job description. Brush up on your knowledge of Airflow and DBT, and be ready to discuss how you've used these technologies in past projects.
✨Showcase Your Projects
Prepare to talk about specific end-to-end data engineering projects you've owned. Highlight your role in designing and optimising data pipelines, and be ready to explain the challenges you faced and how you overcame them.
✨Understand Data Modelling
Familiarise yourself with data modelling methodologies like Kimball and Data Vault. Be prepared to discuss how you’ve applied these practices in your work to ensure robust data models and high data quality.
✨Emphasise Best Practices
Demonstrate your commitment to engineering best practices. Talk about your experience with code reviews, testing, and version control. This shows that you not only write clean code but also value collaboration and continuous improvement.