At a Glance
- Tasks: Design and build data pipelines using Apache Airflow for trading and analytics.
- Company: Dynamic B2B energy supplier leading the UK market.
- Benefits: Hybrid work model, competitive salary, and opportunities for professional growth.
- Why this job: Join a small team and shape the future of data architecture in energy trading.
- Qualifications: Experience with Apache Airflow, Python, SQL, and data pipeline development.
- Other info: Collaborative environment with real ownership of projects and modern tech exposure.
The predicted salary is between £36,000 and £60,000 per year.
A growing B2B energy supplier operating at the forefront of the UK energy market is investing heavily in its data platform and engineering capability as the business continues to scale its trading operations. The company is looking for a Data Engineer with strong Apache Airflow experience to help design and build the pipelines that power trading, analytics and commercial decision-making. This is a great opportunity to join a small, highly experienced team where engineers have real ownership of the data platform and work with modern open-source technologies.
What You’ll Be Doing
- Designing and building data pipelines orchestrated with Apache Airflow
- Developing ETL/ELT workflows in Python and SQL to process trading and operational data
- Building reliable, scalable data infrastructure to support analytics and business insight
- Collaborating with analysts, product teams and trading teams to deliver high-value datasets
- Improving data quality, monitoring and pipeline reliability
- Helping shape the future architecture of the data platform
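To give a flavour of the day-to-day work described above, here is a minimal, hedged sketch of an ETL step in Python and SQL. All table, column and counterparty names are hypothetical, and stdlib sqlite3 stands in for PostgreSQL; in a real pipeline each step would typically run as an Airflow task against the production warehouse.

```python
import sqlite3

def run_etl(conn):
    """Toy ETL: extract raw trades, aggregate volume per counterparty, load a summary table."""
    cur = conn.cursor()
    # Extract: raw trade records, as if landed by an upstream ingestion process
    cur.execute("CREATE TABLE IF NOT EXISTS raw_trades (counterparty TEXT, volume_mwh REAL)")
    cur.executemany(
        "INSERT INTO raw_trades VALUES (?, ?)",
        [("acme_energy", 120.0), ("acme_energy", 80.0), ("north_gas", 50.0)],
    )
    # Transform + Load: aggregate in SQL (ELT-style) into an analytics table
    cur.execute("CREATE TABLE IF NOT EXISTS trade_summary (counterparty TEXT, total_mwh REAL)")
    cur.execute(
        """INSERT INTO trade_summary
           SELECT counterparty, SUM(volume_mwh) FROM raw_trades GROUP BY counterparty"""
    )
    conn.commit()
    return dict(cur.execute("SELECT counterparty, total_mwh FROM trade_summary").fetchall())

if __name__ == "__main__":
    summary = run_etl(sqlite3.connect(":memory:"))
    print(summary)  # {'acme_energy': 200.0, 'north_gas': 50.0}
```

In an Airflow deployment, the extract and transform steps would usually be separate tasks in a DAG so that failures can be retried independently and data quality checks can sit between them.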
Tech Stack
You don’t need everything here, but experience with some of the following is expected:
- Apache Airflow (core focus)
- Python
- SQL
- Apache Spark or Kafka
- PostgreSQL or similar relational databases
- Docker or containerised environments
- Git and CI/CD workflows
- Data modelling and data warehousing concepts
What They’re Looking For
- Experience building and maintaining data pipelines in production
- Strong Python and SQL skills
- Hands-on experience with Apache Airflow
- Someone who enjoys solving data engineering and infrastructure challenges
- A collaborative engineer comfortable working with both technical and business stakeholders
Why This Role
- Work in a data-driven trading environment
- Join a small team where you can influence architecture
- Build modern data infrastructure from the ground up
- Exposure to energy trading data and real-time market information
Experience: Typically 2–5 years in Data Engineering or similar roles.
Data Engineer - Apache Airflow in London. Employer: Norton Blake
Contact Detail:
Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Apache Airflow role in London
✨Tip Number 1
Network like a pro! Reach out to people in the energy trading sector or those already working at the company. A friendly chat can open doors and give you insider info that could help you stand out.
✨Tip Number 2
Show off your skills! If you’ve got a GitHub or portfolio showcasing your data pipelines or projects, make sure to share it during interviews. It’s a great way to demonstrate your hands-on experience with Apache Airflow and Python.
✨Tip Number 3
Prepare for technical challenges! Brush up on your SQL and Python skills, and be ready to tackle some real-world problems during interviews. Practising common data engineering scenarios can really boost your confidence.
✨Tip Number 4
Apply through our website! We’re always on the lookout for talented Data Engineers. By applying directly, you’ll ensure your application gets the attention it deserves, and you might just land that dream job with us!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Apache Airflow and any relevant data engineering projects. We want to see how your skills align with the role, so don’t be shy about showcasing your Python and SQL expertise!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about this role and how your background makes you a perfect fit. We love seeing genuine enthusiasm for the energy trading sector and data engineering.
Showcase Your Projects: If you've worked on any cool data pipelines or ETL workflows, make sure to mention them! We appreciate hands-on experience, so share specific examples that demonstrate your problem-solving skills and technical know-how.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen to join our team!
How to prepare for a job interview at Norton Blake
✨Know Your Tech Stack
Make sure you’re familiar with the technologies mentioned in the job description, especially Apache Airflow, Python, and SQL. Brush up on your knowledge of ETL/ELT workflows and be ready to discuss how you've used these tools in past projects.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data engineering challenges in previous roles. Think about specific problems you faced, the solutions you implemented, and the impact they had on the project or team.
✨Collaborate and Communicate
Since this role involves working with both technical and business stakeholders, practice explaining complex technical concepts in simple terms. Be ready to discuss how you’ve collaborated with analysts and product teams to deliver high-value datasets.
✨Ask Insightful Questions
Prepare thoughtful questions about the company’s data platform and future architecture. This shows your genuine interest in the role and helps you understand how you can contribute to their goals.