At a Glance
- Tasks: Design and build data pipelines using Apache Airflow for trading and analytics.
- Company: Dynamic B2B energy supplier at the forefront of the UK energy market.
- Benefits: Hybrid work model, competitive salary, and opportunities for professional growth.
- Why this job: Join a small team and influence the future of data architecture in energy trading.
- Qualifications: Experience with Python, SQL, and Apache Airflow; collaborative mindset required.
- Other info: Work with modern technologies and gain exposure to real-time market data.
The predicted salary is between £36,000 and £60,000 per year.
A growing B2B energy supplier operating at the forefront of the UK energy market is investing heavily in its data platform and engineering capability as the business continues to scale its trading operations.
They are looking for a Data Engineer with strong Apache Airflow experience to help design and build the pipelines that power trading, analytics and commercial decision-making. This is a great opportunity to join a small, highly experienced team where engineers have real ownership of the data platform and work with modern open-source technologies.
What You’ll Be Doing
- Designing and building data pipelines orchestrated with Apache Airflow
- Developing ETL/ELT workflows in Python and SQL to process trading and operational data
- Building reliable, scalable data infrastructure to support analytics and business insight
- Collaborating with analysts, product teams and trading teams to deliver high-value datasets
- Improving data quality, monitoring and pipeline reliability
- Helping shape the future architecture of the data platform
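To make the day-to-day concrete, the ETL/ELT work described above can be sketched as a minimal extract-transform-load step of the kind Airflow typically orchestrates. Everything here is illustrative rather than taken from the employer's actual stack: the table names, the energy-trading schema, and the `run_etl` helper are hypothetical, and SQLite stands in for a production database such as PostgreSQL.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Toy ETL step: extract raw trade ticks, transform them into daily
    totals, and load a summary table. In production, each stage would
    typically be a separate Airflow task with declared dependencies."""
    cur = conn.cursor()

    # Extract: raw trade ticks (hypothetical schema for illustration)
    cur.execute("CREATE TABLE raw_trades (trade_date TEXT, volume_mwh REAL)")
    cur.executemany(
        "INSERT INTO raw_trades VALUES (?, ?)",
        [("2024-01-01", 10.5), ("2024-01-01", 4.5), ("2024-01-02", 7.0)],
    )

    # Transform + Load: aggregate into a daily summary table that
    # analysts and trading teams can query directly
    cur.execute(
        "CREATE TABLE daily_volume AS "
        "SELECT trade_date, SUM(volume_mwh) AS total_mwh "
        "FROM raw_trades GROUP BY trade_date"
    )
    conn.commit()

    # Return the number of summary rows loaded
    cur.execute("SELECT COUNT(*) FROM daily_volume")
    return cur.fetchone()[0]

conn = sqlite3.connect(":memory:")
rows = run_etl(conn)  # two trading days -> two summary rows
```

In an Airflow deployment, each stage would usually be its own task (for example via the TaskFlow API), so failures can be retried per stage and pipeline runs stay observable.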
Tech Stack
You don’t need everything here, but experience with some of the following is expected:
- Apache Airflow (core focus)
- Python
- SQL
- Apache Spark or Kafka
- PostgreSQL or similar relational databases
- Docker or containerised environments
- Git and CI/CD workflows
- Data modelling and data warehousing concepts
What They’re Looking For
- Experience building and maintaining data pipelines in production
- Strong Python and SQL skills
- Hands-on experience with Apache Airflow
- Someone who enjoys solving data engineering and infrastructure challenges
- A collaborative engineer comfortable working with both technical and business stakeholders
Why This Role
- Work in a data-driven trading environment
- Join a small team where you can influence architecture
- Build modern data infrastructure from the ground up
- Exposure to energy trading data and real-time market information
Experience: Typically 2–5 years in Data Engineering or similar roles.
Data Engineer: Hybrid in London. Employer: Norton Blake
Contact Details:
Norton Blake Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer: Hybrid in London role
✨Tip Number 1
Network like a pro! Reach out to people in the energy trading sector or data engineering roles on LinkedIn. A friendly chat can open doors that a CV just can't.
✨Tip Number 2
Show off your skills! If you’ve got a GitHub or portfolio showcasing your projects, make sure to share it during interviews. It’s a great way to demonstrate your hands-on experience with tools like Apache Airflow and Python.
✨Tip Number 3
Prepare for technical challenges! Brush up on your SQL and Python skills, and be ready to tackle some real-world problems during interviews. Practice makes perfect, so don’t skip this step!
✨Tip Number 4
Apply through our website! We’re always on the lookout for talented Data Engineers. By applying directly, you’ll get noticed faster and have a better chance of landing that dream job.
We think you need these skills to ace the Data Engineer: Hybrid in London role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Data Engineer role. Highlight your Apache Airflow experience and any relevant projects you've worked on, especially those involving Python and SQL.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our team. Mention specific examples of how you've solved data challenges in the past and your enthusiasm for working with modern technologies.
Showcase Your Projects: If you've built any data pipelines or worked on relevant projects, don’t hesitate to include them in your application. We love seeing practical examples of your work, especially if they involve the tech stack mentioned in the job description.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about our company culture!
How to prepare for a job interview at Norton Blake
✨Know Your Tech Stack
Make sure you’re familiar with the technologies mentioned in the job description, especially Apache Airflow, Python, and SQL. Brush up on your knowledge of data pipelines and be ready to discuss how you've used these tools in past projects.
✨Showcase Your Problem-Solving Skills
Prepare examples of challenges you've faced in data engineering and how you solved them. This role requires someone who enjoys tackling infrastructure challenges, so highlight your analytical thinking and creativity in overcoming obstacles.
✨Collaborate Like a Pro
Since this position involves working closely with analysts and trading teams, be ready to discuss your experience collaborating with different stakeholders. Share specific instances where your teamwork led to successful outcomes or improved processes.
✨Ask Insightful Questions
Prepare thoughtful questions about the company’s data platform and future architecture. This shows your genuine interest in the role and helps you understand how you can contribute to their goals, making you stand out as a candidate.