At a Glance
- Tasks: Build and optimise data pipelines for risk analytics in a dynamic trading environment.
- Company: Join a leading firm in energy trading with a focus on innovation.
- Benefits: Attractive salary, comprehensive benefits, and opportunities for professional growth.
- Other info: Collaborative culture with a focus on career development and learning.
- Why this job: Make an impact in data engineering while working with cutting-edge technologies.
- Qualifications: Experience in data engineering, Python/SQL, and Azure services required.
The predicted salary is between £60,000 and £80,000 per year.
Full-time, five days a week in a West London office, with a great salary and benefits package offered. The Data Engineer with Commodity Trading and Data Risk experience will focus on building, managing, and optimising risk analytics workflows. You will play a key role in designing and evolving data pipelines, models, and tooling that support critical risk processes, while also contributing to the broader data platform leveraging Microsoft Fabric. Working closely with the platform team, you will help develop shared infrastructure, scalable ingestion frameworks, and high-quality data products.
This role sits at the intersection of data engineering, risk technology, and modern data platform practices within a global, always-on trading environment. Responsibilities include:
- Understand risk workflows end-to-end and translate them into reliable, production-grade data pipelines and products.
- Build and maintain batch and near-real-time data ingestion pipelines from diverse sources including relational databases, REST APIs, FTP/SFTP feeds, and cloud storage.
- Contribute to the data platform delivering harmonised, governed data products that serve multiple business functions.
- Collaborate with risk, analytics, and engineering teams to productionise and maintain risk models and scripts.
- Implement best practices for code quality, testing, and release management across the data platform.
- Build and support Power BI semantic models and DirectLake datasets.
- Manage and maintain Azure DevOps pipelines for deployment, version control, and CI/CD of risk-related scripts and data workflows.
- Monitor system performance and troubleshoot issues related to data pipelines and deployments.
- Ensure proper data governance, security, and compliance standards are applied.
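To give a flavour of the pipeline work described above, here is a minimal, hypothetical sketch of batch ingestion with an idempotent upsert, so a re-run never duplicates rows. The table and column names are illustrative assumptions, not taken from the posting; a production version would target a warehouse rather than SQLite.

```python
import sqlite3

def ingest(conn, records):
    """Upsert (trade_id, notional) records into a landing table.

    Illustrative only: real pipelines in this role would read from
    relational sources, REST APIs, or SFTP feeds and write to the
    governed data platform.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades ("
        "trade_id TEXT PRIMARY KEY, notional REAL)"
    )
    # ON CONFLICT makes the load idempotent: re-running updates in place.
    conn.executemany(
        "INSERT INTO trades (trade_id, notional) VALUES (?, ?) "
        "ON CONFLICT(trade_id) DO UPDATE SET notional = excluded.notional",
        records,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
ingest(conn, [("T1", 1_000_000.0), ("T2", 250_000.0)])
ingest(conn, [("T2", 300_000.0)])  # re-run with a correction: no duplicates
rows = dict(conn.execute("SELECT trade_id, notional FROM trades"))
print(rows)  # {'T1': 1000000.0, 'T2': 300000.0}
```

The idempotent-upsert pattern matters in an always-on trading environment, where late or corrected data means the same batch may be replayed.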
- Hands-on experience with Microsoft Fabric, Azure data services (e.g., Synapse Analytics, Data Factory), or Databricks for large-scale data processing.
- Proficiency in Python / SQL for data engineering and scripting.
- Familiarity with risk analytics environments or financial data.
- Strong experience with Apache Spark, including performance optimisation and distributed data processing.
- Strong experience ingesting data from diverse sources including relational databases, REST APIs, FTP/SFTP file feeds, and cloud storage.
- Experience managing data pipelines and production workflows.
- Experience with Azure DevOps (CI/CD pipelines, repos, release management).
- Experience with version control (Git) and software development lifecycle practices.
- Experience with streaming data technologies (e.g. Kafka, Azure Event Hubs).
- Knowledge of non-relational databases (e.g. MongoDB, Cosmos DB).
- Familiarity with Data Mesh principles and domain-oriented data ownership.
- Experience with monitoring/logging tools in Azure.
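As an illustration of the "risk analytics environments" point above, the sketch below computes a one-day Value-at-Risk by historical simulation — the kind of risk script the role would help productionise. The function name, the 95% confidence level, and the sample P&L series are all illustrative assumptions, not details from the posting.

```python
def historical_var(pnl, confidence=0.95):
    """One-day VaR: the loss at the (1 - confidence) quantile of historical P&L.

    Hypothetical example; a production model would handle weighting,
    data quality checks, and much longer return histories.
    """
    ordered = sorted(pnl)                       # worst losses first
    idx = int((1 - confidence) * len(ordered))  # cut-off index for the tail
    return -ordered[idx]                        # report VaR as a positive loss

# Illustrative daily P&L series (20 observations)
daily_pnl = [-12.0, 4.5, -3.2, 7.1, -25.0, 2.2, -8.4, 1.0, 6.3, -1.5,
             3.3, -4.8, 9.9, -15.6, 0.7, 5.2, -2.1, 8.8, -6.7, 4.0]
print(historical_var(daily_pnl))  # 15.6
```

Being able to walk through a small calculation like this, and explain how you would wrap it in tested, monitored pipeline code, is a good way to demonstrate the blend of risk and engineering skills the role asks for.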
Senior Engineer, Data Engineering employer: Eaglecliff Recruitment
Contact Details:
Eaglecliff Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior Engineer, Data Engineering
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those who work in data engineering or risk analytics. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving data pipelines and risk analytics. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common technical questions related to data engineering and risk technology. We recommend practising coding challenges and discussing your past experiences with data workflows to impress your interviewers.
✨Tip Number 4
Don't forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive and engaged in their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with data engineering, especially in risk analytics and the technologies mentioned in the job description. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how your background fits into our team. Share specific examples of your work with data pipelines and risk models to really stand out.
Showcase Relevant Projects: If you've worked on projects involving Microsoft Fabric, Azure services, or Apache Spark, make sure to include those! We love seeing real-world applications of your skills that relate to the role.
Apply Through Our Website: For the best chance of getting noticed, apply directly through our website. It helps us keep track of your application and ensures it reaches the right people quickly!
How to prepare for a job interview at Eaglecliff Recruitment
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Microsoft Fabric, Azure data services, and Apache Spark. Brush up on your Python and SQL skills, as you'll likely be asked to demonstrate your proficiency during the interview.
✨Understand Risk Workflows
Since this role focuses on risk analytics, take the time to understand end-to-end risk workflows. Be prepared to discuss how you would translate these workflows into reliable data pipelines, showcasing your ability to connect technical skills with business needs.
✨Showcase Collaboration Skills
This position requires working closely with various teams. Think of examples from your past experiences where you collaborated effectively with risk, analytics, or engineering teams. Highlight how you contributed to productionising models and maintaining workflows.
✨Prepare for Problem-Solving Questions
Expect questions that assess your troubleshooting abilities, especially related to data pipelines and deployments. Prepare specific examples of challenges you've faced in previous roles and how you resolved them, particularly in a fast-paced trading environment.