Data Engineer

Full-Time · £60,000 – £80,000 / year (est.) · No home office possible
Eaglecliff Recruitment

At a Glance

  • Tasks: Build and optimise data pipelines for risk analytics in a dynamic trading environment.
  • Company: Leading financial firm in West London with a focus on innovation.
  • Benefits: Attractive salary, comprehensive benefits, and opportunities for professional growth.
  • Other info: Collaborative culture with a focus on continuous learning and career advancement.
  • Why this job: Join a cutting-edge team and make an impact in the world of data engineering.
  • Qualifications: Experience with Microsoft Fabric, Azure services, Python, SQL, and data pipelines.

The predicted salary is between £60,000 and £80,000 per year.

This is a full-time role based in the West London office five days a week, with a great salary and benefits package on offer. The Data Engineer, with Commodity Trading and Data Risk experience, will focus on building, managing, and optimising risk analytics workflows. You will play a key role in designing and evolving the data pipelines, models, and tooling that support critical risk processes, while also contributing to the broader data platform, which leverages Microsoft Fabric. Working closely with the platform team, you will help develop shared infrastructure, scalable ingestion frameworks, and high-quality data products. This role sits at the intersection of data engineering, risk technology, and modern data platform practices within a global, always-on trading environment.

Job Accountabilities

  • Understand risk workflows end-to-end and translate them into reliable, production-grade data pipelines and products.
  • Build and maintain batch and near-real-time data ingestion pipelines from diverse sources including relational databases, REST APIs, FTP/SFTP feeds, and cloud storage.
  • Contribute to the data platform delivering harmonised, governed data products that serve multiple business functions.
  • Collaborate with risk, analytics, and engineering teams to productionise and maintain risk models and scripts.
  • Implement best practices for code quality, testing, and release management across the data platform.
  • Build and support Power BI semantic models and Direct Lake datasets.
  • Manage and maintain Azure DevOps pipelines for deployment, version control, and CI/CD of risk-related scripts and data workflows.
  • Monitor system performance and troubleshoot issues related to data pipelines and deployments.
  • Ensure proper data governance, security, and compliance standards are applied.

Required Skills & Experience

  • Hands‑on experience with Microsoft Fabric, Azure data services (e.g., Synapse Analytics, Data Factory), or Databricks for large‑scale data processing.
  • Proficiency in Python / SQL for data engineering and scripting.
  • Familiarity with risk analytics environments or financial data.
  • Strong experience with Apache Spark, including performance optimisation and distributed data processing.
  • Strong experience ingesting data from diverse sources including relational databases, REST APIs, FTP/SFTP file feeds, and cloud storage.
  • Experience managing data pipelines and production workflows.
  • Experience with Azure DevOps (CI/CD pipelines, repos, release management).
  • Experience with version control (Git) and software development lifecycle practices.

Nice to Have

  • Experience with streaming data technologies (e.g. Kafka, Azure Event Hubs).
  • Exposure to metadata‑driven framework design and config‑driven pipeline development.
  • Knowledge of non‑relational databases (e.g. MongoDB, Cosmos DB).
  • Familiarity with Data Mesh principles and domain‑oriented data ownership.
  • Experience with monitoring/logging tools in Azure.

Data Engineer employer: Eaglecliff Recruitment

Join a dynamic and innovative team in West London, where, as a Data Engineer, you will thrive in a collaborative culture that prioritises employee growth and development. With a competitive salary and comprehensive benefits package, you'll have the opportunity to work with cutting-edge data technologies while contributing to critical risk processes in a global trading environment. Our commitment to fostering a supportive atmosphere ensures that your contributions are valued, making this an excellent place for meaningful and rewarding employment.

Contact Detail:

Eaglecliff Recruitment, Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your data engineering projects, especially those involving Microsoft Fabric or Azure services. This will give potential employers a taste of what you can do and set you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on common data engineering questions and scenarios. Practice explaining your past projects and how they relate to risk analytics and data pipelines. Confidence is key!

✨Tip Number 4

Don’t forget to apply through our website! We’ve got some fantastic opportunities waiting for you, and applying directly can sometimes give you a leg up in the process. Let’s get you that dream job!

We think you need these skills to ace the Data Engineer role

Microsoft Fabric
Azure Data Services
Databricks
Python
SQL
Apache Spark
Data Ingestion
Azure DevOps
CI/CD Pipelines
Version Control (Git)
Data Governance
Risk Analytics
Power BI
Performance Optimisation
Troubleshooting

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Microsoft Fabric, Azure data services, and any relevant projects that showcase your skills in building data pipelines and risk analytics workflows.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our needs. Mention specific experiences that relate to risk technology and data governance.

Showcase Your Technical Skills: Don’t forget to list your technical skills clearly. We want to see your proficiency in Python, SQL, and any experience with Apache Spark or Azure DevOps. Be specific about the tools and technologies you've used in past roles.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to upload your CV and cover letter directly. Plus, it helps us keep track of your application!

How to prepare for a job interview at Eaglecliff Recruitment

✨Know Your Data Inside Out

Make sure you understand the end-to-end risk workflows and how they translate into data pipelines. Brush up on your knowledge of Microsoft Fabric, Azure services, and the specific tools mentioned in the job description. Being able to discuss these confidently will show that you're ready to hit the ground running.

✨Showcase Your Technical Skills

Prepare to demonstrate your proficiency in Python and SQL during the interview. You might be asked to solve a problem or explain how you've used these skills in past projects. Bring examples of your work with data ingestion from various sources and any experience with Apache Spark to the table.

✨Collaboration is Key

This role involves working closely with different teams, so be ready to talk about your collaborative experiences. Share examples of how you've worked with analytics, engineering, or risk teams to productionise models and scripts. Highlight your ability to communicate complex ideas clearly.

✨Emphasise Best Practices

Discuss your approach to code quality, testing, and release management. Be prepared to explain how you implement best practices in your work, especially regarding CI/CD pipelines and data governance. This will show that you not only know how to build data products but also how to maintain them effectively.
