At a Glance
- Tasks: Build and optimise data pipelines for risk analytics in a dynamic trading environment.
- Company: Join a leading firm in energy trading with a focus on innovation.
- Benefits: Attractive salary, comprehensive benefits, and opportunities for professional growth.
- Other info: Collaborative culture with a focus on career development and cutting-edge technology.
- Why this job: Make an impact in the fast-paced world of data engineering and risk technology.
- Qualifications: Experience in data engineering, Python/SQL, and Azure services required.
The predicted salary is between £60,000 and £80,000 per year.
Full-time in the West London office five days a week, with a great salary and benefits package offered. The Data Engineer with Commodity Trading and Data Risk experience will focus on building, managing, and optimising risk analytics workflows. You will play a key role in designing and evolving the data pipelines, models, and tooling that support critical risk processes, while also contributing to the broader data platform, which leverages Microsoft Fabric. Working closely with the platform team, you will help develop shared infrastructure, scalable ingestion frameworks, and high-quality data products.
This role sits at the intersection of data engineering, risk technology, and modern data platform practices within a global, always-on trading environment.
- Understand risk workflows end-to-end and translate them into reliable, production-grade data pipelines and products.
- Build and maintain batch and near-real-time data ingestion pipelines from diverse sources including relational databases, REST APIs, FTP/SFTP feeds, and cloud storage.
- Contribute to the data platform delivering harmonised, governed data products that serve multiple business functions.
- Collaborate with risk, analytics, and engineering teams to productionise and maintain risk models and scripts.
- Implement best practices for code quality, testing, and release management across the data platform.
- Build and support Power BI semantic models and Direct Lake datasets.
- Manage and maintain Azure DevOps pipelines for deployment, version control, and CI/CD of risk-related scripts and data workflows.
- Monitor system performance and troubleshoot issues related to data pipelines and deployments.
- Ensure proper data governance, security, and compliance standards are applied.
- Hands-on experience with Microsoft Fabric, Azure data services (e.g. Synapse Analytics, Data Factory), or Databricks for large-scale data processing.
- Proficiency in Python / SQL for data engineering and scripting.
- Familiarity with risk analytics environments or financial data.
- Strong experience with Apache Spark, including performance optimisation and distributed data processing.
- Strong experience ingesting data from diverse sources including relational databases, REST APIs, FTP/SFTP file feeds, and cloud storage.
- Experience managing data pipelines and production workflows.
- Experience with Azure DevOps (CI/CD pipelines, repos, release management).
- Experience with version control (Git) and software development lifecycle practices.
- Experience with streaming data technologies (e.g. Kafka, Azure Event Hubs).
- Knowledge of non-relational databases (e.g. MongoDB, Cosmos DB).
- Familiarity with Data Mesh principles and domain-oriented data ownership.
- Experience with monitoring/logging tools in Azure.
With a focus on Energy Trading, Oil & Gas, Financial Markets and Commodities, we offer a transparent recruitment service that has proven reliable and effective for over 40 years.
Data Engineer (f/m) in City of London employer: Eaglecliff Recruitment
Contact Details:
Eaglecliff Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (f/m) role in the City of London
✨Tip Number 1
Network like a pro! Reach out to people in the industry, attend meetups, and connect with professionals on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Prepare for interviews by practising common questions and showcasing your skills. Use real-life examples from your experience with data pipelines and risk analytics to demonstrate your expertise. We want to see how you think on your feet!
✨Tip Number 3
Don’t just apply anywhere; focus on companies that align with your values and career goals. Check out our website for roles that excite you, and tailor your approach to each one. It shows you’re genuinely interested!
✨Tip Number 4
Follow up after interviews! A quick thank-you email can go a long way in keeping you top of mind. Mention something specific from your conversation to remind them why you’re the perfect fit for the role.
We think you need these skills to ace the Data Engineer (f/m) role in the City of London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with data pipelines, risk analytics, and any relevant technologies like Microsoft Fabric or Azure services. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background fits into our team. Be sure to mention specific projects or experiences that relate to the job description.
Showcase Your Technical Skills: Don’t forget to showcase your technical skills in Python, SQL, and any experience with tools like Apache Spark or Azure DevOps. We love seeing practical examples of how you've used these skills in past roles, so include those details!
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It helps us keep track of your application and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at Eaglecliff Recruitment
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Microsoft Fabric, Azure services, and Apache Spark. Brush up on your Python and SQL skills, as you'll likely be asked to demonstrate your proficiency during the interview.
✨Understand Risk Workflows
Since this role focuses on risk analytics, take the time to understand end-to-end risk workflows. Be prepared to discuss how you would translate these workflows into reliable data pipelines, showcasing your ability to connect technical skills with business needs.
✨Showcase Collaboration Skills
This position requires working closely with various teams. Think of examples from your past experiences where you collaborated effectively with others, particularly in engineering or analytics contexts. Highlight your communication skills and how you can contribute to a team environment.
✨Prepare for Problem-Solving Questions
Expect questions that assess your troubleshooting abilities, especially related to data pipelines and system performance. Prepare to discuss specific challenges you've faced in previous roles and how you resolved them, demonstrating your analytical thinking and problem-solving skills.