At a Glance
- Tasks: Lead data engineering strategy and build state-of-the-art data products on Azure and Snowflake.
- Company: Dynamic firm at the forefront of data engineering in trading and finance.
- Benefits: Competitive salary, flexible working options, and opportunities for professional growth.
- Other info: Join a high-performing team and thrive in a fast-paced, innovative environment.
- Why this job: Make a real impact by transforming complex data into valuable insights and products.
- Qualifications: 12+ years in data engineering with leadership experience in regulated environments.
The predicted salary is between £80,000 and £100,000 per year.
Own the strategy and execution of best-in-class data engineering to deliver state-of-the-art data products and services at scale. Build and operate a modern estate on Azure and Snowflake, centred on event-driven architectures and high-throughput ingestion pipelines that feed analytics, risk, and AI/ML safely and cost-effectively. Establish the standards, tooling and talent model that convert complex trading data into fast, reliable, governed, and reusable products, aligned to the firm’s semantic/knowledge-graph backbone.
Define and execute the global data engineering strategy (ingest → govern → serve → observe), aligned with enterprise architecture and governance. Standardise event patterns (Kafka/Flink), ELT (dbt/Spark/SQL), and serving layers (APIs/SQL/Graph) across regions. Build and coach high-performing squads; manage the engineering capacity plan; anticipate peaks and scale out via vetted staff-augmentation partners without lowering the bar. Run an objective skills framework, hiring rubric, and career paths; ensure global follow-the-sun support on critical flows.
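The "ingest → govern → serve" event pattern described above can be illustrated with a minimal sketch. This is not the firm's actual stack: the `Event` envelope, `TradeBook` state, and the trade payloads are invented here purely to show the idea of a standardised event shape with idempotent, order-independent replay (the property that makes Kafka-style pipelines safe to reprocess).

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    """Standard event envelope: every producer emits the same shape."""
    event_id: str   # globally unique; enables idempotent processing
    offset: int     # monotonic position in the stream (Kafka-style)
    payload: dict

@dataclass
class TradeBook:
    """Toy serving-layer state, rebuilt purely from the event log."""
    positions: dict = field(default_factory=dict)
    seen: set = field(default_factory=set)

    def apply(self, ev: Event) -> None:
        if ev.event_id in self.seen:  # duplicates and replays are no-ops
            return
        self.seen.add(ev.event_id)
        sym = ev.payload["symbol"]
        self.positions[sym] = self.positions.get(sym, 0) + ev.payload["qty"]

def replay(log: list[Event]) -> TradeBook:
    """Deterministic replay: the same log always yields the same state."""
    book = TradeBook()
    for ev in sorted(log, key=lambda e: e.offset):
        book.apply(ev)
    return book
```

Because ordering comes from the offset and duplicates are filtered by `event_id`, replaying the log in any arrival order reconstructs identical state, which is what makes the "deterministic replay proven quarterly" metric below testable.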
Partner with Trading, Risk, Ops/Logistics, Finance/Settlement and Compliance to prioritise a value-backlog; communicate trade-offs on latency, cost and control. Align with Architecture on ontology/knowledge-graph mapping; with Governance on evidence and controls; with Platform/Operations on environments, access and DR.
Reliability: SLOs met on market-critical paths; deterministic replay proven quarterly; MTTR trending down.
Speed & Reuse: Time-to-first-value for new products reduced by >50%; adoption of golden paths/templates across squads >60%.
Cost: Unit economics (cost per product/feature/inference) visible; ≥15–25% cost-to-serve reduction through optimisation/deprecation.
Compliance: Zero critical audit findings on lineage, access, retention; automated evidence packs.
Talent & Capacity: Bench strength in core skills; surge capacity activated without quality or security regressions.
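The "unit economics" metric above reduces to cost-to-serve divided by units delivered. A hypothetical worked sketch (the figures below are invented for illustration, not real costs) shows how the 15–25% reduction target would be measured:

```python
def unit_cost(total_cost: float, units: int) -> float:
    """Cost per product/feature/inference served."""
    return total_cost / units

def reduction(before: float, after: float) -> float:
    """Fractional cost-to-serve reduction, e.g. 0.20 == 20%."""
    return (before - after) / before

# Illustrative numbers only: £50k/month serving 2M inferences,
# optimised to £40k/month at the same volume.
before = unit_cost(50_000, 2_000_000)   # £0.025 per inference
after = unit_cost(40_000, 2_000_000)    # £0.020 per inference
saved = reduction(before, after)        # 0.20, inside the 15-25% band
```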
12+ years in data engineering/platform roles, 5+ years leading multi-region teams in real-time, regulated environments (ideally commodity trading/energy/financial markets).
- Global Head of Data: strategy, budget, risk appetite, and executive reporting.
- Lead Data Solution Architect & team: domain roadmaps, solution assurance, reuse/adoption metrics.
- Platform Ops and Data Engineering: CI/CD, observability, identity/secrets, DR/BCP and FinOps.
- Data & AI Governance, Risk, Compliance & Internal Audit: model risk, evidence automation, regulatory readiness, fine-grained FinOps enablement.
- Business lines (Trading, Risk, Ops/Logistics, Asset Ops/SCADA, Finance/Settlement, Market Analysis): value mapping and SLO reviews.
Commodity depth: power/gas, oil, derivatives, and time-series operational dashboards.
Knowledge-graph awareness: semantic layers (entity/relationship drill-through, lineage/impact views, consistent business terms).
Advanced geospatial: Mapbox/Leaflet, tiling strategies, clustering, and projection choices for assets, routes, and weather overlays.
LLM-assisted UX: Patterns for in-workflow assistants, retrieval-augmented explanations, and safe inline summarisation of alerts/incidents.
Design for low-latency streams: Live updates, batching, and diff-only rendering for high-frequency market data.
BI engineering partnership: Custom visual specs, semantic model constraints (star/snowflake), and row-level security/RBAC considerations.
Security basics: Understanding of ABAC/RBAC, PII handling, export controls, and auditability of user actions.
Performance tuning: p95/p99 render targets, bundle hygiene, virtualisation of large grids, and caching strategies.
Quant empathy: Comfortable discussing VaR/PFE math at a conceptual level to avoid misrepresenting risk semantics.
Prototyping breadth: Interactive prototypes wired to mock APIs; ability to script lightweight data fixtures.
Change management: Training kits, walkthroughs, and adoption campaigns for front-office and operations users.
Highly numerate, rigorous, and resilient in problem-solving. Able to prioritise, multitask, and deliver under time constraints. Self-motivated, proactive, and detail-oriented; comfortable working under pressure in a fast-paced environment. Strong written and verbal communication in English, with the ability to explain technical topics clearly. Team player able to collaborate across engineering, quant, and trading teams.
Data Engineer Manager employer: Gunvor Group
Contact details:
Gunvor Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer Manager role
✨Tip Number 1
Network like a pro! Get out there and connect with folks in the industry. Attend meetups, webinars, or even just grab a coffee with someone who’s already in the data engineering space. You never know who might have the inside scoop on job openings!
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best projects, especially those involving Azure, Snowflake, or event-driven architectures. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Be ready to discuss your experience with high-throughput ingestion pipelines and how you’ve tackled challenges in past roles. Practice makes perfect!
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals who can help us deliver state-of-the-art data products. Plus, it’s a great way to ensure your application gets seen by the right people.
We think you need these skills to ace the Data Engineer Manager role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the specific skills and experiences that align with the Data Engineer Manager role. Highlight your expertise in Azure, Snowflake, and event-driven architectures to catch our eye!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our mission. Share examples of your past successes in building high-performing teams and delivering data products.
Showcase Your Technical Skills: Don’t shy away from detailing your technical prowess! Mention your experience with Kafka, dbt, and CI/CD processes. We love seeing candidates who can demonstrate their hands-on experience with the tools we use.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people!
How to prepare for a job interview at Gunvor Group
✨Know Your Data Engineering Fundamentals
Brush up on your data engineering principles, especially around Azure and Snowflake. Be ready to discuss event-driven architectures and high-throughput ingestion pipelines, as these are crucial for the role.
✨Showcase Your Leadership Skills
Prepare examples of how you've built and coached high-performing teams in the past. Highlight your experience with managing capacity plans and scaling teams effectively without compromising quality.
✨Understand the Business Context
Familiarise yourself with the trading, risk, and compliance aspects of the business. Be prepared to discuss how your data products can add value and improve decision-making in these areas.
✨Communicate Clearly and Confidently
Practice explaining complex technical concepts in simple terms. Strong communication skills are essential, so be ready to demonstrate your ability to collaborate across different teams and articulate your ideas clearly.