Head of Data in Slough

Slough · Full-Time · £80,000 – £100,000 / year (est.) · Hybrid (minimum 4 days a week in office)
PrismFP

At a Glance

  • Tasks: Lead a data team while designing and maintaining data pipelines for financial analytics.
  • Company: PrismFP Analytics, a leader in quantitative analytics for institutional investors.
  • Benefits: Competitive salary, bonuses, private medical insurance, and generous leave.
  • Other info: Hybrid work policy, continuous learning, and exciting travel opportunities.
  • Why this job: Make a real impact in finance with cutting-edge data technology and a dynamic team.
  • Qualifications: 5-10 years in data engineering, with leadership experience and financial dataset knowledge.

The predicted salary is between £80,000 and £100,000 per year.

About company

PrismFP Analytics builds quantitative analytics products for institutional investors from offices in London, New York, Copenhagen and Tallinn. We combine deep practitioner knowledge of financial markets with cloud-scale data infrastructure to power proprietary derivatives analytics, portfolio construction, and risk tooling for some of the largest names in finance.

The role

This is a balanced role - roughly 50% people leadership, 50% hands-on engineering. You will own our data strategy and roadmap while staying close enough to implementation to unblock your team, make architecture decisions with conviction, and ship code yourself when it matters.

Team

You will manage a team of 3 data professionals today, with a hire planned this year.

Reporting line

This is a high-profile role that reports directly to the CEO and includes hiring, performance review and team development responsibilities.

What you’ll do

  • Set and evolve the data strategy and roadmap in line with company priorities, balancing quick wins with scalable foundations.
  • Lead the data team (currently 3 people), including hiring (with a plan to add 1 more this year), performance reviews, coaching, and day-to-day delivery management.
  • Stay hands-on: design, build, and maintain data pipelines, datasets, and internal tooling that support quantitative research and product development.
  • Establish and own data quality, availability, and coverage metrics (KPIs), along with monitoring and alerting to keep data reliable.
  • Partner closely with quantitative researchers, software engineers, product owners, and the business to deliver end-to-end data capabilities.
  • Improve data engineering standards and practices (testing, code quality, documentation, and operational excellence) to help the team scale sustainably.
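To make the "quality, availability, and coverage KPIs" bullet concrete, here is a minimal Python sketch of two such metrics. The dataset, field names, and instruments are hypothetical illustrations, not PrismFP's actual data model.

```python
# Hypothetical records from a market-data pipeline; fields are illustrative.
rows = [
    {"instrument": "FTSE100_FUT", "price": 7450.5, "ts": "2024-03-01T16:30:00Z"},
    {"instrument": "FTSE100_FUT", "price": None,   "ts": "2024-03-02T16:30:00Z"},
    {"instrument": "GILT_10Y",    "price": 98.12,  "ts": "2024-03-01T16:30:00Z"},
]

def completeness(records, field):
    """Share of records with a non-null value for `field` (a quality KPI)."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.get(field) is not None) / len(records)

def coverage(records, expected_instruments):
    """Share of expected instruments seen at least once (a coverage KPI)."""
    seen = {r["instrument"] for r in records}
    return len(seen & set(expected_instruments)) / len(expected_instruments)

price_completeness = completeness(rows, "price")  # 2 of 3 records are priced
universe_coverage = coverage(rows, ["FTSE100_FUT", "GILT_10Y", "BUND_FUT"])
```

In practice metrics like these would be computed inside the pipeline and exported to the monitoring stack (e.g. Prometheus/Grafana, per the stack listed below) so that alerts fire when they drop below agreed thresholds.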

What success looks like (first 6 months)

  • Clear quality/availability/coverage KPIs are defined, visible, and used to guide priorities.
  • A clean taxonomy and stable interfaces/contracts exist for core datasets (definitions, naming, ownership, consumption patterns).
  • Incidents and manual fixes drop through better validation, monitoring/alerts, and clear ownership.
  • Strong data engineering practices are the default: testing, code review, documentation, and operational ownership.
  • A sustainable team cadence is in place: prioritisation, planning, and support expectations are clear.
  • Cost and performance are managed intentionally, with visibility into drivers and explicit trade-offs—without hurting reliability.

Who you are and what you have

  • You have 5–10+ years of experience in data engineering, with at least 2 years leading or managing a team—guiding, motivating, and developing colleagues to deliver shared outcomes.
  • You have experience working with financial datasets, ideally including market data feeds, derivatives reference data, and time-series pricing or similar domains.
  • You can align data initiatives with overarching business strategy, translating business priorities into a clear data roadmap your team can execute.
  • You have deep experience designing, building, and operating production data pipelines, including owning reliability, monitoring, and failure recovery for orchestration systems such as Apache Airflow (or equivalent).
  • You are fluent in modern data and cloud ecosystems, with hands-on experience (or the ability to ramp quickly) in technologies such as Iceberg, Spark, PostgreSQL, and cloud-native infrastructure on AWS (or equivalent).
  • You write clean, readable, and testable Python code, applying sound abstractions, naming, and review discipline to build systems that scale beyond the first version.
  • You bring strong analytical and problem-solving skills and can communicate ideas clearly—both in writing (e.g. concise design documents) and in discussion with engineers, product owners, and business stakeholders.
  • You exercise pragmatic judgment: knowing when to build robust foundations and when a well-scoped, pragmatic solution is the right trade-off, and you can converge when many possible solutions exist.
  • You work well within a team and believe in open discussion, inclusion, and diversity.
  • You like to explore new approaches and technologies to solve problems but can move decisively from exploration to execution.
  • You hold a university degree in Computer Science, Mathematics, Engineering, Physics, or a similar field.
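The "clean, readable, and testable Python" expectation above might look like the following in practice: a small pure function with a pytest-style test beside it. The function itself is an invented example, not PrismFP code.

```python
# Hypothetical example of a small, testable unit: a pure function with
# clear naming and input validation, plus a pytest-style test alongside it.

def mid_price(bid: float, ask: float) -> float:
    """Midpoint of a bid/ask quote; rejects crossed or non-positive markets."""
    if bid <= 0 or ask <= 0 or bid > ask:
        raise ValueError(f"invalid quote: bid={bid}, ask={ask}")
    return (bid + ask) / 2

def test_mid_price():
    assert mid_price(99.0, 101.0) == 100.0
```

Keeping business logic in pure functions like this is what makes the pipeline code easy to review, test with Pytest (listed in the stack below), and scale beyond the first version.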

Our approach and technology stack

  • Hybrid work policy - minimum 4 days a week in office
  • Lean principles and Agile development practices
  • Continuous deployment across all microservices
  • Big Data ecosystem and SQL Databases (Apache Airflow, Iceberg, Spark/Thrift, PostgreSQL, Azure Hyperscale)
  • Python as a primary backend language
  • High-level frameworks (Flask, SQLAlchemy, Alembic, Pytest, Socket.IO)
  • Amazon Web Services and ecosystem (AWS core services, Lambda, RDS, EKS, S3, etc.)
  • Cloud-native technologies (Kubernetes, Helm, Docker)
  • Observability technologies (Prometheus, Jaeger, Loki, Grafana, Sentry)
  • Infrastructure automation and containerized CI/CD (GitLab, Terraform, Atlantis)

Benefits

  • Competitive salary / discretionary bonus / ESOP
  • Work closely with financial market practitioners
  • Private medical insurance
  • Pension salary sacrifice and contribution match
  • 25 days annual leave plus bank holidays
  • Regular team events - joint dinners, drinks
  • Travel opportunities to Estonia

Head of Data in Slough employer: PrismFP

PrismFP Analytics is an exceptional employer, offering a dynamic work environment in the heart of London where innovation meets financial expertise. With a strong focus on employee growth, competitive salaries, and a hybrid work policy, we empower our team to thrive both personally and professionally while collaborating closely with industry leaders. Our commitment to a diverse and inclusive culture, alongside regular team events and travel opportunities, makes PrismFP a rewarding place to build your career in data engineering.

Contact Detail:

PrismFP Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Head of Data in Slough

✨Tip Number 1

Network like a pro! Reach out to your connections in the finance and data engineering sectors. Attend industry events or webinars, and don’t be shy about introducing yourself. You never know who might have the inside scoop on job openings at PrismFP Analytics!

✨Tip Number 2

Show off your skills! If you’ve got a portfolio of projects or contributions to open-source, make sure to highlight them during interviews. Demonstrating your hands-on experience with data pipelines and cloud technologies can really set you apart from the crowd.

✨Tip Number 3

Prepare for those tricky questions! Brush up on your knowledge of financial datasets and data engineering best practices. Be ready to discuss how you would tackle real-world problems that PrismFP Analytics faces, showing that you’re not just a fit for the role but also passionate about their mission.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the team at PrismFP Analytics. Let’s get you that interview!

We think you need these skills to ace Head of Data in Slough

Data Strategy Development
Team Leadership
Data Pipeline Design and Maintenance
Data Quality Assurance
KPI Monitoring and Reporting
Collaboration with Cross-Functional Teams
Data Engineering Standards Improvement
Experience with Financial Datasets
Production Data Pipeline Management
Apache Airflow
Cloud Technologies (AWS)
Python Programming
Analytical Skills
Problem-Solving Skills
Agile Development Practices

Some tips for your application 🫡

Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the Head of Data role. Highlight your leadership experience and hands-on engineering skills, especially in data strategy and pipeline management.

Craft a Compelling Cover Letter: Use your cover letter to tell us why you're the perfect fit for this role. Share specific examples of how you've led teams and tackled data challenges in the past, and don’t forget to mention your passion for financial datasets!

Showcase Your Technical Skills: We want to see your technical prowess! Include details about your experience with tools like Apache Airflow, PostgreSQL, and any cloud technologies you’ve worked with. This will help us understand your hands-on capabilities.

Apply Through Our Website: Don’t forget to submit your application through our website! It’s the best way for us to receive your materials and ensures you’re considered for the role. We can’t wait to see what you bring to the table!

How to prepare for a job interview at PrismFP

✨Know Your Data Inside Out

Make sure you’re well-versed in the specifics of data engineering, especially in financial datasets. Brush up on your experience with market data feeds and time-series pricing, as these will likely come up during the interview.

✨Showcase Your Leadership Skills

Since this role involves managing a team, be prepared to discuss your leadership style. Share examples of how you've guided and developed your team in the past, and how you plan to do so at PrismFP Analytics.

✨Demonstrate Hands-On Experience

Be ready to talk about your hands-on experience with data pipelines and cloud technologies. Highlight specific projects where you’ve designed or maintained systems using tools like Apache Airflow, Spark, or AWS.

✨Align with Company Goals

Understand PrismFP Analytics' business strategy and be prepared to discuss how you can align data initiatives with their goals. Show that you can translate business priorities into actionable data roadmaps.
