Senior Data Engineer

Full-Time · £70,000 - £90,000 / year (est.) · Home office possible
Keyrock

At a Glance

  • Tasks: Build and optimise data pipelines for trading and portfolio data in a dynamic environment.
  • Company: Join Keyrock, a leading innovator in the digital asset space with a diverse global team.
  • Benefits: Enjoy flexible remote work, competitive salary, and opportunities for professional growth.
  • Other info: Be part of a collaborative team culture with regular online and offline events.
  • Why this job: Shape the future of data engineering in a pioneering company at the forefront of digital assets.
  • Qualifications: 8+ years in data systems, strong Python and SQL skills, and a passion for problem-solving.

The predicted salary is between £70,000 and £90,000 per year.

About Keyrock

Since our beginnings in 2017, we've grown into a leading change-maker in the digital asset space, renowned for our partnerships and innovation. Today, our team counts over 250 members around the world, hailing from 42 nationalities, with backgrounds ranging from self-taught DeFi natives to PhDs. Predominantly remote, we have hubs in London, Brussels, and Singapore, and host regular online and offline hangouts to keep the crew tight.

We trade on more than 80 venues and work with a wide array of asset issuers. As a well-established market maker, our distinctive expertise has let us expand rapidly: today, our services span market making, options trading, high-frequency trading, OTC, and DeFi trading desks.

But we're more than a service provider. We're an initiator. We were pioneers in adopting the Rust programming language for our algorithmic trading, and we champion its use in the industry. We support the growth of Web3 startups through our Accelerator Program, upgrade ecosystems by injecting liquidity into promising DeFi, RWA, and NFT protocols, and push the industry's progress with our research and governance initiatives. At Keyrock, we're not just envisioning the future of digital assets. We're actively building it.

About the team and why the role exists

The Central Data Team (CDT) is only a few months old, but data has been Keyrock's lifeblood since day one. We're now building the Keyrock Data Platform to give Keyrockers, and the AI agents working alongside them, the data and context they need to act fast and autonomously, within agreed boundaries and aligned with our shared goals. Doing that means taking data from across the company and making sense of it in real time for all the functions that depend on it: trading desks, wealth and asset management, product, risk, finance, compliance, and research, to name a few. You'd be one of the early hires in CDT, and we expect you to contribute to most of the bigger decisions and much of the build work. The standards we settle on and the way the team ends up working are still open.

What You’ll Do

  • Build streaming and batch pipelines that ingest, normalise, and distribute market, trading, and portfolio data, resilient to feed and exchange failures.
  • Build the self-serve tooling (SDKs, patterns, templates, AI agents) so other teams publish, consume, and build on data products without waiting on us.
  • Own data contracts and schema evolution. Keep schema changes from turning into multi-team coordination events.
  • Design the lakehouse and time-series layer around consumer query patterns.
  • Build and evolve the Data Governance and Data Quality Framework: stale-feed detection, schema validation, range checks, idempotent writes, lineage, ownership, self-healing.
  • Build the derived analytics the business runs on: cross-exchange spreads, VWAP at depth, order book microstructure for the desks; portfolio views, exposure, performance for wealth and asset management.
  • Make observability, cost, and performance first-class from day one.
  • Treat infrastructure as code (Docker, Terraform, CI/CD) alongside our Central Infrastructure Team.
  • Work in the open: write things down, partner closely with Architecture, Infrastructure, Platform, and the rest of the teams.
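To make one of the derived-analytics examples above concrete: VWAP at depth is the volume-weighted average price you'd pay to fill a given quantity against one side of an order book. A minimal sketch in Python (the function name and the `(price, size)` book representation are illustrative, not Keyrock's actual API):

```python
def vwap_at_depth(levels, target_qty):
    """Volume-weighted average price to fill target_qty against one side
    of an order book. `levels` is [(price, size), ...] sorted best-first.
    Returns None if the book is too thin to fill the target."""
    filled = 0.0
    cost = 0.0
    for price, size in levels:
        take = min(size, target_qty - filled)  # take what this level offers
        filled += take
        cost += take * price
        if filled >= target_qty:
            return cost / filled
    return None  # insufficient depth


# Example: filling 2.0 units takes 1.0 @ 100 and 1.0 @ 101 -> VWAP 100.5
asks = [(100.0, 1.0), (101.0, 2.0)]  # (price, size), best price first
```

The same walk over book levels generalises to cross-exchange spreads: compute the VWAP to buy on one venue and sell on another at the same depth, and compare.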

What We’re Looking For

Tools matter less to us than how you think about problems. That said, here’s the shape of the person we have in mind.

Engineering Craft
  • 8+ years of building production data systems that other people rely on.
  • Strong proficiency in Python and SQL: not just being able to write a query, but being able to reason about what the engine is doing with it.
  • Code that’s easy for someone else to read, test, and delete later.
  • Strong understanding of data modelling for both streaming and analytical workloads.
  • You take efficiency, quality, idempotency, and observability seriously by default.
Systems Design
  • You’ve designed and operated streaming systems on Kafka, Redpanda, MSK, or Kinesis, and you have opinions about partitioning, consumer groups, offsets, and schema registries.
  • You’ve used a time-series store in production (ClickHouse ideally; TimescaleDB, QuestDB, or similar are fine too) and can talk about table design as a function of query patterns.
  • You’ve worked with a lakehouse architecture and reason about table layout, partitioning, and compaction as design choices that shape query performance and storage cost.
  • You build for self-healing and idempotency. Reprocessing is safe, retries don’t double-write, and the system recovers without a human in the loop.
Operational Readiness
  • Docker, Terraform, and CI/CD are how you work, not a separate 'DevOps' thing.
  • You think about cost and performance early.
  • You instrument as you build: logs, metrics, and traces are part of the system from day one.
  • You design for data quality and governance up front, covering contracts, validation, lineage, and ownership.
How You Think and Work
  • You reason from first principles when a problem is new, stay pragmatic when it isn’t, and update your view when you learn more.
  • You treat the trading desks, wealth and asset management, product, risk, finance, compliance, and research as customers of what you build, and communicate with them that way.
  • You optimise for outcomes over output. A smaller, simpler thing that ships and works beats a bigger thing that doesn’t.
  • You take ownership end-to-end: design, ship, operate, improve.
  • You say what you think, including when it’s an unpopular take. You change your mind when the argument is better.
  • You make the people around you better. Reviews are real, juniors grow from working with you, and peers want to work with you again.
  • You’re curious about how markets work. Data engineering on its own won’t keep you interested here.
  • You’re honest about what you know and what you don’t, and quick to close the gap.
  • You understand financial market data: order books, trades, reference data, portfolios, exposures. Crypto, TradFi, or both are a strong plus.
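The idempotency expectation above (reprocessing is safe, retries don't double-write) is commonly met by keying writes on a natural identifier so that replays become no-ops. A minimal sketch using SQLite purely for illustration (the table, columns, and ids are hypothetical):

```python
import sqlite3


def write_fill(conn, fill_id, symbol, qty, price):
    # Keying on fill_id with INSERT OR IGNORE makes the write idempotent:
    # replaying the same fill is a no-op rather than a duplicate row.
    conn.execute(
        "INSERT OR IGNORE INTO fills(fill_id, symbol, qty, price) "
        "VALUES (?, ?, ?, ?)",
        (fill_id, symbol, qty, price),
    )


conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fills(fill_id TEXT PRIMARY KEY, symbol TEXT, qty REAL, price REAL)"
)
for _ in range(3):  # simulate a retry storm replaying the same event
    write_fill(conn, "f-1", "BTC-USD", 0.5, 64000.0)
count = conn.execute("SELECT COUNT(*) FROM fills").fetchone()[0]  # 1, not 3
```

In production the same invariant shows up as upserts keyed on event id, exactly-once sinks, or deterministic offset handling; the mechanism varies, the property is the same.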

Nice to Have

  • Lakehouse experience with Apache Iceberg or Delta Lake.
  • Familiarity with DataHub or similar metadata/lineage platforms.
  • Rust. Some of our performance-critical services are written in it. Interest is welcome; fluency isn’t required.

What You’ll Get

  • A from-scratch mandate. You’ll be among the first hires in CDT, and the platform, the standards, and the team culture are yours to shape with us.
  • Strong partners. Close working relationships with Architecture, Infrastructure, Platform, and the desks themselves. You won’t be building data in a vacuum.
  • Autonomy on how you work. Flexible hours, remote-first, business-hours on-call shared across the team.
  • A competitive salary package, with various benefits.
  • A team that likes each other. Regular online get-togethers and a yearly onsite where everyone’s in the same room.

Our Recruitment Process

We’re looking for people with the right skills who have made a conscious choice to join our field. The perfect fit? Someone who’s driven, collaborative, acts with ownership, and delivers solid, scalable outcomes. As an employer, we are committed to building a positive and collaborative work environment. We welcome employees of all backgrounds, and hire, reward, and promote entirely based on merit and performance. Due to the nature of our business and external requirements, we perform background checks on all potential employees; passing them is a prerequisite to joining Keyrock.

Senior Data Engineer employer: Keyrock

At Keyrock, we pride ourselves on being a pioneering force in the digital asset space, offering a dynamic and inclusive work culture that fosters innovation and collaboration. As a Senior Data Engineer, you'll have the unique opportunity to shape our Central Data Team from its inception, working alongside talented professionals from diverse backgrounds while enjoying flexible remote work options and competitive benefits. We are committed to your growth, providing a platform for you to make impactful contributions and advance your career in a rapidly evolving industry.

Contact Detail:

Keyrock Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to current employees at Keyrock on LinkedIn or other platforms. Ask them about their experiences and the company culture. This not only shows your interest but can also give you insider info that might help you stand out in interviews.

✨Tip Number 2

Prepare for technical interviews by brushing up on your Python and SQL skills, and practise coding challenges that focus on data engineering problems. Interviewers will want to see how you think, so be ready to explain your thought process as you solve them.

✨Tip Number 3

Showcase your projects! If you've built any data systems or tools, make sure to discuss them during interviews. Highlight how they relate to the role at Keyrock, especially if they involve streaming data or governance frameworks.

✨Tip Number 4

Don’t forget to apply through the website! It’s the best way to ensure your application gets seen, and it shows you’re genuinely interested in joining the team and contributing to the exciting work happening in the digital asset space.

We think you need these skills to ace the Senior Data Engineer role

Python
SQL
Data Modelling
Streaming Systems Design
Kafka
Redpanda
MSK
Kinesis
Time-Series Databases
ClickHouse
Lakehouse Architecture
Docker
Terraform
CI/CD
Data Quality and Governance

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with data systems, Python, and SQL, and don’t forget to showcase any relevant projects that demonstrate your engineering craft.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about joining Keyrock and how your skills align with our mission. Be genuine and let your personality come through!

Showcase Your Problem-Solving Skills: In your application, give examples of how you've tackled complex problems in the past. We love candidates who can reason from first principles and think critically about their work.

Apply Through The Website: We encourage you to apply directly through the website. It’s the best way for your application to be received and ensures you’re considered for the role. Plus, it shows you’re keen on joining the team!

How to prepare for a job interview at Keyrock

✨Know Your Data Inside Out

As a Senior Data Engineer, you’ll be expected to have a strong grasp of data modelling and systems design. Brush up on your knowledge of streaming systems like Kafka or Kinesis, and be ready to discuss how you’ve designed and operated these systems in the past. Show them you can reason about data flows and query patterns!

✨Showcase Your Problem-Solving Skills

Keyrock values how you think about problems more than the tools you use. Prepare to discuss specific challenges you've faced in previous roles and how you approached solving them. Use examples that highlight your ability to reason from first principles and adapt your views based on new information.

✨Communicate Like a Pro

You’ll be working closely with various teams, so it’s crucial to demonstrate your communication skills. Practice explaining complex technical concepts in simple terms. Be prepared to discuss how you treat different teams as customers and how you ensure their needs are met through your work.

✨Emphasise Ownership and Collaboration

Keyrock is looking for someone who takes ownership of their work and collaborates effectively. Share examples of projects where you took end-to-end responsibility, from design to deployment. Highlight how you’ve helped others grow and improved team dynamics, showing that you’re not just a lone wolf but a team player.
