DataOps Engineer

Full-Time · £60,000–£80,000 / year (est.) · No home office possible
CoreWeave

At a Glance

  • Tasks: Own and enhance data observability and operations for our cutting-edge platform.
  • Company: Join a forward-thinking tech company focused on innovative disruption.
  • Benefits: Enjoy comprehensive health insurance, generous pension contributions, and tuition reimbursement.
  • Other info: Collaborative environment with opportunities for mentorship and career growth.
  • Why this job: Make a real impact by optimising data pipelines and enhancing operational practices.
  • Qualifications: 5-6+ years in DataOps or similar roles with strong technical skills.

The predicted salary is between £60,000 and £80,000 per year.

About the Role

We’re seeking a Senior DataOps Engineer II who can act as the hands‑on owner for Monolith’s data observability and operational surface: from batch and streaming pipelines running on our platform, through to the lineage, quality, and runbooks that keep customer environments healthy.

In This Role, You Will

  • Own Monolith’s Data Observability & Operations Surface: Design and implement the end‑to‑end observability stack for data workloads (metrics, logs, traces, and data‑quality signals) across batch and streaming pipelines. Define and maintain operational SLOs/SLAs for critical data flows powering training, inference, and analytics, and ensure they are measurable and actionable. Build dashboards, alerts, and runbooks that allow engineers and on‑call responders to quickly detect, triage, and remediate data incidents. Standardise “golden paths” for how teams instrument pipelines, expose health signals, and respond to data‑related failures.
  • Implement Data Lineage, Quality & Governance: Deploy and maintain end‑to‑end data lineage for key domains, from client sources through transformations to features, models, and downstream analytics, so teams can debug, audit, and reason about change. Define and roll out data quality checks (schema, freshness, completeness, distribution, drift) and ensure failures integrate cleanly into alerting and incident workflows. Partner with Security, Compliance, and customer‑facing teams to encode data governance requirements (e.g., retention, residency, access controls) into our pipelines and tooling. Help shape metadata models and catalog conventions so that producers and consumers can reliably discover, understand, and use shared datasets.
  • Enable DataOps Practices Across Teams: Establish CI/CD patterns for data pipelines and related infrastructure, including testing strategies, promotion workflows, and change‑management guardrails. Drive adoption of infra‑as‑code for data infrastructure (e.g., pipeline orchestration, storage, observability components), reducing manual drift across environments. Define and continuously improve DataOps processes (incident response, post‑incident review, change review, on‑call rotations) with a focus on learning rather than blame. Evaluate and integrate best‑of‑breed DataOps and observability tooling where it accelerates our teams, balancing build vs. buy pragmatically.
  • Partner Across Monolith, CoreWeave & Clients: Work with Monolith platform, data, agent, and reliability teams to expose observability and lineage as shared services and patterns other engineers can build on. Collaborate with CoreWeave infrastructure and AI platform teams to leverage underlying storage, compute, networking, and observability in service of robust data flows. Serve as a technical escalation point for forward‑deployed and customer‑facing engineers when data issues cross service boundaries or require deeper architectural insight. Mentor data producers (product teams, integrations, forward‑deployed engineers) and data consumers (data scientists, analysts, client engineers) on resilient schemas, contracts, and operational practices.
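The data‑quality responsibilities above (freshness, completeness, and schema checks whose failures feed alerting) can be sketched in plain Python. The thresholds, column names, and batch shape below are illustrative assumptions for the sketch, not Monolith's actual contracts or tooling:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded: datetime, max_age: timedelta) -> bool:
    """Pass if the most recent load is within the allowed staleness window."""
    return datetime.now(timezone.utc) - last_loaded <= max_age

def check_completeness(row_count: int, expected_min: int) -> bool:
    """Pass if the batch contains at least the expected number of rows."""
    return row_count >= expected_min

def check_schema(columns: set, required: set) -> bool:
    """Pass if every required column is present (extra columns are allowed)."""
    return required <= columns

def run_checks(batch: dict) -> dict:
    """Evaluate all checks on a batch; any False result would be routed
    into the alerting and incident workflow (thresholds are hypothetical)."""
    return {
        "freshness": check_freshness(batch["loaded_at"], timedelta(hours=1)),
        "completeness": check_completeness(batch["rows"], expected_min=1000),
        "schema": check_schema(batch["columns"], {"id", "ts", "value"}),
    }
```

In practice a dedicated data-quality framework would replace these hand-rolled checks, but the shape is the same: each check returns a signal that is measurable, alertable, and attributable to a pipeline.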

Who You Are

  • Experience & Level: Typically 5–6+ years of experience in DataOps, Data Engineering, DevOps/SRE for data platforms, or similar roles, including end‑to‑end ownership of production data pipelines and their operations. Proven track record of operating at Senior IC scope: leading cross‑team initiatives, introducing new practices/tooling, and improving reliability at the platform level.
  • DataOps, Pipelines & Tooling: Strong hands‑on experience designing, deploying, and operating data pipelines in production (batch and/or streaming), including failure modes, retries, and backfills. Practical experience with data orchestration and ETL/ELT tooling (e.g., Airflow, Dagster, dbt, Temporal, or similar) and comfort evaluating and integrating new tools where appropriate. Solid SQL and/or Spark skills and experience with at least one major analytical database or warehouse; familiarity with time‑series / telemetry data is a plus.
  • Observability, Lineage & Data Quality: Extensive experience implementing data observability (metrics, logging, tracing, dashboards, and alerting) for data‑centric workloads. Hands‑on work with data quality frameworks and/or observability platforms to monitor freshness, completeness, schema changes, and anomalies. Experience deploying and using data lineage or metadata/catalog solutions, and applying them to debugging, compliance, and change‑impact analysis.
  • Platform, Infrastructure & Automation: Comfortable working in containerised, cloud‑native environments (Kubernetes plus at least one major cloud provider); experience with GPU‑ or compute‑intensive workloads is a bonus. Strong automation mindset: infra‑as‑code, CI/CD, and configuration management for data infrastructure and observability components. Proficient in Python for building tooling, pipeline glue, and platform integrations; additional languages are a plus.
  • Collaboration, Mentorship & Communication: Clear communicator who can explain complex data flows and failure modes to both deeply technical and non‑specialist audiences. Experience mentoring engineers and data practitioners on better data management, observability, and operational hygiene, through documentation, examples, reviews, and office hours. Comfortable working in a fast‑moving, high‑ambiguity environment where we balance rapid iteration with the safety and reliability demanded by enterprise engineering clients.
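To give a flavour of the "failure modes, retries, and backfills" mentioned above, here is a minimal retry‑with‑exponential‑backoff wrapper of the kind a pipeline task might use. The attempt count, delays, and `flaky`-task pattern are illustrative; real orchestrators such as Airflow or Temporal provide this behaviour as configuration:

```python
import time

def run_with_retries(task, max_attempts: int = 3, base_delay: float = 0.01):
    """Run a flaky pipeline task, retrying with exponential backoff.

    Re-raises the last exception once max_attempts is exhausted, so the
    failure still surfaces to alerting rather than being swallowed.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise
            # Back off: base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))
```

The key operational point is the final re-raise: retries buy resilience against transient faults, but a persistent failure must still reach the on-call responder.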

Preferred

  • Experience in ML/AI platforms or MLOps environments where data pipelines power experimentation, training, and inference at scale.
  • Background with test, simulation, or time‑series data (e.g., physical test benches, battery labs, automotive/aerospace R&D).
  • Familiarity with feature stores, experiment tracking, or model registries and their interaction with upstream data pipelines.
  • Prior work in multi‑tenant SaaS platforms, especially those with strong compliance, observability, and uptime requirements.
  • Experience supporting or partnering closely with forward‑deployed / professional services teams in complex customer environments.

Benefits

  • Family-level Medical Insurance
  • Family-level Dental Insurance
  • Generous Pension Contribution
  • Life Assurance at 4x Salary
  • Critical Illness Cover
  • Employee Assistance Programme
  • Tuition Reimbursement
  • Work culture focused on innovative disruption

Equal Employment Opportunity

CoreWeave is an equal opportunity employer, committed to fostering an inclusive and supportive workplace. All qualified applicants and candidates will receive consideration for employment without regard to race, color, religion, sex, disability, age, sexual orientation, gender identity, national origin, veteran status, or genetic information.

DataOps Engineer employer: CoreWeave

At Monolith, we pride ourselves on being an exceptional employer that champions innovation and collaboration in the heart of a dynamic tech environment. Our commitment to employee growth is reflected in our comprehensive benefits package, including family-level medical and dental insurance, generous pension contributions, and tuition reimbursement, all within a work culture that encourages creative disruption and inclusivity. Join us to not only advance your career as a Senior DataOps Engineer but also to be part of a team that values your contributions and fosters a supportive atmosphere for professional development.
Contact Detail:

CoreWeave Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the DataOps Engineer role

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those related to DataOps and data pipelines. This gives potential employers a taste of what you can do beyond your CV.

✨Tip Number 3

Prepare for interviews by practising common DataOps scenarios. Think about how you'd handle data incidents or implement observability stacks. The more you rehearse, the more confident you'll feel when it’s showtime!

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.

We think you need these skills to ace the DataOps Engineer role

Data Observability
Operational SLOs/SLAs
Dashboard Creation
Data Quality Checks
Data Lineage
CI/CD for Data Pipelines
Data Orchestration
ETL/ELT Tooling
SQL
Spark
Metrics and Logging
Kubernetes
Python
Collaboration
Mentorship

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the DataOps Engineer role. Highlight your relevant experience with data pipelines, observability, and any tools mentioned in the job description. We want to see how your skills align with what we're looking for!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about DataOps and how your background makes you a great fit for our team. Don’t forget to mention any specific projects or achievements that relate to the role.

Showcase Your Technical Skills: Since this role is quite technical, make sure to showcase your hands-on experience with data orchestration tools and programming languages like Python. We love seeing practical examples of your work, so feel free to include links to any relevant projects or GitHub repositories.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to submit all your materials in one go. Plus, it helps us keep track of your application better!

How to prepare for a job interview at CoreWeave

✨Know Your DataOps Inside Out

Make sure you brush up on your DataOps knowledge before the interview. Be ready to discuss your hands-on experience with data pipelines, observability, and tooling. Prepare specific examples of how you've implemented data quality checks or designed observability stacks in previous roles.

✨Showcase Your Collaboration Skills

Since this role involves partnering across teams, be prepared to share examples of how you've successfully collaborated with different departments. Highlight any mentoring experiences you've had, especially in guiding engineers on data management and operational practices.

✨Demonstrate Your Problem-Solving Abilities

Expect questions that assess your ability to troubleshoot data incidents. Think of scenarios where you had to triage and remediate data issues. Discuss the tools and processes you used to ensure data reliability and how you handled failures in production environments.

✨Be Ready for Technical Questions

Prepare for technical questions related to data orchestration, SQL, and cloud-native environments. Brush up on your knowledge of tools like Airflow or dbt, and be ready to explain how you've used them in past projects. This will show your depth of understanding and readiness for the role.
