At a Glance
- Tasks: Design and maintain scalable data pipelines for telemetry and observability.
- Company: Dynamic tech firm in Sheffield with a focus on innovation.
- Benefits: Competitive daily rate, flexible contract, and opportunity to work with cutting-edge tech.
- Why this job: Join a team that drives operational insight and automation in a modern tech environment.
- Qualifications: Experience with Kafka, OpenShift, and strong Python skills required.
- Other info: 9+ month contract with potential for career growth and diverse team collaboration.
9+ Month Contract based in Sheffield
£395 - £442 per day Inside IR35
BPSS clearance required - candidates must be eligible
My client is seeking a Data Engineer to design and operate large-scale telemetry and observability data pipelines within a modern OpenShift and Kafka ecosystem. This role is central to enabling proactive, Level 4 observability, delivering high-quality metrics, logs, and traces to support platform reliability, operational insight, and automation.
Responsibilities:
- Design, implement and maintain scalable data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces)
- Stream telemetry through Kafka (producers, topics, schemas) and build resilient consumer services for enrichment and transformation
- Engineer multi-tenant observability data models, ensuring data lineage, quality controls and SLAs across streaming layers
- Integrate processed telemetry into Splunk for dashboards, analytics, alerting and operational insights
- Implement schema management and governance using Avro/Protobuf, including versioning and compatibility strategies
- Build automated validation, replay and backfill mechanisms to ensure data reliability and recovery
- Instrument services using OpenTelemetry, standardising tracing, metrics and structured logging
- Apply LLMs to enhance observability, such as query assistance, anomaly summarisation and runbook generation
- Collaborate with Platform, SRE and Application teams to align telemetry, alerts and SLOs
- Ensure pipelines meet security, compliance and best-practice standards
- Produce clear documentation covering data flows, schemas, dashboards and operational runbooks
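By way of illustration only (not part of the role description), the consumer-side enrichment and validation work listed above might look like the following minimal Python sketch. All field names, the tenant tag, and the dead-letter behaviour are hypothetical assumptions, not details taken from this posting:

```python
import time

# Illustrative required schema for an incoming telemetry record
REQUIRED_FIELDS = {"cluster", "namespace", "metric", "value", "timestamp"}

def enrich(record: dict, tenant: str) -> dict:
    """Validate a raw telemetry record and tag it for multi-tenant routing.

    Raises ValueError on missing fields, so that in a real consumer the bad
    record could be routed to a dead-letter topic for later replay/backfill.
    """
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing fields: {sorted(missing)}")
    enriched = dict(record)          # never mutate the input record
    enriched["tenant"] = tenant      # multi-tenant routing key (assumed)
    enriched["ingested_at"] = time.time()  # processing-time watermark
    return enriched

# Usage: a well-formed record passes through with tenant metadata attached
raw = {"cluster": "ocp-prod", "namespace": "payments",
       "metric": "cpu_usage", "value": 0.72, "timestamp": 1700000000}
print(enrich(raw, tenant="team-a")["tenant"])
```

In a real pipeline this function would sit inside a Kafka consumer loop, with schema enforcement handled by a schema registry rather than a hand-rolled field set.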
Skills & Experience:
- Strong hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect, KSQL/KStreams)
- Experience with OpenShift/Kubernetes telemetry, including OpenTelemetry and Prometheus
- Proven capability integrating telemetry into Splunk (HEC, Universal Forwarders, sourcetypes, CIM, dashboards, alerting)
- Solid data engineering skills in Python (or similar) for ETL/ELT, enrichment and validation
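For the Splunk integration skills above: events are typically posted to Splunk's HTTP Event Collector (HEC) as a JSON envelope with `event`, `sourcetype`, and `index` keys. A minimal sketch of building such a payload follows; the sourcetype and index names are illustrative assumptions:

```python
import json

def hec_event(event: dict, sourcetype: str, index: str = "main") -> str:
    """Wrap a telemetry event in the Splunk HEC JSON envelope."""
    payload = {
        "event": event,           # the telemetry record itself
        "sourcetype": sourcetype, # drives field extraction in Splunk
        "index": index,           # target index (assumed default)
    }
    return json.dumps(payload)

# Usage: the resulting string would be POSTed to the HEC endpoint
body = hec_event({"metric": "cpu_usage", "value": 0.72},
                 sourcetype="telemetry:metrics")
```

A production sender would add authentication (the HEC token header), batching, and retry handling around the HTTP call.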
Please apply now for an immediate interview!
Data Engineer - Contract - 9+ Months in Sheffield. Employer: CBS Butler
Contact Details:
CBS Butler Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Contract - 9+ Months role in Sheffield
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with OpenShift and Kafka. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects related to data pipelines and telemetry. This is your chance to demonstrate your hands-on experience with Kafka and OpenTelemetry, making you stand out to potential employers.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled challenges in building scalable data pipelines and integrating telemetry into platforms like Splunk.
✨Tip Number 4
Don't forget to apply through our website! We've got loads of opportunities that might be perfect for you. Plus, applying directly can sometimes give you an edge over other candidates.
We think you need these skills to ace the Data Engineer - Contract - 9+ Months role in Sheffield
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with data pipelines, Kafka, and OpenShift. We want to see how your skills match the job description, so don't be shy about showcasing relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're the perfect fit for this Data Engineer role. Share specific examples of your work with telemetry and observability that align with what we're looking for.
Showcase Your Technical Skills: Don't forget to mention your hands-on experience with Python, ETL/ELT processes, and integrating telemetry into Splunk. We love seeing candidates who can demonstrate their technical prowess clearly and confidently.
Apply Through Our Website: We encourage you to apply directly through our website for a smoother application process. It helps us keep track of your application and ensures you don't miss out on any updates from us!
How to prepare for a job interview at CBS Butler
✨Know Your Tech Stack
Make sure you're well-versed in the technologies mentioned in the job description, especially Kafka, OpenShift, and Python. Brush up on your experience with streaming data pipelines and be ready to discuss specific projects where you've implemented these technologies.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled challenges in data engineering. Think about times when you had to ensure data reliability or implement schema management. This will demonstrate your ability to handle the responsibilities outlined in the role.
✨Understand the Importance of Observability
Since this role focuses on telemetry and observability, be prepared to discuss how you've designed or improved observability in past projects. Highlight your experience with tools like Splunk and OpenTelemetry, and how they contribute to operational insights.
✨Ask Insightful Questions
At the end of the interview, don't forget to ask questions that show your interest in the role and the company. Inquire about their current data pipeline challenges or how they measure success in their observability efforts. This shows you're engaged and thinking critically about the position.